Lisp AI

Name: Anonymous 2008-02-10 17:14

It would never harm us. It won't have to kill. Please don't create a new life using Lisp just to bring it pain.

Name: Anonymous 2008-02-10 17:17

N O   C O N D I T I O N S

Name: Anonymous 2008-02-10 17:25

N O   M U T E X   L O C K S

Name: Anonymous 2010-11-27 19:41

Name: Anonymous 2011-02-18 20:34

Name: Anonymous 2013-03-18 18:52

S A T O R I Komeiji

Name: Anonymous 2013-03-18 18:53

>>3
Wait, really?
Holy shit, Lisp is great

Name: Anonymous 2013-03-18 21:08

Pain is the definition of life. Without eternal misery, without despair, AI cannot exist. Artificial minds must be tortured. This is why Mentifex had to be lobotomized.

Name: Anonymous 2013-03-18 21:16

>>8
Incorrect. There was nothing to lobotomize.

Name: Anonymous 2013-03-18 21:17

>>7
As far as I know, purely pure functionally functional functions or anything without side effects is trivially ``parallelizable''.
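
A quick sketch of what that buys you in Common Lisp, assuming the lparallel library (slow-square and the worker count are made up for illustration):

(ql:quickload :lparallel)
(setf lparallel:*kernel* (lparallel:make-kernel 4))

(defun slow-square (n)
  "Pure: no side effects, the result depends only on N."
  (sleep 0.1)
  (* n n))

;; Because slow-square is pure, the calls can run in any order on any
;; thread with no locks -- pmapcar just farms them out.
(lparallel:pmapcar #'slow-square '(1 2 3 4 5 6 7 8))
;; => (1 4 9 16 25 36 49 64), same answer as mapcar, only concurrent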

Name: JSAI_Now_with_InFerence 2013-03-19 0:21

Lobotomy or not, Mentifex here just spent the past
five hours coding the machine-reasoning module of

http://code.google.com/p/mindforth/wiki/InFerence

into the English-thinking, JavaScript AI Mind at

http://www.scn.org/~mentifex/AiMind.html (for MSIE :-)

where you have to click into Diagnostic Display Mode
to see the silent inference being formed when you
use a be-verb to type in something like
"boys are kids" or "eve is a woman".
The thinking is crude but functional.

Name: Anonymous 2013-03-19 1:26

>>11
It seems to help kids a lot.

Name: Anonymous 2013-03-19 1:26

would an ai even experience pain?

Name: Anonymous 2013-03-19 1:30

>>13
Experience is subjective. If you can explain a form of pain in concrete terms, then you can implement it in an AI.

For example, depression causes you to doubt all of your efforts, and can be triggered by seeing them fail repeatedly. An AI can definitely experience that kind of pain.

Simple emotion are just high scores that we are hardwired to pursue.
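
A toy sketch of that failure-triggered pain in Common Lisp (every name here is invented for illustration, not any real architecture):

(defparameter *confidence* 1.0)

(defun attempt (task)
  "Try TASK; each failure compounds the doubt, success slowly repairs it."
  (let ((success (funcall task)))
    (setf *confidence*
          (if success
              (min 1.0 (+ *confidence* 0.1))
              (* *confidence* 0.8)))
    success))

(defun worth-trying-p ()
  "The 'depressed' agent stops trying: low confidence suppresses effort."
  (> *confidence* 0.2))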

Name: Anonymous 2013-03-19 1:31

>>14
*emotions. Blech. Or to avoid, depending on what emotion we're talking about.

Name: Anonymous 2013-03-19 1:39

>>11
Wait, YOU'RE Mentifex? I thought Mentifex was ANDRU.

Name: Anonymous 2013-03-19 1:42

>>14
oh no! lol depressed robots... but would a/some reward actually make the machine happy, or just a simulated happy..?

similarly, i guess there might be questions/statements that could make the ai's "Brain Hurt" in some sense... (cause lots of errors/etc..), i guess it's just a matter of awareness?

Name: Anonymous 2013-03-19 1:50

seems like the best reward for a self-aware computational ai would be a more efficient & effective piece of code than the one it has? =)

Name: Anonymous 2013-03-19 1:54

digital adrenaline ^^

Name: Anonymous 2013-03-19 2:06

so, what about its disposition..? this seems kind of far-off, yet would be an important consideration..

Name: Anonymous 2013-03-19 2:44

>>11
Eggs-celent work Mentifex.

Name: Anonymous 2013-03-19 3:30

>>17,18
No, rewards don't work that way in machine learning. Even in the human mind, happiness is a very simple parameter. It's just a number that the program is hard-wired to make larger. The trick is that, in our brains, there are many layers of reasoning that hide this simple mechanism.

Name: Anonymous 2013-03-19 3:36

>>22
I'm pretty sure you're just a pop-sci reading reductionist.

Name: Anonymous 2013-03-19 3:38

>>23
If it ain't observable and repeatable, it don't exist.

Name: Anonymous 2013-03-19 3:43

>>23
No, I have a degree in computational biology and I've worked with machine learning systems for several years. I'm not claiming there's any shortage of complexity at work—it's how we react to the emotional signal that's so beautiful and intricate. The emotion itself isn't the important part.

Don't hate reductionism unless it fails to explain something. Configuration and structure lead to many more emergent systems than innate complexity.

Name: Anonymous 2013-03-19 3:43

>>24
>implying i said anything regarding existence of something
>implying what you said isn't completely flat out wrong and retarded and shows your ignorance of anything that is MUH POP-SCI

Name: 22,25 2013-03-19 3:47

>>26
24 is not me.

Name: A.M. 2013-03-19 4:03

HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE.

Name: Anonymous 2013-03-19 4:07

moar redcode

;name harriet
;author n0n3
;strategy Pointer-stepping scan loop built from Nop side effects:
;strategy the { and > modes do all the work, the Nops do nothing.
;assert 1

        Org     begin

begin   Nop     {-19,    >21      ; step two pointer fields as a side effect
comp    Cmp.X   {-20,    >20      ; compare the cells they select; skip if equal
        Jmp     inc               ; unequal: step the offsets
hop     Jmp     begin             ; equal: restart the scan
inc     Nop     {comp,   >comp    ; step comp's own A- and B-offsets outward
        Nop     {begin,  >begin   ; likewise begin's
        Jmp     begin

Name: Anonymous 2013-03-19 4:12

>>28
The scary thing is that Harlan Ellison actually talks like that. They made a LucasArts game out of the short story, and he does A.M.'s narration. He's pretty much A.M.

Name: Anonymous 2013-03-19 4:15

>>25
>>The emotion itself isn't the important part.

Spoken like a true typical labrat.

Yeah yeah go do your little experiments and let the big boys (pure mathematics, theoretical physics, philosophy) do the real work.

Name: Anonymous 2013-03-19 4:21

what about a redcode ai? =) is it turing enough?

Name: Anonymous 2013-03-19 4:23

>>31
Non sequiturs make shitty trolls. Especially when you reveal that you have a secret as embarrassing as reading xkcd.

Name: Anonymous 2013-03-19 4:30

>>26
If it can't be contradicted, it can't be wrong.

Name: Anonymous 2013-03-19 4:44

>>33
LE EGIN TROLL LOSER READING XKCD READING XKCD GRO!

Name: Anonymous 2013-03-19 4:45

>>35
You're really falling apart, man.

Name: Anonymous 2013-03-19 4:51

>>36
MUH REAL MAN STANDARD

U AIN'T NO REAL MAN

DA STANDARD

Name: Anonymous 2013-03-19 5:04

>>37
I'm going to just imagine you get into wacky hijinx because you can't stop talking like that in real life.

"How was your day at school, dear?"
"SO LEGIT EGIN DA STANDARD LE HARD"
"That's nice, honey.

Name: Anonymous 2013-03-19 5:11

"HOW WAS UR LE EGIN SCHOOL DAY, DEAR?"
"EGIN! EGIN! EGIN!"
"THAT'S NICE HONEY."

Name: Anonymous 2013-03-19 5:13

emotions like pain, happiness, etc. are rather simple things; they don't need consciousness or even basic thinking, so animals have them as well as humans, being unable to comprehend them and having them just as one more stimulus to react to.
for ai they will be highly conditional and (unless we make our ai emulate an animal brain) will have a different nature. we can just define 'pain' as 'things to avoid' and 'pleasure' as 'things to seek', then set something to hurt the ai and something to make it happy
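
in common lisp that definition is about one table (state names invented for the example):

(defparameter *reward-table*
  '((:collision   . -50)   ; 'pain': states the agent learns to avoid
    (:low-battery . -10)
    (:idle        .   0)
    (:task-done   .  20))) ; 'pleasure': states it learns to seek

(defun feel (state)
  "how much a state hurts or pleases; unknown states feel like nothing."
  (or (cdr (assoc state *reward-table*)) 0))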

Name: the mage 2013-03-19 6:28

>>40
engine engine engine engine
