
Artificial Intelligence

Name: Anonymous 2010-01-22 14:53

Whenever people talk about artificial intelligence, all they can say is that "it's possible" but "it's not there yet." In this thread, I propose we discuss our various philosophies and ideas for implementation of the various forms of artificial intelligence. After all, it's an interesting topic that deserves more than a wave of the hand.

There are many types of theoretical artificial intelligences. Let us define our first type as that of a "superhuman." This type is designed to think in a manner similar to humans, but faster, with different kinds of interests, and the possibility of death significantly reduced. To create something like this, we need to look at the human brain.

The human brain experiences emotions. How it responds to these emotions is what defines us as intelligent creatures. Most people seem to be under the misconception that human emotions are something that can never be replicated in a machine; they're a series of chemical reactions, you can't induce that in a computer, they say. I, however, believe this is incorrect, and not merely for the brute-force reason that you could simulate the atoms of our universe, stick a brain in the simulation, and it would theoretically think just like a man.

Human emotions are actually simpler to understand than one might realize. Emotions are simply different levels of worrying. When you feel relaxed, you feel good because your mind isn't focused on anything important. When you feel accomplished, you feel good because your mind was focused on some important problem, but you are now secure in the knowledge that you don't have to worry about it anymore. The feeling of "working towards something" is a matter of knowing a problem and worrying about it, but being secure in the knowledge that you are doing something about it. Likewise, one feels frustration when one is trying to work towards something, but what one is doing isn't helping. One feels hollow when one has nothing good to worry about. The list goes on, but I think you get the point: emotions are different levels of worrying.
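To make the idea concrete, here is a toy sketch of that "levels of worrying" taxonomy. The state variables and emotion names are my own illustrative assumptions, not a real cognitive model:

```python
def emotion(problem_importance: float, resolved: bool, making_progress: bool) -> str:
    """Map a worry state to one of the emotions described above.

    The three inputs are an assumed, minimal encoding of "worry":
    how important the current problem is, whether it has been resolved,
    and whether one's efforts are helping.
    """
    if problem_importance == 0.0:
        return "relaxed"        # nothing important to worry about
    if resolved:
        return "accomplished"   # was worried, now secure it's handled
    if making_progress:
        return "motivated"      # worried, but doing something about it
    return "frustrated"         # worried, and effort isn't helping

print(emotion(0.0, False, False))  # relaxed
print(emotion(0.9, True, False))   # accomplished
print(emotion(0.9, False, True))   # motivated
print(emotion(0.9, False, False))  # frustrated
```

Obviously real emotions aren't a four-way switch, but the point is that each one can be described as a configuration of worry, not as irreducible chemistry.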

Of course, you are all aware of the concepts of the conscious and the subconscious. The subconscious is simply a device that interprets a situation and sets a certain mood. The conscious then responds in a manner according to whatever the mood currently is. The important thing, however, is that the conscious has little to no knowledge of the workings of the subconscious; a black box, if you will. The subconscious, on the other hand, understands everything about the conscious. It is this inability to ultimately control our emotions (and thus, our subconscious) that defines us as human.

Likewise, if you wish to create a man-like, self-centric artificial intelligence, you must simply give it:

1. The ability to rationalize and react
2. Different moods that affect the way it reacts
3. A device that controls these moods in a manner that is transparent to the intelligence itself

Once you give the intelligence these requirements, a real personality will spring up; after all, that which defines unique personalities is how the creature's subconscious responds to situations and what situations activate which moods.
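The three requirements can be sketched in a few lines of code. The situation/mood vocabulary here is invented purely for illustration; the structural point is that the mood-setting machinery lives outside the conscious loop:

```python
class Subconscious:
    """Requirement 3: interprets situations and sets a mood.

    A black box from the conscious side; react() below sees the mood
    but never how it was chosen.
    """
    def interpret(self, situation: str) -> str:
        # Hypothetical mapping from situation to mood.
        return {"threat": "anxious", "praise": "content"}.get(situation, "neutral")


class Agent:
    def __init__(self) -> None:
        self._subconscious = Subconscious()
        self._mood = "neutral"

    def perceive(self, situation: str) -> None:
        # Mood changes happen here, transparently to the conscious loop.
        self._mood = self._subconscious.interpret(situation)

    def react(self, stimulus: str) -> str:
        # Requirements 1 and 2: rationalize and react, colored by mood.
        responses = {
            "anxious": f"cautiously considers {stimulus}",
            "content": f"playfully engages with {stimulus}",
            "neutral": f"plainly responds to {stimulus}",
        }
        return responses[self._mood]


agent = Agent()
agent.perceive("threat")
print(agent.react("a question"))  # cautiously considers a question
```

Two agents with different Subconscious mappings would show different "personalities" to identical stimuli, which is exactly the claim above: personality is the situation-to-mood mapping, not the reaction logic itself.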

When you look at the situation this way, the things standing between man and his child are not quite as abstract as they are commonly assumed to be.

Name: Anonymous 2010-01-23 11:34

>>27
They have those already; they're called "single-player CRPGs".
