
Artificial General Intelligence

Name: (λ λ λ) 2012-07-24 2:19

Many questions, few answers.

• Under what circumstances is a synapse between two neurons formed?  How does the likelihood of formation vary with the distance between them?  Since a synapse is generally unidirectional, what determines its direction?
• Under what circumstances is a negative-weight synapse between two neurons formed? (This one is far more difficult to answer.)
• In the early stages of human development, what aids the formation of the complex sensory input processing networks (e.g. vision)?  Is it the boosted number of synapses?  Is the structure of these purpose-specific networks actually encoded in the DNA or do they form naturally as a result of the sensory input?
• How does the number of neurons and synapses required to drive a sentient (artificial) brain vary with the amount of sensory input (e.g. how many neurons do you need for 128x128 stereoscopic colour vision, mapped like the human eye)?
• How many neurons and synapses are required for the mastery of a natural language?
• What kind of (virtual) environment would be the most appropriate for an AGI to "grow" in?
• The human brain also uses chemical signalling for things like pain and emotion; are these actually necessary for the development of a sentient being?  (This question seems warranted, since pain is a very quick path to self-awareness: it triggers self-preservation.)
• What is the simplest (computationally speaking) model that accounts for the complexity of the brain and approximates its functioning?
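To make that last question concrete, here is a minimal sketch of one commonly proposed starting point: a leaky integrate-and-fire neuron plus a Hebbian-style weight update. Every constant below (time constant, threshold, resistance, learning rate, input current) is an illustrative assumption, not a biologically fitted value, and this is obviously nowhere near "accounting for the complexity of the brain".

def simulate_lif(input_current, dt=1e-3, tau=0.02,
                 v_rest=-0.065, v_reset=-0.065, v_threshold=-0.050, r_m=1e7):
    """Integrate dV/dt = (-(V - v_rest) + R*I) / tau and record spike times."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        v += (-(v - v_rest) + r_m * i_in) * (dt / tau)
        if v >= v_threshold:          # threshold crossed: spike, then reset
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

def hebbian_update(w, pre_rate, post_rate, lr=0.01):
    """'Fire together, wire together': strengthen the synapse when activities coincide."""
    return w + lr * pre_rate * post_rate

if __name__ == "__main__":
    constant_input = [2.0e-9] * 1000          # assumed 2 nA input for one simulated second
    print(simulate_lif(constant_input))       # a regular spike train
    print(hebbian_update(0.5, pre_rate=10.0, post_rate=5.0))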

Name: Anonymous 2012-07-24 11:42

>>12

Everything that isn't instinctual is learned from experience. For a program to be able to learn automatically, you need some sort of algedonic feedback loop, i.e. a judge or fitness function that decides whether the program is doing well. But you probably already knew that. Strong AI is also just applied AI; it is just applied to do lots of things. Humans survive by forming concepts. The environment made us evolve to form concepts; the environment is humanity's judge. Sentience[1] is not necessary for survival/intelligence, it's just for introspection (you wouldn't be here if you couldn't, etc.).
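As a toy illustration of that judge/fitness-function idea (nothing resembling strong AI): a (1+1) hill-climbing loop that keeps a mutation only when the judge scores it higher. The target behaviour, parameter encoding and mutation scale are made-up assumptions for the sketch.

import random

TARGET = [0.1, 0.9, 0.5, 0.3]   # assumed behaviour the "environment" rewards

def fitness(params):
    # The judge: higher (less negative) the closer we are to the target.
    return -sum((p - t) ** 2 for p, t in zip(params, TARGET))

def hill_climb(steps=10_000, mutation_scale=0.05):
    """(1+1) evolutionary loop: mutate, keep the change only if the judge approves."""
    params = [random.random() for _ in TARGET]
    best = fitness(params)
    for _ in range(steps):
        candidate = [p + random.gauss(0, mutation_scale) for p in params]
        score = fitness(candidate)
        if score > best:              # algedonic feedback: reward kept, pain discarded
            params, best = candidate, score
    return params, best

if __name__ == "__main__":
    print(hill_climb())

The judge here is hand-written, which is exactly the problem: for anything human-like, the "fitness function" is the environment itself, not a four-number target.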

What you want is a program that can, by learning from experience, form concepts and abstract ideas, i.e. a program that behaves like a human. Good luck on that. No one really knows why the environment on Earth is like it is (suitable for sentient creatures that form concepts), but maybe you can figure it out.

1. ^ Here I assume by sentience you mean consciousness. The definitions of this word vary so much that even the definitions of the definitions will vary, so we will end up with conflicts, misunderstandings and disagreements.
