>>31
We would have to start from scratch, and it would not be enough to simply program the computer to recognise objects and have preset thoughts; it must be able to form categories itself and apply this skill to different inputs.
NO. A NEURAL NETWORK DOES THIS ON ITS OWN. GOD FUCKING DAMNIT.
Computers can ALREADY do this. People ALREADY use these sorts of learning artificial neural networks for all sorts of things, such as aiming routines in combat simulations. A 'category' is not something you artificially impose on a neural network. The network creates it on its own, and it is woven into the very structure of the network.
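Since you clearly haven't seen this done, here's a toy sketch in Python (my own throwaway example, not code from any real system): a single artificial neuron starts with zero weights and ends up with a category boundary encoded in those weights, learned purely from examples. Nobody hand-codes the category anywhere.

    # Toy example: one artificial neuron learns a "category" boundary from data.
    # The category is never written into the program; it ends up in the weights.
    import numpy as np

    rng = np.random.default_rng(0)

    # Two clusters of 2D points, plus a training signal saying which is which.
    class_a = rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(200, 2))
    class_b = rng.normal(loc=[+2.0, +2.0], scale=1.0, size=(200, 2))
    X = np.vstack([class_a, class_b])
    y = np.concatenate([np.zeros(200), np.ones(200)])

    w = np.zeros(2)   # the weights start out knowing nothing
    b = 0.0
    lr = 0.1

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Plain gradient descent on the logistic loss.
    for _ in range(500):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)

    # The learned weights ARE the category: a direction in input space.
    print("learned weights:", w, "bias:", b)
    print("accuracy:", np.mean((sigmoid(X @ w + b) > 0.5) == y))

Scale that up to millions of weights and multiple layers and you get exactly the kind of learned structure I'm talking about.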
You are the idiot unfortunately. 1s and 0s obey quantum physics, but the computer always reads them as 1s and 0s, and if they come 0.00001 seconds too early or too late the computer registers them as though they were neither early nor late. The brain is continuous and fluid in time and in the intensity of the messages being sent. This can be modelled using a computer program, but, like trying to predict the weather with a computer, it won't be 100% precise.
NO! FUCKING KILL YOURSELF! JUST DO IT ANONYMOUS, END IT NOW AND STOP SPEWING BULLSHIT.
1s and 0s have nothing to do with it. You're not understanding that it doesn't matter how a computer computes things. A computer is equivalent to a Turing machine, and a Turing machine has nothing to do with binary. Stop talking about 1s and 0s; they're not relevant. No scientist thinks about binary when discussing algorithmic complexity.
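If you don't believe the alphabet is arbitrary, here's a bare-bones Turing machine simulator (a toy of my own, massively simplified): the tape alphabet is {'a', 'b', '_'} and there isn't a single bit of binary anywhere in it.

    # Minimal Turing machine simulator over a non-binary alphabet.
    def run_tm(tape, transitions, state="q0", halt_state="halt", blank="_"):
        tape = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        while state != halt_state:
            symbol = tape.get(head, blank)
            new_symbol, move, state = transitions[(state, symbol)]
            tape[head] = new_symbol
            head += {"L": -1, "R": +1, "N": 0}[move]
        left, right = min(tape), max(tape)
        return "".join(tape.get(i, blank) for i in range(left, right + 1))

    # Example machine: overwrite every 'a' with 'b', halt at the first blank.
    transitions = {
        ("q0", "a"): ("b", "R", "q0"),
        ("q0", "b"): ("b", "R", "q0"),
        ("q0", "_"): ("_", "N", "halt"),
    }

    print(run_tm("abab", transitions))  # prints "bbbb_"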
Look at your own numbers. You say that if a pulse comes 0.00001 seconds too late, a computer will screw it up? Okay, so a simulated neuron misfires roughly once in every 100,000 firings. This is where your lack of knowledge of neural networks really shines: THIS DOESN'T MATTER. Do you have any idea how much random shit is washing through your brain right now? Impurities in the air that you breathe probably cause this many misfirings, not to mention the effects of alcohol. Good lord.
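Here's a back-of-the-envelope sketch (my own made-up numbers, purely to show the scale of the effect you're worried about): knock out one input in every 100,000 feeding a simulated neuron and look at how little its summed activation actually moves.

    # Zero out 1 in 100,000 of a simulated neuron's inputs and measure the damage.
    import numpy as np

    rng = np.random.default_rng(1)
    rates = rng.uniform(0.0, 1.0, size=1_000_000)    # pretend firing rates
    weights = rng.uniform(0.0, 1.0, size=1_000_000)  # pretend synaptic weights

    clean = np.dot(weights, rates)

    noisy_rates = rates.copy()
    bad = rng.choice(rates.size, size=rates.size // 100_000, replace=False)
    noisy_rates[bad] = 0.0   # those 10 inputs "misfire" completely

    noisy = np.dot(weights, noisy_rates)
    print(f"relative change in activation: {abs(noisy - clean) / clean:.2e}")
    # on the order of 0.00001, i.e. lost in the noise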
What's more, 0.0000001 (already finer than your 0.00001) is roughly the precision of a 32-bit float. With new 64-bit processors, it's actually faster for computers to compute things in 64-bit doubles, giving precision to within about 0.000000000000001. We already use this precision in artificial neural networks today.
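You can check those precision figures yourself in a couple of lines (assuming you have NumPy installed):

    # Machine epsilon: the relative precision of one float step near 1.0.
    import numpy as np

    print(np.finfo(np.float32).eps)  # ~1.19e-07, the "0.0000001" figure
    print(np.finfo(np.float64).eps)  # ~2.22e-16, finer than 0.000000000000001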
Furthermore, quantum physics has nothing to do with it. The Planck length, at which quantum perturbations occur, is about 10^-35 meters. The diameter of the smallest neurons is about 10^-6 meters. That's about thirty orders of fucking magnitude above the quantum scale. This is why, when Roger Penrose starts talking about quantum effects in the brain, he gets called a quack by mainstream science:
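Spelled out, since apparently it needs to be (rough figures straight from the numbers above):

    # The orders-of-magnitude gap between neurons and the Planck scale.
    import math

    neuron_diameter = 1e-6   # meters, smallest neurons (rough figure)
    planck_length = 1e-35    # meters (rough figure)

    print(math.log10(neuron_diameter / planck_length))  # ~29, i.e. about thirty orders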
"Penrose and Stuart Hameroff have constructed a theory in which human consciousness is the result of quantum gravity effects in microtubules. But Max Tegmark, in a paper in Physical Review E, calculated that the time scale of neuron firing and excitations in microtubules is slower than the decoherence time by a factor of at least 10,000,000,000. The reception of the paper is summed up by this statement in his support: "Physicists outside the fray, such as IBM's John Smolin, say the calculations confirm what they had suspected all along. 'We're not working with a brain that's near absolute zero. It's reasonably unlikely that the brain evolved quantum behavior', he says.""
Notice how I say we may do this or that, never claiming that my speculations are anything but speculations. You are criticising my intelligence on the assumption that I said my ideas were fact.
No, I'm criticizing you because you're uneducated. You have NOT informed yourself about Turing machines and mechanical computation, and you have NOT informed yourself about what a neural network is, how it operates, and how we simulate it. You're talking straight out of your ass, pretending like you know something about artificial intelligence. Sorry, but you don't. Learn something first and then maybe we can have this discussion.