>>33,37
Oh good. Someone's been drinking Hawkins' Kool-aid.
Pay attention to
>>36:
I wonder if your basic fallacy isn't thinking that if we slap together something complicated enough, it will work.
That's very close to Hawkins' fallacy, which is roughly: anything he doesn't understand is not important (which is why he hasn't bothered to understand it). He seems to blow off criticism on this point rather than answer it, so I don't know what he really thinks.
The fact is the computational memory model doesn't do anything on its own. All this talk of what it can do is in terms of applications with external direction. For a self-directed entity with real higher-order function, something more complex is needed, and Hawkins' mistake is an easy one to make: assuming that complexity takes the form of the same stuff as the memory model. The difference is that, unlike most cortical matter, it is specialized.
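To make the "external direction" point concrete, here's a toy sketch (this is my own illustration, not Numenta's actual HTM code): a bare sequence memory that learns first-order transitions and predicts the next input. Notice that every call comes from outside; the memory never decides what to observe or what to do with a prediction.

```python
from collections import defaultdict, Counter

class SequenceMemory:
    """Toy first-order sequence memory: learns transitions, predicts successors."""

    def __init__(self):
        # state -> counts of the states that followed it
        self.transitions = defaultdict(Counter)
        self.prev = None

    def observe(self, symbol):
        """Record the transition from the previous input. Driven entirely by the caller."""
        if self.prev is not None:
            self.transitions[self.prev][symbol] += 1
        self.prev = symbol

    def predict(self):
        """Return the most frequent successor of the last input, or None if unknown."""
        if self.prev is None or not self.transitions[self.prev]:
            return None
        return self.transitions[self.prev].most_common(1)[0][0]

mem = SequenceMemory()
for s in "abcabcab":   # the caller chooses the input stream
    mem.observe(s)
print(mem.predict())   # after 'b', the memory predicts 'c'
```

Left to itself, an object like this just sits there: something else has to feed it a stream and act on its predictions, which is the gap the post is pointing at.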
The observation that cortical regions are not specialized (except by their location, and thus by what information they process) is important, but it ignores the fact that in actual intelligent brains declarative memory itself is managed by highly specialized non-cortical regions, and procedural memory is greatly informed by them as well. Behavior is part of that information, and in many cases cortical processing is short-circuited by non-cortical regions.
Everything the cortex does somehow depends on more specialized regions. In the happy case* that dependency is mostly in the past: the cortex does the bulk of the work, having learned how to do it. Since cortical regions are so general, it's unlikely they contain the learning material (and if they did, that would cause rough times for the Numenta model). Self-directed entities need to learn, and all of the extant, recognizably intelligent ones come with learning material, which we call instinct.
*: The "unhappy case" is when the cortex has been short-circuited or, more interestingly, cannot converge on a response through cortical processing. The information propagates out of the generalized matter, and then what happens? Instinct plays a role here (but what if it doesn't apply?), yet it seems that something more interesting happens, too. Whether that is really necessary for intelligence is up for debate.