>>7
Depends on one's definition of Strong AI. What is yours?
I do believe we'll achieve human-level AI within 50 years, possibly much sooner. If not through anything novel, then at least by emulating high-level structures of the human brain and integrating them with less "natural" technologies. The human brain is not as much of a mystery as people make it out to be. There are still plenty of unanswered questions, but we won't need to answer all of them to build human-level AI. Such AIs may well be flawed, hardly perfect and as fallible as humans can be, but with far greater potential learning capacity.
If your definition of Strong AI means a general AI that solves problems exhaustively rather than heuristically, then that is indeed a pipe dream, due to the sheer amount of time and memory required.
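To give a concrete sense of that blowup, here's a minimal sketch (my own illustration, not anything from the thread) comparing exhaustive search to a cheap heuristic on the travelling-salesman problem. All function names are made up for the example; the point is that the exhaustive approach must consider (n-1)! tours, which stops being feasible almost immediately, while the heuristic stays cheap at the cost of optimality.

```python
import itertools
import math

def tour_length(order, dist):
    # Total length of a cyclic tour visiting cities in the given order.
    n = len(order)
    return sum(dist[order[i]][order[(i + 1) % n]] for i in range(n))

def exhaustive_tsp(dist):
    # Exhaustive: try every permutation of cities 1..n-1 (city 0 fixed as start).
    # Examines (n-1)! tours, so it is only usable for tiny n.
    n = len(dist)
    best = min(itertools.permutations(range(1, n)),
               key=lambda p: tour_length((0,) + p, dist))
    return (0,) + best

def greedy_tsp(dist):
    # Heuristic: always hop to the nearest unvisited city.
    # Runs in O(n^2) but may return a suboptimal tour.
    n = len(dist)
    tour, remaining = [0], set(range(1, n))
    while remaining:
        nxt = min(remaining, key=lambda c: dist[tour[-1]][c])
        tour.append(nxt)
        remaining.remove(nxt)
    return tuple(tour)

# How many tours the exhaustive search must check as n grows:
for n in (5, 10, 20):
    print(n, math.factorial(n - 1))
```

At n = 20 the exhaustive search already faces about 1.2 × 10^17 tours; that's the resource wall the post is talking about, and it's why every practical problem-solver, human or artificial, leans on heuristics.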
I don't see any reason why humans won't be able to build AIs with AT LEAST human-level intelligence. If you see one, care to elaborate?