Wednesday, September 08, 2010

Raw g and AI

You can probably guess why I liked this paragraph, from a job description for a Research Fellow at the Singularity Institute.

So what does it take to get that job done? Well, for starters, sheer raw fluid intelligence, plain old-fashioned Spearman's g. You'll need to know things that aren't in textbooks and apply skills that aren't taught in classes. You'll have to pick things up rapidly, from a few hints, without them being hammered into you. I attended the inaugural symposium of the Redwood Center for Theoretical Neuroscience, and they asked a panel of prestigious experimental neuroscientists what kind of experience they'd most like to see in a hiree. And one said "Neuroscience", and one said "Electrical engineering", and then one said, "I'd rather hire a physicist, because they can learn anything," and the rest all nodded. That's the indispensable quality we're looking for, whether it appears in a physicist or not.

Why I think AI is very hard:

... evolution has compressed a huge amount of information in the structure of our brains (and genes), a process that AI would have to somehow replicate. A very crude estimate of the amount of computational power used by nature in this process leads to a pessimistic prognosis for AI even if one is willing to extrapolate Moore's Law well into the future. Most naive analyses of AI and computational power only ask what is required to simulate a human brain, but do not ask what is required to evolve one. I would guess that our best hope is to cheat by using what nature has already given us -- emulating the human brain as much as possible.
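The quoted argument can be made concrete with a Fermi estimate. Every number below is an assumption chosen purely for illustration (the years of nervous-system evolution, the average population of organisms with brains, the ops/s per organism, and the ops/s needed to simulate a human brain are all rough guesses, not established figures); the point is the size of the ratio, not the particular values.

```python
import math

# --- All constants below are rough assumptions, for illustration only ---

BRAIN_OPS_PER_SEC = 1e16        # assumed cost of simulating one human brain in real time
SECONDS_PER_YEAR = 3.15e7

YEARS_OF_NEURAL_EVOLUTION = 1e9  # assumed time since early nervous systems
AVG_POPULATION = 1e17            # assumed average count of organisms with nervous systems
AVG_OPS_PER_ORGANISM = 1e9       # assumed average "brain" throughput (most are tiny)

# Simulating one brain for 30 subjective years:
simulate_brain_ops = BRAIN_OPS_PER_SEC * 30 * SECONDS_PER_YEAR

# Total neural computation "spent" by evolution over that history:
evolution_ops = (AVG_POPULATION * AVG_OPS_PER_ORGANISM
                 * YEARS_OF_NEURAL_EVOLUTION * SECONDS_PER_YEAR)

ratio = evolution_ops / simulate_brain_ops
doublings = math.log2(ratio)  # extra Moore's-law doublings implied by the gap

print(f"simulate one brain : {simulate_brain_ops:.1e} ops")
print(f"evolve brains      : {evolution_ops:.1e} ops")
print(f"gap: {ratio:.1e}x  (~{doublings:.0f} Moore's-law doublings)")
```

On these arbitrary inputs the evolutionary search costs roughly seventeen orders of magnitude more than simulating a single brain, which is why the pessimism in the quote survives even generous extrapolation of Moore's Law, and why "cheating" by copying what nature already found looks attractive.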
