Friday, March 02, 2007

AI and Google

Here is a little video snippet of Larry Page talking about AI at the recent AAAS meeting. He points out that our genetic information is about 600MB compressed, so smaller than any modern operating system. The connection between compression and AI has been made by many people -- what is intelligence, after all, if not an algorithm capable of taking in data (observations about the world) and turning it into predictions about what will happen next? Prediction (or, equivalently, modeling) is nothing more than compression! Newton's laws plus some initial data compress all the information about the trajectory of a spaceship -- trading bits of stored information (the spacetime coordinates of the trajectory) for CPU flops (necessary to uncompress the trajectory from the initial position and velocity -- i.e., evolve it forward in time).
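The spaceship example can be made concrete with a minimal Python sketch (the numbers and the simple Euler integrator here are purely illustrative): rather than storing every point of a trajectory, we store only the initial conditions and regenerate the points by running the deterministic dynamics forward.

```python
# Illustrative sketch: prediction as compression. Instead of storing
# every point of a trajectory, store only the initial conditions and
# "uncompress" by evolving the equations of motion forward in time
# (here, 1-D motion under constant gravity, integrated with Euler steps).

def evolve(x, v, a=-9.8, dt=0.01, steps=1000):
    """Regenerate the full trajectory from initial position x and velocity v."""
    traj = []
    for _ in range(steps):
        traj.append(x)
        v += a * dt
        x += v * dt
    return traj

# "Raw" data: 1000 stored positions.
full_trajectory = evolve(x=0.0, v=50.0)

# "Compressed" representation: just two numbers plus the law of motion.
compressed = (0.0, 50.0)

# Decompression reproduces the data exactly, because the dynamics are
# deterministic -- we have traded stored bits for CPU flops.
assert evolve(*compressed) == full_trajectory
print(len(full_trajectory), "points regenerated from", len(compressed), "numbers")
```

The trade is exactly the one described above: two floats plus an algorithm stand in for a thousand stored coordinates, at the cost of the arithmetic needed to evolve the system forward.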

Page guesses that AI will result more from "lots of computation" than from "whiteboard stuff" (i.e., we won't really "understand" how it happens from a theoretical or analytical perspective) and that "we aren't as far off as many people think"!

A lot of people like to speculate that Google is working like mad on AI, and indeed certain related problems like machine translation, pattern recognition and, of course, search, are things they devote a lot of resources to. However, the vast majority of their 10,000 employees (yes, the number really has been doubling every year for a while) are working on just keeping the existing services up and running. There isn't yet a blue-sky, Bell Labs-like research arm at Google.

See here for previous related posts. I have a bet with a former PhD student about machines passing a strong version of the Turing test in the next 50 years.
