Saturday, October 24, 2009

Eric Baum: What is Thought?

Last week we had AI researcher and former physicist Eric Baum here as our colloquium speaker. (See here for an 11-minute video of a similar, but shorter, talk he gave at the 2008 Singularity Summit.)

Here's what I wrote about Baum and his book What is Thought back in 2008:

My favorite book on AI is Eric Baum's What is Thought? (Google books version). Baum (former theoretical physicist retooled as computer scientist) notes that evolution has compressed a huge amount of information in the structure of our brains (and genes), a process that AI would have to somehow replicate. A very crude estimate of the amount of computational power used by nature in this process leads to a pessimistic prognosis for AI even if one is willing to extrapolate Moore's Law well into the future. Most naive analyses of AI and computational power only ask what is required to simulate a human brain, but do not ask what is required to evolve one. I would guess that our best hope is to cheat by using what nature has already given us -- emulating the human brain as much as possible.

This perspective seems quite obvious now that I have kids -- their rate of learning about the world is clearly enhanced by pre-evolved capabilities. They're not generalized learning engines -- they're optimized to do things like recognize patterns (e.g., faces), use specific concepts (e.g., integers), communicate using language, etc.

What is Thought?

In What Is Thought? Eric Baum proposes a computational explanation of thought. Just as Erwin Schrödinger in his classic 1944 work What Is Life? argued ten years before the discovery of DNA that life must be explainable at a fundamental level by physics and chemistry, Baum contends that the present-day inability of computer science to explain thought and meaning is no reason to doubt there can be such an explanation. Baum argues that the complexity of mind is the outcome of evolution, which has built thought processes that act unlike the standard algorithms of computer science, and that to understand the mind we need to understand these thought processes and the evolutionary process that produced them in computational terms.

Baum proposes that underlying mind is a complex but compact program that exploits the underlying structure of the world. He argues further that the mind is essentially programmed by DNA. We learn more rapidly than computer scientists have so far been able to explain because the DNA code has programmed the mind to deal only with meaningful possibilities. Thus the mind understands by exploiting semantics, or meaning, for the purposes of computation; constraints are built in so that although there are myriad possibilities, only a few make sense. Evolution discovered corresponding subroutines or shortcuts to speed up its processes and to construct creatures whose survival depends on making the right choice quickly. Baum argues that the structure and nature of thought, meaning, sensation, and consciousness therefore arise naturally from the evolution of programs that exploit the compact structure of the world.

When I first looked at What is Thought? I was under the impression that Baum's meaning, underlying structure and compact program were defined in terms of algorithmic complexity. However, it's more complicated than that. While Nature is governed by an algorithmically simple program (the Standard Model Hamiltonian can, after all, be written down on a single sheet of paper), a useful evolved program has to run in a reasonable amount of time, under resource (memory, CPU) constraints that Nature itself does not face. Compressible does not imply tractable -- all of physics might reduce to a compact Theory of Everything, but it probably won't be very useful for designing jet airplanes.

Useful programs have to be efficient in many ways -- algorithmically and computationally. So it's not a tautology that Nature is very compressible, therefore there must exist compact (useful) programs that exploit this compressibility. It's important that there are many intermediate levels of compression (i.e., description -- as in quarks vs molecules vs bulk solids vs people), and computationally effective programs to deal with those levels. I'm not sure what measure is used in computer science to encompass both algorithmic and computational complexity. Baum discusses something called minimum description length, but it's not clear to me exactly how the requirement of effective means of computation is formalized. In the language of physicists, Baum's compact (useful) programs are like effective field theories incorporating the relevant degrees of freedom for a certain problem -- they are not only a compressed model of the phenomena, but also allow simple computations.
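A toy illustration of the distinction (my own, not from the book): the Fibonacci recurrence is about the most compact description of the function you can write, but evaluating it directly takes exponential time, while a slightly longer program that exploits the structure of the problem (reuse of subproblems) runs in linear time. Compactness of description and efficiency of computation are separate virtues.

```python
def fib_spec(n):
    # The shortest "description" of Fibonacci: the defining recurrence.
    # Maximally compact, but runs in exponential time.
    return n if n < 2 else fib_spec(n - 1) + fib_spec(n - 2)

def fib_fast(n):
    # A longer program computing the same function, but one that
    # exploits structure (shared subproblems) to run in linear time.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Same function, very different computational cost:
assert fib_spec(20) == fib_fast(20) == 6765
```

In Baum's terms, an evolved "compact (useful) program" has to be more like fib_fast than fib_spec: short enough to be found and encoded, but also efficient enough to be run by a creature that must decide quickly.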

Evolution has, using a tremendous amount of computational power, found these programs, and our best hope for AI is to exploit their existence to speed our progress. If Baum is correct, the future may be machine learning guided by human Mechanical Turk workers.

Baum has recently relocated to Berkeley to pursue a startup based on his ideas. (Ah, the excitement! I did the same in 2000 ...) His first project is to develop a world class Go program (no lack of ambition :-), with more practical applications down the road. Best of luck!


Ian Smith said...

Whatever thought is, Baum did none of it in his book.

Luke Lea said...

Here is a tougher one: what are feelings?

BwO said...

If nature is governed by an algorithmically simple program, then why did the need for algorithmically complex and computationally efficient programs evolve at all? Why did nature ever evolve constraints on itself?

Baum is just providing a modern gloss on the bankrupt AI idea of thought as search or compression in order to solve a problem. These guys never deal with the hard issue of how the problem gets specified to begin with, and so never deal with novelty and creativity in any fundamental way.

Seth said...

I enjoyed reading Baum's book after you posted about it many moons ago. He makes a number of sound observations, but as Clark just noted, they don't seem genuinely original, nor are they likely to be sufficient to make a dramatic difference.

For example, consider the point Baum makes about subsystems knowing how to adapt to changes in other subsystems. How is the fact that muscles seek to attach to bones, or blood vessels seek cells signaling for oxygen, really different from the way shapes in Visio know about their relationships to other shapes? Put another way, how is the modularity he suggests any different from Object Oriented programming, Aspect Oriented programming or (more esoterically) Charles Simonyi's notion of "intentional programming"?

But if Baum can come up with incrementally more productive development tools for software, that would be cool. Of course, right now the world economy prices software developers cheaply enough that there is no margin in better tools.

At a big picture level, I've long been persuaded that learning from evolution is an important part of creating "AI" (however defined). But I soon realized that this is already going on every day. Programmers are busy mutating code bases (often in a deleterious manner) and only harsh selection pressure from integration processes and QA keeps large software projects evolving in 'fitness enhancing' directions. The trick is to speed the whole process up. But nobody in academic life is interested in the traffic jam called software integration.

And, Clark, while it would be VERY nice to explain where the constraints came from, that's a loftier objective (maybe equivalent to "strong AI"). We could get a lot of mileage out of more modest progress in 'directed evolution' of software. And that progress might include a lot of rather dramatic advances in computers that 'walk and talk', so to speak, without quite being philosophically satisfactory "AI".

Carson C. Chow said...

"I'm not sure what measure is used in computer science to encompass both algorithmic and computational complexity."

By algorithmic complexity you linked to Kolmogorov complexity, which is the minimum description length of a string. Computational complexity refers to the asymptotic resources required to solve a problem, like the classes P, BPP, NP, PSPACE, etc.

The question I believe you are posing is how the two are related. The easy answer is that this is undecidable, because the computation of Kolmogorov complexity is undecidable. The more useful answer is that they are certainly unrelated, since in 140 characters I can pose a problem that is in P, in NP, or undecidable.

BwO said...

STS is a more generous and reasonable soul than I, and I agree that if Baum can make some incrementally better programs, even without confronting the biggest questions -- well, that's probably a good place to start, and can be very useful.

I should also wait till the book appears from Amazon before passing judgment, instead of just watching the movie. Here as well I should defer to STS and others more informed than I.

As a former physicist, I just get upset at the arrogance of a physicist/computer scientist who proposes to speak about "what thought is", pretty obviously without having ever cracked open a philosophy book (or even a history book -- doesn't anybody remember cybernetics?). Physics has a great deal to contribute to this debate, but people have been thinking about thinking for a *very* long time, and it pays to understand their perspective, even if they have attacked the question from a very different angle.

So I'm all for better software. Just please spare me the "this is thought" bit. Computers are the next step in thinking; let's not get carried away; we still don't know what thinking actually is.

Steve Hsu said...

Clark, you can read chunks of the book via the Google Books link on the original post. Have a look and then tell us what you think!

robin said...

thanks for sharing..
