Saturday, May 12, 2007

Gladwell and genius

Malcolm Gladwell shows exquisite taste in the subjects he writes and talks about -- he has a nose for great topics. I just wish his logical and analytical capabilities were better (see also here). This talk at the New Yorker's recent Genius 2012 conference is entertaining, but I disagree completely with his conclusion. Ribet, Wiles, Taniyama and Shimura are probably the real geniuses, not Michael Ventris, the guy who deciphered Linear B. (Gladwell also can't seem to remember that it's the Taniyama-Shimura conjecture, not Tanimara. He says it incorrectly about 10 times.) My feeling is that Gladwell's work appeals most to people who can't quite understand what he is talking about.

Gladwell is confused about the exact topic discussed in James Gleick's book Genius. In a field where the sampling of talent is sparse (e.g., deciphering ancient scripts) you might find one giant (even an amateur like Michael Ventris) towering above the others, able to do things others cannot. In a well-developed, highly competitive field like modern mathematics, all the top players are "geniuses" in some sense (rare talents, one in a million), even though they don't stand out very much from each other. In Gleick's book, Feynman, discussing how long it might have taken to develop general relativity had Einstein not done it, says "We are not that much smarter than each other"!

To put it simply, if I sample sparsely from a Gaussian distribution, I might find a super-outlier in the resulting set. If I sample densely and impose a high minimum cutoff for acceptable points, I will end up with a set composed entirely of outliers whose members nevertheless do not stand out much from each other. Every guard in the NBA is an athletic freak of nature, even though the guards are relatively evenly matched when playing against each other.
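A quick simulation makes this concrete (a sketch in Python; the standard-normal "talent" distribution, the +3 sigma cutoff, and the sample sizes are my illustrative choices, not anything from the argument above):

```python
import random

random.seed(0)

CUTOFF = 3.0  # illustrative "elite" bar, in standard deviations

# Dense sampling with a high minimum cutoff: draw a million times from a
# standard Gaussian and keep only the points above +3 sigma.
elite = [x for x in (random.gauss(0.0, 1.0) for _ in range(1_000_000))
         if x > CUTOFF]

# Every member of `elite` is roughly a one-in-a-thousand outlier, yet the
# Gaussian tail decays so fast that most of them land within half a sigma
# of the bar they had to clear -- they do not stand out from each other.
near_bar = sum(1 for x in elite if x < CUTOFF + 0.5) / len(elite)
print(f"{len(elite)} elites, {near_bar:.0%} within 0.5 sigma of the cutoff")

# Sparse sampling: with only a handful of draws from the same population,
# one lucky draw can tower over the rest of the sample.
sparse = sorted(random.gauss(0.0, 1.0) for _ in range(10))
print(f"sparse sample: best {sparse[-1]:.2f}, runner-up {sparse[-2]:.2f}")
```

Because the Gaussian tail falls off super-exponentially, the large majority of the above-cutoff points cluster just past the cutoff: a densely sampled elite looks evenly matched, exactly as with NBA guards.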

To counteract the intelligence-damping effect of Gladwell's talk, I suggest this podcast interview with Nassim Taleb, about his new book The Black Swan. Warning: may be psychologically damaging to people who fool themselves and others about their ability to predict the behavior of nonlinear systems.

1 comment:

  1. To put it simply, if I sample sparsely from a Gaussian distribution, I might find a super-outlier in the resulting set. If I sample densely and impose a high minimum cutoff for acceptable points, I will end up with a set composed entirely of outliers whose members nevertheless do not stand out much from each other.

    IMHO a different scenario is worth mentioning.

    Suppose the probability distribution of "talent" has a sufficiently fat power-law tail above a cutoff. In that case, random samples above the cutoff will not clump near the cutoff; they will be scattered over an extended range.
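    The commenter's scenario can be sketched the same way (assumptions mine: a Pareto tail above the same cutoff, with a hypothetical tail exponent alpha = 1.5):

```python
import random

random.seed(1)

CUTOFF = 3.0
ALPHA = 1.5  # hypothetical tail exponent; smaller alpha means a fatter tail

def pareto_tail(alpha):
    """Sample from the Pareto law P(X > x) = (CUTOFF / x)**alpha, x >= CUTOFF."""
    u = 1.0 - random.random()  # uniform in (0, 1], avoids division by zero
    return CUTOFF / u ** (1.0 / alpha)

fat = sorted(pareto_tail(ALPHA) for _ in range(1000))
near_bar = sum(1 for x in fat if x < CUTOFF + 0.5) / len(fat)

# Unlike the Gaussian case, samples above the cutoff do not clump near it:
# the median sits well above the bar, and the top draw is far out in the tail.
print(f"median {fat[500]:.1f}, max {fat[-1]:.1f}, "
      f"{near_bar:.0%} within 0.5 of the cutoff")
```

    With a fat enough tail, only a minority of the above-cutoff sample sits near the cutoff, so the elite set is scattered rather than clumped -- the opposite of the Gaussian picture.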
