Pessimism of the Intellect, Optimism of the Will     Archive   Favorite posts   Twitter: @steve_hsu

Wednesday, August 31, 2011

Epistasis vs additivity

Continuing the discussion from my previous post: strong interactions at the level of individual genes do not preclude a linear (additive) analysis of population variation and natural selection.

On epistasis: why it is unimportant in polygenic directional selection

[Phil. Trans. R. Soc. B (2010) 365, 1241–1244 doi:10.1098/rstb.2009.0275]

James F. Crow*
Genetics Laboratory, University of Wisconsin, Madison, WI 53706, USA

There is a difference in viewpoint of developmental and evo-devo geneticists versus breeders and students of quantitative evolution. The former are interested in understanding the developmental process; the emphasis is on identifying genes and studying their action and interaction. Typically, the genes have individually large effects and usually show substantial dominance and epistasis. The latter group are interested in quantitative phenotypes rather than individual genes. Quantitative traits are typically determined by many genes, usually with little dominance or epistasis. Furthermore, epistatic variance has minimum effect, since the selected population soon arrives at a state in which the rate of change is given by the additive variance or covariance. Thus, the breeder’s custom of ignoring epistasis usually gives a more accurate prediction than if epistatic variance were included in the formulae.

Why did Crow have to write this 2010 paper? Don't evo-devo folks understand population genetics? Why do they find the dominance of additive heritability to be so counter-intuitive? Which of the two groups of scientists has a better understanding of how evolution works? Evo-devo folks seem to be from the traditional "revel in complexity" branch of biology: perfectly happy to find that living creatures are too complicated to be modeled by equations. (But are they?)

Some excerpts from the paper:

... Recent years have seen an increased emphasis on epistasis (e.g. Wolf et al. 2000; Carlborg & Haley 2004). Students of development and evo-devo, as well as some human geneticists, have paid particular interest to interactions. For those in these fields, epistasis is an interesting phenomenon on its own and studying it gives deeper insights into developmental and evolutionary processes. Ultimately one wants to know which individual genes are involved, and if one is studying the effects of such genes, it is natural to consider the ways in which they interact. Historically, among many other uses, epistasis has provided a means for identifying steps in biochemical and developmental sequences. More generally, including epistasis is part of the description of gene effects. So epistasis, despite methodological challenges, is usually welcomed as providing further insights. Students of development or evo-devo typically study genes of major effect. Of course, genes with major effects are more easily discovered, so they may be providing a biased sample. But we can say that at least some of the genes involved have large effects. And such genes typically show considerable dominance and epistasis.

In contrast, animal and plant breeders have traditionally regarded epistasis as a nuisance, akin to noise in impeding or obscuring the progress of selection. It may seem surprising that the traditional practice of ignoring epistasis has not led to errors in prediction equations. Why? It is this seeming paradox that I wish to discuss.

Continuously distributed quantitative traits typically depend on a large number of factors, each making a small contribution to the quantitative measurement. In general, the smaller the effects, the more nearly additive they are. Experimental evidence for this is abundant. This is expected for reasons analogous to those for which taking only the first term of a Taylor series provides a good estimate. ...

The most extensive selection experiment, at least the one that has continued for the longest time, is the selection for oil and protein content in maize (Dudley 2007). These experiments began near the end of the nineteenth century and still continue; there are now more than 100 generations of selection. Remarkably, selection for high oil content and similarly, but less strikingly, selection for high protein, continue to make progress. There seems to be no diminishing of selectable variance in the population. The effect of selection is enormous: the difference in oil content between the high and low selected strains is some 32 times the original standard deviation.

... Students of development, evo-devo and human genetics often place great emphasis on epistasis. Usually they are identifying individual genes, and naturally the interactions among these are of the very essence of understanding. The individual gene effects are usually large enough for considerable epistasis to be expected.

Quantitative genetics has a contrasting view. The foregoing analysis shows that, under typical conditions, the rate of change under selection is given by the additive genetic variance or covariance. Any attempt to include epistatic terms in prediction formulae is likely to do more harm than good. Animal and plant breeders who ignored epistasis, for whatever reasons, good or bad, were nevertheless on the right track. And prediction formulae based on simple heritability measurements are appropriate.

The power of using microscopic knowledge (genes) to develop macroscopic theory (phenotypes), whereby phenotypic measurements are used to develop prediction formulae, is beautifully illustrated by quantitative genetics theory.
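Crow's Taylor-series point is easy to check numerically. In the toy model below (my own illustration, not from the paper), the trait is multiplicative across loci, so gene action is epistatic by construction; each locus is linear in allele count, so any non-additive variance comes purely from the across-locus interactions.

```python
from itertools import product
from math import comb

def additive_fraction(s, n=6, p=0.5):
    """Fraction of genetic variance that is additive when the trait is
    multiplicative across n loci: P = prod_i (1 + s * z_i), with allele
    counts z_i ~ Binomial(2, p). Toy model, not from Crow's paper."""
    pz = {z: comb(2, z) * p**z * (1 - p)**(2 - z) for z in (0, 1, 2)}
    ez = sum(z * w for z, w in pz.items())            # E[allele count]
    vz = sum((z - ez) ** 2 * w for z, w in pz.items())
    mean = second_moment = 0.0
    covs = [0.0] * n                                  # Cov(trait, z_i)
    for geno in product((0, 1, 2), repeat=n):
        w, val = 1.0, 1.0
        for z in geno:
            w *= pz[z]
            val *= 1 + s * z
        mean += w * val
        second_moment += w * val * val
        for i, z in enumerate(geno):
            covs[i] += w * (z - ez) * val
    v_g = second_moment - mean * mean                 # total genetic variance
    v_a = sum(c * c / vz for c in covs)               # variance of best linear fit
    return v_a / v_g

print(additive_fraction(0.1))   # small effects: ~0.99 of variance is additive
print(additive_fraction(1.0))   # large effects: only ~0.73
```

With a large per-locus effect (s = 1) the interactions matter, but at s = 0.1 the additive fraction is already about 0.99, just as the first-term-of-a-Taylor-series analogy suggests.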

Can we understand evolution without mathematics? Two more useful references:

Statistical Mechanics and the Evolution of Polygenic Quantitative Traits

The Evolution of Multilocus Systems Under Weak Selection



Note I am at BGI right now so there may be some latency in communication.

Monday, August 29, 2011

Footnotes and citations

Two important points from my talk on cognitive genomics, with references.

1. Most of the genetic variation in intelligence is additive. This may be confusing to those infected by the epigenetics revolution meme. Yes, epigenetics is important, but fortunately for us linear effects still dominate the population variation* of quantitative traits. As any engineer or physicist can attest, linearity is our best friend :-)

Data and Theory Point to Mainly Additive Genetic Variance for Complex Traits (PLoS Genetics)

The relative proportion of additive and non-additive variation for complex traits is important in evolutionary biology, medicine, and agriculture. We address a long-standing controversy and paradox about the contribution of non-additive genetic variation, namely that knowledge about biological pathways and gene networks imply that epistasis is important. Yet empirical data across a range of traits and species imply that most genetic variance is additive. We evaluate the evidence from empirical studies of genetic variance components and find that additive variance typically accounts for over half, and often close to 100%, of the total genetic variance. We present new theoretical results, based upon the distribution of allele frequencies under neutral and other population genetic models, that show why this is the case even if there are non-additive effects at the level of gene action. We conclude that interactions at the level of genes are not likely to generate much interaction at the level of variance. [italics mine]

* See comments for more discussion!


2. A conservative estimate is that a million or so people will get full sequencing in the next 5-10 years. This would cost $1 billion at $1k per genome (almost doable today), which is roughly what the original human genome project cost. I'd guess at least several times that number will get SNP genotyped in the next 5 years. My estimates will seem laughably conservative if recent sequencing price trends continue. The paper below (see, e.g., its Table 2) suggests that, given the appropriate phenotype data, sample sizes of a million should be enough to capture most of the genetic variance for traits like height. I expect intelligence to be similar.

Estimation of effect size distribution from genome-wide association studies and implications for future discoveries (Nature Genetics)

We report a set of tools to estimate the number of susceptibility loci and the distribution of their effect sizes for a trait on the basis of discoveries from existing genome-wide association studies (GWASs). We propose statistical power calculations for future GWASs using estimated distributions of effect sizes. Using reported GWAS findings for height, Crohn's disease and breast, prostate and colorectal (BPC) cancers, we determine that each of these traits is likely to harbor additional loci within the spectrum of low-penetrance common variants. These loci, which can be identified from sufficiently powerful GWASs, together could explain at least 15–20% of the known heritability of these traits. However, for BPC cancers, which have modest familial aggregation, our analysis suggests that risk models based on common variants alone will have modest discriminatory power (63.5% area under curve), even with new discoveries.
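The kind of power calculation the paper proposes can be sketched in a few lines. The numbers below are illustrative only (my own back-of-envelope, not the paper's method): a one-degree-of-freedom association test for a variant explaining a fraction q of phenotypic variance has noncentrality roughly N·q, and the usual genome-wide significance threshold is alpha = 5e-8.

```python
from statistics import NormalDist

nd = NormalDist()

def gwas_power(n, q, alpha=5e-8):
    """Approximate power of a 1-df association test for a variant
    explaining a fraction q of phenotypic variance, sample size n.
    The test statistic is ~ N(sqrt(n*q), 1) under the alternative."""
    z = nd.inv_cdf(1 - alpha / 2)
    ncp = (n * q) ** 0.5
    return nd.cdf(ncp - z) + nd.cdf(-ncp - z)

def n_for_power(q, target=0.8, alpha=5e-8):
    """Smallest n (by bisection; power is monotone in n) with power >= target."""
    lo, hi = 1, 10**8
    while lo < hi:
        mid = (lo + hi) // 2
        if gwas_power(mid, q, alpha) >= target:
            hi = mid
        else:
            lo = mid + 1
    return lo

print(gwas_power(10_000, 4e-4))   # a 10k-person study: essentially no power
print(n_for_power(4e-4))          # ~1e5 samples for a typical small effect
```

This is why GWAS sample sizes in the hundreds of thousands are needed: a variant explaining only 0.04% of the variance requires roughly 100,000 samples for 80% power.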

I'm off to BGI tomorrow ...

Saturday, August 27, 2011

Gattaca

I just saw this for the first time in HD. It's visually quite stunning.





Embryo selection, but no additional engineering:

Geneticist (Blair Underwood): Keep in mind, this child is still you -- simply the best of you. You could conceive naturally a thousand times and never get such a result ...
According to this discussion, an offer of enhancement didn't make the final cut:
In an outtake to the movie, the geneticist states that for an extra $5,000 he could give the embryo enhanced musical or mathematical skills – essentially splicing in a gene that was not present on the parents’ original DNA.

Thursday, August 25, 2011

Video of Google talk on cognitive genomics



This is video of the talk I gave at Google.

I haven't watched it yet -- I'm worried that the audio is a bit patchy because I kept stepping away from the microphone on the podium. Probably not one of my best performances, but I think I got the main ideas across :-)

Pais: Pauli aspie?



From The Genius of Science, a portrait gallery of 20th century physicists by Abraham Pais.

So it came about that I met Pauli for the first time in Denmark, in early 1946, at a dinner party in Bohr's home. At that time he had already long been recognized as one of the major figures in 20th century physics ... I witnessed for the first time his chassidic mode, a gentle rhythmic to and fro rocking of the upper torso... "No, perhaps you don't know much, perhaps you don't know much." A moment later: "Ich weiss mehr" (I know more). That was said in the Pauli style, without aggression, merely an expression of a statement of fact.

More from Pais.

Tuesday, August 23, 2011

Paleo man

Dan and I were Junior Fellows at the same time (even in the same year, IIRC). I was never that interested in "bone-diggin-ology" but Dan was always a good person to talk to about the subject. I wonder if Dan follows a paleo diet.

NYTimes: Among his academic peers, Daniel Lieberman, 47, is known as a “hoof and mouth” man.

That’s because Dr. Lieberman, an evolutionary biology professor at Harvard, spends his time studying how the human head and foot have evolved over the millenniums. In January, Harvard University Press published his treatise, “The Evolution of the Human Head.” ...

Q. Why heads?

A. Our heads are what make our species interesting. If you were to meet a Neanderthal or a Homo erectus, you’d see that they are the same as us — except from the neck up. We’re different in our noses, ears, teeth, how we swallow and chew. When you think about what makes us human, it’s our big brains, complex thought and language. We speak with our heads, breathe and smell with our heads. So understanding how we got these heads is vital for knowing who we are and what we are doing on this planet.

Q. Are there any practical benefits to your research?

A. There are. A majority of the undergraduates who register for my evolutionary anatomy and physiology class here at Harvard are pre-medical students. Learning this will help them become better doctors. Many of the conditions they’ll be treating are rooted in the mismatch between the world we live in today and the Paleolithic bodies we’ve inherited.

For example, impacted wisdom teeth and malocclusions are very recent problems. They arise because we now process our food so much that we chew with little force. These interactions affect how our faces grow, which causes previously unknown dental problems. Hunter-gatherers — who live in ways similar to our ancestors — don’t have impacted wisdom teeth or cavities. There are many other conditions rooted in the mismatch — fallen arches, osteoporosis, cancer, myopia, diabetes and back trouble. So understanding evolutionary biology will definitely help my students when they become orthopedists, orthodontists and craniofacial surgeons. ...

Q. Do you run barefoot?

A. Only in the summer. Obviously, you cannot run barefoot in a New England winter! Then, I use a shoe that brings me more toward the barefoot style. It’s called a “minimal shoe,” and it’s more like a glove for the foot. Some people tell me it looks silly. But I like the way it feels. And I love running barefoot when I can. You get all this wonderful sensory pleasure from your feet. You feel the grass and the sensation of the earth. You get bathed by sensation. There are a lot of sensory nerves in the feet.

Right now, every sports gear company is developing a line of these minimal shoes. One company, I should inform you, has helped fund some of my laboratory research, though I’ve not had anything to do with their product.

Q. Is your research part of a trend?

A. It’s part of this movement to try to listen to evolution in our bodies. We evolved to eat different diets, to run differently and live differently from the ways we do today. People are looking to evolution to find out how our bodies adapted and what might be healthier for us. That’s good.

Friday, August 19, 2011

Googleplex action photos

This is a photo from the talk I gave yesterday at Google to kick off our US intelligence GWAS. See www.cog-genomics.org for more information. If you're having any problems with the site, especially with participant registration, please let us know. We've received a great deal of feedback about the study in just the last 48 hours, so please be patient if you sent me a message.



There were a lot of good questions from the audience at the talk. I only had an hour so I had to go through the slides rather quickly. People seemed quite enthusiastic about the project :-)



Wednesday, August 17, 2011

@Google: Genetics and Intelligence

I'll be giving a talk at Google tomorrow (Thursday August 18) at 5 pm. The slides are here. The video will probably be available on Google's TechTalk channel on YouTube, perhaps after some delay.

The Cognitive Genomics Lab at BGI is using this talk to kick off the drive for US participants in our intelligence GWAS. More information at www.cog-genomics.org, including automatic qualifying standards for the study, which are set just above +3 SD. Participants will receive free genotyping and help with interpreting the results. (The functional part of the site should be live after August 18.)

Title: Genetics and Intelligence

Abstract: How do genes affect cognitive ability? I begin with a brief review of psychometric measurements of intelligence, introducing the idea of a "general factor" or IQ score. The main results concern the stability, validity (predictive power), and heritability of adult IQ. Next, I discuss ongoing Genome Wide Association Studies which investigate the genetic basis of intelligence. Due mainly to the rapidly decreasing cost of sequencing, it is likely that within the next 5-10 years we will identify genes which account for a significant fraction of total IQ variation.

We are currently seeking volunteers for a study of high cognitive ability. Participants will receive free genotyping.

Sunday, August 14, 2011

Clark, Cowen, DeLong discuss genetics and deep economic history



Tyler Cowen, Brad DeLong and Greg Clark discuss A Farewell to Alms in a seminar from three or four years ago. Thanks to Jason Collins for the link.

I liked Tyler's overview (@36min), which emphasizes that culture, genes and institutions all affect economic growth. He lists of order 10 factors (17!) that impact the industrial revolution, and notes that we have only one historical data point. (To be more accurate we probably have a few, but certainly not enough.) Therefore there must be many models consistent with the facts. (This is of course the fundamental problem for economics and social science, and why progress is so hard.) My review of Clark's book made some similar points. In noting that Clark's ideas are sometimes too simplistic, we should keep in mind that he is primarily an empiricist (economic historian), not a theoretician. He's the guy who pored over ancient British records to obtain demographic data.

DeLong discusses population genetics @31min. Jason recapitulates the argument. My comment on the post (typos corrected) is below.

Brad’s calculation is a bit unrealistic. Most traits are controlled by large numbers of genes and there is a huge amount of extant variation, even within families. No new mutations or special “patience genes" are required for evolutionary change.

A more realistic calculation, one that takes into account heritabilities less than one, the limited correlation between phenotype and reproduction rate, etc., leads to about 1000 years as the fastest timescale for a shift of about one standard deviation in the population mean. There has been plenty of time since the dawn of civilization or of agriculture for humans to have changed significantly. A few hundred years is probably not enough time, except perhaps in some exceptional cases of really strong selection. I doubt Clark is right that the industrial revolution is primarily a consequence of selection pressure in England, although I agree that the British today are probably quite different from their ancestors of one or a few thousand years ago.

http://infoproc.blogspot.com/2011/08/demography-and-fast-evolution.html

Apologies for typos — getting used to an iPad.

Clark responds to a question about Guns, Germs and Steel @59min. Pomeranz and Gunder Frank @1:03. Clark returns to more realistic genetic models @1:09. Clark's interest in the heritability of future time preference was stimulated by observations of his three children. Discussion heats up @1:13 -- Brad hews to the extreme environmentalist line whereas Tyler is in the middle. @1:17 Brad again reveals some misunderstanding of population genetics.

Friday, August 12, 2011

I love Jack Kirby




His crude but expressive drawing style and mythic sensibilities made him unique among early comic artists.

Silver Surfer #18 was the prize of my grade school comic book collection. It's the issue in which the Surfer first encounters Black Bolt and the Inhumans. Richard Gere's character Jesse Lujack is seen reading it in Breathless, the 1983 remake of Godard's À bout de souffle.


Svante Pääbo New Yorker profile

Very nice profile of Svante Pääbo in the New Yorker. (Subscription only.)

Pääbo's father was a Nobel laureate and I think the son has a good shot as well. What impresses me most is his creativity and willingness to take on difficult projects. Video of a 2008 lecture by Pääbo.

New Yorker: ... Svante Pääbo heads the evolutionary genetics department at the Max Planck Institute for Evolutionary Anthropology, in Leipzig, Germany. At any given moment, he has at least half a dozen research efforts in progress, all attempting to solve the question of what defines us as human. Pääbo’s most ambitious project to date, which he has assembled an international consortium to assist him with, is an attempt to sequence the entire genome of the Neanderthal. The project is about halfway complete and has already yielded some unsettling results, including the news that modern humans, before doing in the Neanderthals, must have interbred with them. Once the Neanderthal genome is complete, scientists will be able to lay it gene by gene against the human genome, and see where they diverge. “I want to know what changed in fully modern humans, compared with Neanderthals, that made a difference,” Pääbo said. “What made it possible for us to build up these enormous societies, and spread around the globe.” Pääbo, who is now fifty-six, grew up in Stockholm, the product of a love affair between his mother and a married biochemist named Sune Bergström. From an early age, he was interested in old things. In the early nineteen-eighties, he was doing doctoral research on viruses when he began fantasizing about mummies. His paper on mummy DNA became the cover article in Nature magazine. Pääbo moved to the University of California at Berkeley and, later, became a professor at the University of Munich. The first Neanderthal was found in a limestone cave about forty-five miles north of Bonn, in an area known as the Neander Valley. Describes the history of Neanderthal research. Mentions 454 Life Sciences. Toward the end of 2006, Pääbo and his team reported that they had succeeded in sequencing a million base pairs of the Neanderthal genome. But later analysis revealed that the million base pairs had probably been contaminated by human DNA. 
Pääbo’s research eventually showed that before modern humans “replaced” the Neanderthals, they had sex with them. The liaisons produced children, who helped to people Europe, Asia, and the New World. All non-Africans carry somewhere between one and four per cent Neanderthal DNA. From the archeological records, it’s inferred that Neanderthals evolved in Europe or Western Asia and spread out from there, stopping when they reached water or some other significant obstacle. This is one of the most basic ways modern humans differ from Neanderthals and, in Pääbo’s view, also one of the most intriguing. If the defining characteristic of modern humans is a sort of Faustian restlessness, or “madness,” then, by Pääbo’s account, there must be some sort of Faustian gene.

Thursday, August 11, 2011

A problem for data scientists

If flash mobs or riots (like the ones in London) are organized using Twitter, Facebook, BlackBerry and SMS, won't it be very easy to catch the people responsible? Not only are the organizers / initiators easy to track down, but with geolocation (GPS or cell tower) and a court order it would be easy to determine whether any particular individual had participated. Perhaps current privacy laws prevent that data from being stored, but we can easily modify the laws if necessary.

Where are those law enforcement data scientists when you need them? :-)

The rise of data science

See also this follow up article from O'Reilly Radar, and the earlier post Exuberant geeks.

What is data science: ... Data science requires skills ranging from traditional computer science to mathematics to art. Describing the data science group he put together at Facebook (possibly the first data science group at a consumer-oriented web property), Jeff Hammerbacher said:

"... on any given day, a team member could author a multistage processing pipeline in Python, design a hypothesis test, perform a regression analysis over data samples with R, design and implement an algorithm for some data-intensive product or service in Hadoop, or communicate the results of our analyses to other members of the organization."

Where do you find the people this versatile? According to DJ Patil, chief scientist at LinkedIn (@dpatil), the best data scientists tend to be "hard scientists," particularly physicists, rather than computer science majors. Physicists have a strong mathematical background, computing skills, and come from a discipline in which survival depends on getting the most from the data. They have to think about the big picture, the big problem. When you've just spent a lot of grant money generating data, you can't just throw the data out if it isn't as clean as you'd like. You have to make it tell its story. You need some creativity for when the story the data is telling isn't what you think it's telling.

... Entrepreneurship is another piece of the puzzle. Patil's first flippant answer to "what kind of person are you looking for when you hire a data scientist?" was "someone you would start a company with." That's an important insight: we're entering the era of products that are built on data. We don't yet know what those products are, but we do know that the winners will be the people, and the companies, that find those products. Hilary Mason came to the same conclusion. Her job as scientist at bit.ly is really to investigate the data that bit.ly is generating, and find out how to build interesting products from it. No one in the nascent data industry is trying to build the 2012 Nissan Stanza or Office 2015; they're all trying to find new products. In addition to being physicists, mathematicians, programmers, and artists, they're entrepreneurs.

Data scientists combine entrepreneurship with patience, the willingness to build data products incrementally, the ability to explore, and the ability to iterate over a solution. They are inherently interdisciplinary. They can tackle all aspects of a problem, from initial data collection and data conditioning to drawing conclusions. They can think outside the box to come up with new ways to view the problem, or to work with very broadly defined problems: "here's a lot of data, what can you make from it?"

The future belongs to the companies who figure out how to collect and use data successfully. Google, Amazon, Facebook, and LinkedIn have all tapped into their datastreams and made that the core of their success. They were the vanguard, but newer companies like bit.ly are following their path. Whether it's mining your personal biology, building maps from the shared experience of millions of travellers, or studying the URLs that people pass to others, the next generation of successful businesses will be built around data.

Here is a nice talk on machine learning and data science by Hilary Mason of bit.ly. One of my students will be working with her starting in the fall.

Tuesday, August 09, 2011

Demography and fast evolution

In an earlier post I discussed the population history uncovered by Gregory Clark in his book A Farewell to Alms. By examining British wills, he showed that the rich literally replaced (outreproduced) the poor over a period of several centuries.



The excerpt below is from a review of Greg Clark's book. The review is mostly negative about Clark's big picture conclusions, but does provide some interesting historical information. Note, the reviewer does not seem to understand population genetics (see discussion further below).

The comparison of Beijing nobility and Liaoning peasants is drawn from Lee and Wang’s (1999) survey of Chinese demography, which, in turn, is based on a very detailed investigation of population in Liaoning by Lee and Campbell (1997). In Liaoning, all men had military obligations and were enumerated in the so-called banner roles, which described their families in detail. Individuals’ occupations were also noted, so that fertility can be compared across occupational groups. High status, high income occupations had the most surviving sons: for instance, soldiers aged 46–50 had on average 2.57 surviving sons, artisans had 2.42 sons, and officials had 2.17 sons. In contrast, men aged 46–50 who were commoners had only 1.55 sons on average.

The references cited are

Lee, James Z., and Cameron D. Campbell. 1997. Fate and Fortune in Rural China: Social Organization and Population Behavior in Liaoning 1774–1873. Cambridge and New York: Cambridge University Press.

Lee, James Z., and Feng Wang. 1999. One Quarter of Humanity: Malthusian Mythology and Chinese Realities, 1700–2000. Cambridge and London: Harvard University Press.

So we have at least two documented cases of the descendants of the rich replacing the poor over an extended period of time. My guess is that this kind of population dynamics was quite common in the past. (Today we see the opposite pattern!) Could this type of natural selection lead to changes in quantitative, heritable traits over a relatively short period of time?

Consider the following simple model, where X is a heritable trait such as intelligence or conscientiousness or even height. Suppose that X has narrow sense heritability of one half. Divide the population into 3 groups:

Group 1: bottom 1/6 in X; more than 1 SD below the mean
Group 2: middle 2/3 in X; within 1 SD of the mean
Group 3: top 1/6 in X; more than 1 SD above the mean

Suppose that Group 3 has a reproductive rate which is 10% higher than Group 2, whereas Group 1 reproduces at a 10% lower rate than Group 2. A relatively weak correlation between X and material wealth could produce this effect, given the demographic data above (the rich outreproduced the poor almost 2 to 1!). Now we can calculate the change in the population mean of X over a single generation. In units of SDs, and taking the tail-group means to be roughly ±1 SD, the mean shifts by about 1/6 × (0.1 + 0.1) × 1/2, or 0.02 SD per generation. (I assumed assortative mating by group; the factor of 1/2 is the heritability.) Thus it would take roughly 50 generations, or about 1000 years, under such conditions for the population to experience a 1 SD shift in X.
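The rough arithmetic above implicitly treats the tail-group means as ±1 SD. A slightly more careful version (my own sketch, same assumptions otherwise) uses the exact truncated-normal tail means and the breeder's equation R = h²S:

```python
from math import erf, exp, pi, sqrt

def phi(x):      # standard normal pdf
    return exp(-x * x / 2) / sqrt(2 * pi)

def Phi(x):      # standard normal cdf
    return 0.5 * (1 + erf(x / sqrt(2)))

h2 = 0.5                        # narrow-sense heritability of X
f_tail = 1 - Phi(1.0)           # fraction beyond +1 SD: ~0.159 (the "1/6")
mean_tail = phi(1.0) / f_tail   # mean of the upper tail: ~1.53 SD, not 1 SD

# Selection differential: top group reproduces 10% more, bottom 10% less,
# and the two tails contribute symmetrically.
S = f_tail * 0.10 * mean_tail * 2
R = h2 * S                      # breeder's equation: response per generation

print(round(R, 4))              # ~0.024 SD per generation
print(round(1 / R))             # ~41 generations for a 1 SD shift
```

This gives about 0.024 SD per generation, i.e. roughly 40-50 generations (about a millennium) for a 1 SD shift, consistent with the rough estimate above.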

If you weaken the correlation between X and reproduction rate, or relax the assortative mating assumption, you get a longer timescale. But it's certainly plausible that 10,000 years is more than enough for this kind of evolution. For example, we might expect that the advent of agriculture over such timescales changed humans significantly from their previous hunter gatherer ancestors.

This model is overly simple, and the assumptions are speculative. Nevertheless, it addresses some deep questions about human evolution: How fast did it happen? How different are we from humans who lived a few or ten thousand years ago? Did different populations experience different selection pressures? Amazingly, we may be able to answer some of these questions in the near future.

Thanks to Henry Harpending for reminding me about the Chinese data and about the question of fastest plausible evolution for a quantitative trait.

Intelligence: heritable and polygenic

Below are the title and abstract of the paper I hinted at in this earlier post. Although the study failed to find any specific loci associated with intelligence, a global fit showed that a significant chunk of the heritability expected from twin and adoption studies is accounted for by SNPs. In other words, genetic similarity is correlated with similarity in g score, even though we don't know which genes are specifically responsible. The results of the study were expected from what we already knew: many genes of small effect, accounting for as much as 0.6 or so of narrow-sense heritability. However, the technique is novel and its power will improve with larger sample sizes. The small minority of skeptics who doubt the validity of twin and adoption studies now have another kind of evidence to contend with.

Genome-wide association studies establish that human intelligence is highly heritable and polygenic

General intelligence is an important human quantitative trait that accounts for much of the variation in diverse cognitive abilities. Individual differences in intelligence are strongly associated with many important life outcomes, including educational and occupational attainments, income, health and lifespan. Data from twin and family studies are consistent with a high heritability of intelligence, but this inference has been controversial. We conducted a genome-wide analysis of 3511 unrelated adults with data on 549 692 single nucleotide polymorphisms (SNPs) and detailed phenotypes on cognitive traits. We estimate that 40% of the variation in crystallized-type intelligence and 51% of the variation in fluid-type intelligence between individuals is accounted for by linkage disequilibrium between genotyped common SNP markers and unknown causal variants. These estimates provide lower bounds for the narrow-sense heritability of the traits. We partitioned genetic variation on individual chromosomes and found that, on average, longer chromosomes explain more variation. Finally, using just SNP data we predicted ~1% of the variance of crystallized and fluid cognitive phenotypes in an independent sample (P=0.009 and 0.028, respectively). Our results unequivocally confirm that a substantial proportion of individual differences in human intelligence is due to genetic variation, and are consistent with many genes of small effects underlying the additive genetic influences on intelligence.
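The estimation idea, that genome-wide genetic similarity should predict phenotypic similarity among unrelated individuals, can be illustrated with a toy simulation. The sketch below is my own, with made-up parameters, and uses Haseman-Elston regression, a simpler moment-based cousin of the GREML method actually used in the paper; with a few hundred simulated individuals the estimate is noisy but centered near the true heritability.

```python
import random
from math import sqrt

random.seed(2011)
N, M, h2 = 200, 100, 0.5        # individuals, causal SNPs, true heritability

freqs = [random.uniform(0.1, 0.9) for _ in range(M)]
geno = [[sum(random.random() < p for _ in range(2)) for p in freqs]
        for _ in range(N)]
# standardize each SNP to mean 0, variance 1 (under HWE)
Z = [[(g - 2 * p) / sqrt(2 * p * (1 - p)) for g, p in zip(row, freqs)]
     for row in geno]

# phenotype = sum of small SNP effects + environmental noise
beta = [random.gauss(0, sqrt(h2 / M)) for _ in range(M)]
y = [sum(z * b for z, b in zip(row, beta)) + random.gauss(0, sqrt(1 - h2))
     for row in Z]
my = sum(y) / N
sy = sqrt(sum((v - my) ** 2 for v in y) / N)
y = [(v - my) / sy for v in y]

# Haseman-Elston: regress phenotype products on SNP-based relatedness;
# the slope estimates the SNP heritability.
pairs = [(sum(Z[i][k] * Z[j][k] for k in range(M)) / M, y[i] * y[j])
         for i in range(N) for j in range(i + 1, N)]
ma = sum(a for a, _ in pairs) / len(pairs)
mp = sum(p for _, p in pairs) / len(pairs)
slope = (sum((a - ma) * (p - mp) for a, p in pairs)
         / sum((a - ma) ** 2 for a, _ in pairs))
print(round(slope, 2))   # estimate of h2; should be near 0.5 for this setup
```

Real analyses use hundreds of thousands of SNPs and thousands of individuals, plus REML rather than this moment-based regression, but the principle is the same: heritability is recovered from genotype data alone, without identifying any individual causal gene.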

Friday, August 05, 2011

The sweet science



Bernard Hopkins shows Rashad Evans a little about the sweet science. I've never seen Rashad in such good shape, but I don't think Tito will be easy tomorrow. Bernard talks about the chess game inside boxing. Grappling and jiujitsu have it too: feints within feints within feints.

In a real fight Rashad would put Bernard to sleep.

UFC 133

Ditch Day



My Ditch Day stack was an honor stack: the underclassmen only got access to the room if they could accomplish the assigned task, to change the Hollywood sign to read CALTECH. Unfortunately, one of the students working on the stack did an interview with a local TV reporter, describing their plans. This reporter called the Hollywood police for comment, which led to the underclassmen being met at the sign by a police unit. The task was not carried out until the following academic year -- I received a postcard with the picture above after I had already started grad school :-)

Ditch Day

One of the oldest Caltech traditions is Ditch Day. Sometime during the 1920s, seniors longing for a break decided to give themselves a day off. Abandoning classes and schoolwork, they collectively vanished from campus.

This became an annual tradition, and eventually underclassmen (from whom the date of the event was kept secret until the day itself) took to "modifying" seniors' rooms while they were gone. Over the years, rooms have been filled with sand, Styrofoam, a disassembled-and-reassembled car, and a functioning cement mixer, among many other items; and furniture has been glued to ceilings, moved into courtyards, and suspended from trees.

Hoping to frustrate these modifications, seniors took to stacking cement blocks in front of their doorways before leaving campus. Over the years, these "stacks" evolved into complex, imaginative puzzles that are carefully planned out for months or even years in advance in order to occupy the underclassmen throughout the day.

Thursday, August 04, 2011

More from Hamming: ambiguity and commitment

Two paragraphs I neglected to quote in the previous post. See also Intellectual honesty.

A recent meme that's been circulating is that of the ideological Turing test. Until you can faithfully and convincingly emulate the severest opponents of your ideas, you have not yet thought about those ideas in a balanced and complete way.

You and Your Research: ... There's another trait on the side which I want to talk about; that trait is ambiguity. It took me a while to discover its importance. Most people like to believe something is or is not true. Great scientists tolerate ambiguity very well. They believe the theory enough to go ahead; they doubt it enough to notice the errors and faults so they can step forward and create the new replacement theory. If you believe too much you'll never notice the flaws; if you doubt too much you won't get started. It requires a lovely balance. But most great scientists are well aware of why their theories are true and they are also well aware of some slight misfits which don't quite fit and they don't forget it. Darwin writes in his autobiography that he found it necessary to write down every piece of evidence which appeared to contradict his beliefs because otherwise they would disappear from his mind. When you find apparent flaws you've got to be sensitive and keep track of those things, and keep an eye out for how they can be explained or how the theory can be changed to fit them. Those are often the great contributions. Great contributions are rarely done by adding another decimal place. It comes down to an emotional commitment. Most great scientists are completely committed to their problem. Those who don't become committed seldom produce outstanding, first-class work.

Now again, emotional commitment is not enough. It is a necessary condition apparently. And I think I can tell you the reason why. Everybody who has studied creativity is driven finally to saying, "creativity comes out of your subconscious." Somehow, suddenly, there it is. It just appears. Well, we know very little about the subconscious; but one thing you are pretty well aware of is that your dreams also come out of your subconscious. And you're aware your dreams are, to a fair extent, a reworking of the experiences of the day. If you are deeply immersed and committed to a topic, day after day after day, your subconscious has nothing to do but work on your problem. And so you wake up one morning, or on some afternoon, and there's the answer. For those who don't get committed to their current problem, the subconscious goofs off on other things and doesn't produce the big result. So the way to manage yourself is that when you have a real important problem you don't let anything else get the center of your attention - you keep your thoughts on the problem. Keep your subconscious starved so it has to work on your problem, so you can sleep peacefully and get the answer in the morning, free.

Wednesday, August 03, 2011

Yukio Mishima

I was never enamored of his fictional writing (I've read more biographies of Mishima than novels by him), but Yukio Mishima lived one of the most fascinating lives of the mid 20th century, culminating in his suicide:
With a prepared manifesto and banner listing their demands, Mishima stepped onto the balcony to address the soldiers gathered below. His speech was intended to inspire a coup d'état restoring the powers of the emperor. He succeeded only in irritating them, however, and was mocked and jeered. He finished his planned speech after a few minutes, returned to the commandant's office and committed seppuku.
I was surprised to find this video with Mishima speaking English. I had thought his English rudimentary, but it's actually quite good.




Some classic photos.







Bicycle days

My daughter carefully assembles her outfit every morning and likes to draw pictures of dresses, shoes and accessories. My son couldn't care less what he wears.



Monday, August 01, 2011

Predictive power of early childhood IQ

In the comments of this earlier post a father wondered to what extent one can predict adult IQ from measurements at age 5. The answer is that predictive power is fairly weak -- the correlation between a score obtained at 5 and the eventual adult score is probably no more than .5 or so. However, the main limitation seems to be the unreliability of any single administration of the test to a child that young. Scores averaged over several administrations are a very good predictor already at a fairly young age. The average of three scores obtained at ages 5, 6 and 7 correlates about .85 with the adult score. This suggests that while it is difficult to measure a child's IQ in any single sitting, the IQ itself is relatively predictable already by age 7 or so! Of course there are the usual caveats concerning range of environments, etc. I would like to see results from larger sample sizes.

From Fig. 4.7 in Eysenck's The Structure and Measurement of Intelligence. These data are from children whose IQ was tested *three times* over the interval listed, with the results averaged. A single measurement at age 5 would probably do worse than what is listed below. Unfortunately there are only 61 kids in the study.

age range (three tests averaged)     correlation with adult score

42, 48, 54 months                    .55
5, 6, 7 years                        .85
8, 9, 10 years                       .87
11, 12, 13 years                     .95
14, 15, 16 years                     .95

The results do suggest that g is fixed pretty early, and that the challenge lies in measuring it, rather than in secular changes that occur as the child grows up. That is consistent with the Fagan et al. paper cited above. But it doesn't remove the uncertainty a parent has over the eventual IQ of their kid when he/she is only 5 years old.

Note added: I asked a psychometrician colleague about these results. He thought the correlations seemed a bit high. He looked up another study of 80 kids that appears in Bias In Mental Testing. They found a .7 correlation between scores at 7 and 17. If the score at 7 is noisy (looks like just a single measurement in this study) then the repeat measurement used above might raise the correlation slightly (e.g., by 10 percent?), so I think these results are not entirely inconsistent with each other. Also note I read the numbers above from a graph in a small figure, so there is some uncertainty in the values I reported.
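The reliability argument is easy to check numerically. In this toy model (my assumption, not data from either study), each sitting measures true ability plus independent noise; averaging three sittings shrinks the noise and raises the observed correlation with the stable underlying score.

```python
# Toy model: observed score = true ability + sitting noise.
# Averaging three noisy sittings raises the correlation with true ability.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
g = rng.normal(size=n)                  # stable underlying ability

def noise():
    return rng.normal(0, 1.0, n)        # sitting-to-sitting noise (assumed variance)

single = g + noise()
averaged = g + (noise() + noise() + noise()) / 3

r1 = np.corrcoef(single, g)[0, 1]       # one sitting: r = 1/sqrt(2) ~ .71
r3 = np.corrcoef(averaged, g)[0, 1]     # three sittings: r = sqrt(3)/2 ~ .87
print(round(r1, 2), round(r3, 2))
```

With noise variance equal to true-score variance, a single sitting correlates about .71 with the underlying ability while the three-sitting average correlates about .87 -- roughly the gap between the .7 single-measurement figure and the .85 averaged figure, so the two studies are indeed compatible.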
