Saturday, June 06, 2009

University rankings: research and faculty quality

University rankings are a complicated subject; see this Wikipedia entry for an overview of various methodologies. Two common problems with ranking algorithms are:

1. Reliance on reputational surveys -- reputation is a lagging indicator, and often inaccurate (see below).

2. Failure to normalize performance measures to size of institution (i.e., number of faculty). A bigger institution should, all things being equal, produce more research papers and grant dollars. Normalizing to size yields a better indicator of average quality.

The only analysis I could find which corrects for these problems appears in the book The Rise of American Research Universities, a detailed study by two academics, Hugh Davis Graham and Nancy Diamond. The authors construct three research productivity indices for the period 1980-1990. The natural and social science indices are computed by counting publications in top journals, while the humanities index is computed by counting awards from the National Endowment for the Humanities, the American Council of Learned Societies, the Guggenheim Foundation, etc. (humanists tend to publish books rather than articles). Graham and Diamond normalize the scores to the number of faculty, providing a true per capita measure of average quality.
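The normalization step itself is simple arithmetic: divide each institution's raw output count by its faculty headcount before ranking. A minimal sketch, using made-up institution names and numbers (these are illustrative only, not Graham and Diamond's actual data or index weighting):

```python
# Hypothetical raw data: name -> (publications in top journals, faculty count)
universities = {
    "Univ A": (4200, 1400),  # large institution, high total output
    "Univ B": (3000, 600),   # smaller institution, lower total output
}

# Per capita productivity: publications divided by faculty size
per_capita = {
    name: pubs / faculty
    for name, (pubs, faculty) in universities.items()
}

# Rank by per capita score, highest first
ranked = sorted(per_capita, key=per_capita.get, reverse=True)
print(ranked)
```

Univ A produces more papers in absolute terms, but once output is normalized to faculty size, Univ B comes out ahead. This is exactly the reversal that size-blind rankings miss.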

The following figure shows the combined results for public research universities. Note the divergence between reputation (right column, from NRC rankings of graduate programs) and actual per capita scholarly productivity.

The natural science index result is also interesting (natural science includes math, engineering, and computer science as well as physics, chemistry, biology, etc.). Caltech's score is 3.36, followed by Stanford 1.21, MIT 1.16, Harvard 0.93, Berkeley 0.92, Princeton 0.83. So far this tracks reputation, but note that UC San Diego is ranked 4th (just below MIT) at 1.07! UC Irvine (0.56), UCSB (0.53), UCLA (0.51) and Colorado (0.55) are all in the top 20, comparable to Ivies like Yale (0.65), Cornell (0.60) and Columbia (0.49), and ahead of Big 10 powers Illinois (0.45), Wisconsin (0.42) and Michigan (0.32). Hmm... according to these results Caltech researchers were 3 times as productive per capita (in these subjects, normalized to total faculty size) as their counterparts at MIT and Harvard, and 10 times as productive as those at Michigan. Part of this must be Caltech's relatively smaller social science and humanities departments, but that can't explain the whole effect.

Graham and Diamond's results establish that reputation is often a misleading indicator. They identify a number of "rising" public research universities -- often in the west (California, Colorado, Oregon) -- whose reputation rankings are not commensurate with their research quality. It reminds me of the way that traditional east coast sports media underrates Pac-10 football year after year. Sure, Ohio State and Michigan look great playing against other slow as mud teams, until they get to the Rose Bowl :-)


Ian Smith said...

Someone from Oregon must have made the list. When I was there, which wasn't too long ago, it was a dump.

Steve Hsu said...

The lead author is at Vanderbilt, and Diamond is at Vermont, I think.

It's odd you don't like the campus here in Eugene -- most people find it idyllic.

Unknown said...

I can't imagine anybody but a nut describing Oregon as a dump.


Bruce Charlton said...

You might be interested in my attempts to look at 'revolutionary science' by analyzing Nobel and other prizes:

I did not correct for the size of institutions -- for some purposes this is important, but there are reasons not to do so in this instance.
