See also Defining Merit.
Brookings: The choice of whether and where to attend college is among the most important investment decisions individuals and families make, yet people know little about how institutions of higher learning compare along important dimensions of quality. This is especially true for the nearly 5,000 colleges granting credentials of two years or fewer, which together graduate nearly 2 million students annually, or about 39 percent of all postsecondary graduates. Moreover, popular rankings of college quality, such as those produced by U.S. News, Forbes, and Money, focus only on a small fraction of the nation’s four-year colleges and tend to reward highly selective institutions over those that contribute the most to student success.

On the quality of PayScale compensation figures used in the analysis:
Drawing on a variety of government and private data sources, this report presents a provisional analysis of college value-added with respect to the economic success of the college’s graduates, measured by the incomes graduates earn, the occupations in which they work, and their loan repayment rates. This is not an attempt to measure how much alumni earnings increase compared to forgoing a postsecondary education. Rather, as defined here, a college’s value-added measures the difference between actual alumni outcomes (like salaries) and predicted outcomes for institutions with similar characteristics and students. Value-added, in this sense, captures the benefits that accrue from both measurable aspects of college quality, such as graduation rates and the market value of the skills a college teaches, as well as unmeasurable “x factors,” like exceptional leadership or teaching, that contribute to student success.
While imperfect, the value-added measures introduced here improve on conventional rankings in several ways. They are available for a much larger number of postsecondary institutions; they focus on the factors that best predict objectively measured student economic outcomes; and their goal is to isolate the effect colleges themselves have on those outcomes, above and beyond what students’ backgrounds would predict.
Using a variety of private and public data sources, this analysis finds that:
Graduates of some colleges enjoy much more economic success than their characteristics at time of admission would suggest. Colleges with high value-added in terms of alumni earnings include not only nationally recognized universities such as Cal Tech, MIT, and Stanford, but also less well-known institutions such as Rose-Hulman Institute of Technology in Indiana, Colgate in upstate New York, and Carleton College in Minnesota. Two-year colleges with high value-added scores include the New Hampshire Technical Institute, Lee College near Houston, and Pearl River Community College in Mississippi. ...
...There are a number of ways one can assess whether or not PayScale accurately captures the earnings of graduates—or whether the sample is statistically biased by the voluntary nature of its data collection.

Effects of student ability and family SES:
Broadly, PayScale earnings by major for U.S. residents with bachelor’s degrees can be compared to similar data from the ACS, which annually samples 1 percent of the U.S. population [30]. The correlation between the two is what matters most for this analysis, since value-added calculations are based on relative differences between predicted and actual earnings.
The correlation between bachelor’s degree holders on PayScale and median salaries by major for workers in the labor force from the Census Bureau is 0.85 across 158 majors matched between the two databases.
For each of the three post-attendance outcomes measured here—mid-career salary, loan repayment rate, and occupational earnings power—student test scores, math scores in particular, are highly correlated: 0.76 for mid-career salaries and 0.69 for student loan repayment and occupational earnings power (Figure 3). Other student characteristics, such as the percentage receiving Pell grants, also correlate highly with these outcomes, though not as highly as test scores.
Go Beavers! Note Caltech grads are also much more likely to win Nobel Prizes and be elected to the National Academy of Science or Engineering than graduates of any other university. Claims that a technical education bestows narrow technical skill without conveying deep ideas and teaching critical thinking are silly.
... Alumni from Cal Tech list the highest-value skills on their LinkedIn profiles (Table 3); their skills include algorithm development, machine learning, Python, C++, and startups (that is, starting a new business). Cal Tech is followed closely by Harvey Mudd and MIT. Babson College, also in the top 10, focuses on business rather than science; its course offerings teach many quantitative skills relevant for business-oriented STEM careers. Many graduates from the Air Force Academy are prepared for high-paying engineering jobs in the military and at large defense contractors. They list skills like aerospace and project planning.
Amusingly, you can see by looking at the tables of regression results that Asian share of enrollment is strongly positively correlated with mid-career earnings, whereas female share is negatively correlated. This is not surprising, given that STEM skills are big drivers of compensation.
Colgate's strong performance (and probably that of Carleton and some others) cannot be explained in terms of STEM skills -- see the discussion and figure on page 16.
"a college’s value-added measures the difference between actual alumni outcomes (like salaries) and predicted outcomes for institutions with similar characteristics and students."

I'd like to know how they establish that.
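Presumably something like a regression: predict the outcome from observable college/student characteristics and call the residual "value-added." A minimal sketch with made-up data (not the report's actual model, coefficients, or variable list):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
math_z = rng.normal(0, 1, n)       # admission math score (z-score)
pell = rng.uniform(0.1, 0.6, n)    # share of students on Pell grants
x_factor = rng.normal(0, 5, n)     # unobserved college "x factor"

# toy data-generating process for mid-career salary, in $k
salary = 70 + 12 * math_z - 20 * pell + x_factor

# predict salary from observables, then define value-added as the residual
X = np.column_stack([np.ones(n), math_z, pell])
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)
value_added = salary - X @ beta    # positive => alumni out-earn the prediction
```

In this toy setup the residual recovers the unobserved "x factor" almost exactly, which is the best case; with real data the residual also absorbs noise and any omitted variables.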
"Note Caltech grads are also much more likely to win Nobel Prizes and be elected to the National Academy of Science or Engineering than graduates of any other university. So claiming that a technical education bestows narrow technical skill without conveying deep ideas and teaching critical thinking would be silly."
Sounds like you say something quite a bit different in the linked post. :)
Ivy League: ouch!
The regression coefficients are available in the full report (52 pages), Appendix Table 4 (pp. 36-37). They also give some model information (e.g., adjusted R^2).
The only direct (not demographic) student quality metric I see is math admission scores as z-scores. Caltech shows 3.59 and MIT 3.50, which indicates a little resolving power at that level. My understanding is that the post-recentering SAT has mean 500 and SD 110, which makes those numbers seem odd to me (my best guess is that their SD is smaller, since admissions data--especially averages--truncate the bottom of the distribution). I wonder why they omitted verbal scores.
If I interpret the model correctly the math score term is calculated as:
ysal = -0.0178*xmath + 0.0119*xmath^2
which seems really strange (a parabola with its minimum at xmath ≈ 0.75). Can someone please double-check my reasoning here?
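Your arithmetic checks out; a quick sanity check of the vertex, using the coefficients quoted above:

```python
# f(x) = -0.0178*x + 0.0119*x^2 is an upward-opening parabola;
# its minimum sits at x = -b / (2a) with a = 0.0119, b = -0.0178.
a, b = 0.0119, -0.0178
x_min = -b / (2 * a)   # ≈ 0.748

# so the fitted math-score term is decreasing for z-scores below ~0.75 SD
# and increasing above -- which does look strange for a salary model
f = lambda x: b * x + a * x**2
assert f(x_min) < f(0.0) and f(x_min) < f(1.5)
```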
They include an Excel spreadsheet (12MB) with all their data. A nice feature is that they list the data sources, which might be helpful for anyone who wants to do similar work.
After a closer look at the math score, there are some data quality problems. Try sorting by that column: it ranges from -6 to 6. Diamond Beauty College higher than Caltech--I don't think so...
Some distribution information for that score:
Math
Min. :-6.0000
1st Qu.:-1.2000
Median :-0.7100
Mean :-0.5529
3rd Qu.:-0.1100
Max. : 6.0300
calculated SD: 0.95
I was a little surprised at the absence of HYP at the top, given that they adjusted for STEM. Sorting the spreadsheet by actual/predicted salary numbers gives some interesting results. One thing that helps Caltech's standing is that its predicted salary (average mid-career median) is $5k less than MIT's. I wonder what drives that difference. The military academies do very well on actual salary (all three above Harvard?!), but for some reason they do not have predictions, so they aren't ranked.
I wonder if HYP would have done better if measured by average salary.
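The sort described above can be sketched like this (school names are real, but the salary figures and column names are made up for illustration; the actual spreadsheet headers may differ):

```python
# Rank schools by actual/predicted mid-career salary; note how a prediction
# that is $5k lower lifts a school's ratio even at a similar actual salary.
rows = [  # made-up illustrative values, in $k
    {"name": "Caltech", "actual": 126, "predicted": 110},
    {"name": "MIT",     "actual": 128, "predicted": 115},
    {"name": "Harvard", "actual": 118, "predicted": 116},
]
for r in rows:
    r["ratio"] = r["actual"] / r["predicted"]

ranked = sorted(rows, key=lambda r: r["ratio"], reverse=True)
```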
Most of their data is from the Integrated Postsecondary Education Data System (IPEDS): https://nces.ed.gov/ipeds/
To be honest, differences between the top few schools in this ranking are likely noise-dominated. Do we really know mid-career median salary to within $5k for MIT vs. Caltech? ...
They regress against student test scores and family income and use them in the predictor model.
I'm thinking that there may be "unobservables" that differ between students going to different schools.
Agreed.
The question I was trying to get at with the $5k comment was: what in the respective college profiles caused that difference in predictions?
It was interesting to sort the data by highest predicted salary (only ~1140/7370 colleges have that though). The Ivy League is well represented at the top of that list.
I'm disappointed they did not have enough data to make predictions for the service academies. I have a feeling they would top the value-added list: 3 of the top 16 median salaries (Harvard is 17th), with mid/high math scores (~2 vs. ~3.5 for MIT and Caltech). It looks like the data they lack concerns student loans (which makes sense). It would be interesting to rerun the analysis with placeholders for the loan data (i.e., 0).
Ok, you graduate -- can you get a job? ... If you are an American citizen, maybe not. "A restructuring and H-1B use affect the Magic Kingdom’s IT operations"
The self-reported PayScale data is suspect. PayScale, Glassdoor, and their ilk have value, but their error range is hard to quantify, and I would guess there's a strong bias toward the high end. People lie, and the type of people looking up this info is not a random sample. Outside of guys like Steve, whose earnings are published so we can all know them, real info is hard to get.
The problem with these rankings (prestige, value-added, etc.) is that they always refer to the past, i.e., they reflect the situation of ten or fifteen years ago and have uncertain predictive worth.
This.
If salary data is coming from PayScale.com, how representative can it be? How do they try to account for selection bias?
In a market economy, the value of anything is determined by the supply/demand ratio: high demand and low supply create high value.
If the Ivy League schools offered universal education for all, then their value would be no different from a high school education.
Steve, this is a bit off topic, but I'm assuming you're against affirmative action. If you were dictator and had total control over elite college admissions, how would you change it? Would you replace the holistic admission process with a national entrance exam, which is common pretty much everywhere else? The Ivy League schools seem to believe that if there are too many Asians, their name brand and prestige will be diluted, in turn hurting their alumni donations.
They seem to put proportion of STEM graduates into the "actual outcomes" rather than "predicted outcomes" set of variables. This would bias their study in favor of engineering schools. The proportion of STEM should be considered a student characteristic and used to predict outcomes. A history major earning $80,000/yr is hitting it out of the park compared to an engineer earning the same, but this study would count them as equal if they had the same race, test scores, etc.
Maybe they did control for this properly but it doesn't look like they did. I will play with the data to see what I can find.
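To make the worry concrete, here is a toy sketch with made-up data: if STEM share is left out of the predictor set, the residual "value-added" largely just recovers the STEM earnings premium; include it, and that correlation vanishes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
math_z = rng.normal(0, 1, n)
stem = np.clip(0.4 + 0.15 * math_z + rng.normal(0, 0.1, n), 0, 1)  # STEM share
salary = 60 + 8 * math_z + 40 * stem + rng.normal(0, 3, n)         # toy DGP, $k

def value_added(X, y):
    """Residual from an OLS fit of y on X (columns include an intercept)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

ones = np.ones(n)
va_no_stem = value_added(np.column_stack([ones, math_z]), salary)
va_stem = value_added(np.column_stack([ones, math_z, stem]), salary)

r_no_stem = np.corrcoef(va_no_stem, stem)[0, 1]  # clearly positive
r_stem = np.corrcoef(va_stem, stem)[0, 1]        # ~0 by OLS orthogonality
```

So where STEM share sits in the model materially changes which schools look like they "add value."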
I'd like to see more focused studies that do more apples to apples comparison. For example, if I'm a high school senior and I want to go to an undergrad school that will give me a leg up in getting accepted at a Top 14 law school, are there any colleges that overperform relative to their degree of exclusivity? In other words, I already know that getting into Harvard College is a good first step toward getting into Harvard Law School, but what if I don't get into Harvard, Stanford, or Yale? What are some less exclusive schools that send a more than expected share of pre-law students to exclusive law schools?
For example, Reed College has a reputation as a good undergrad college for students who want to become college professors. Is this true? If so, is it a selection effect or a nurture effect?