Why did the White House abandon college rankings?
In his weekly address this Saturday, President Obama introduced the revamped College Scorecard website, meant to help prospective university students “identify which schools provide the biggest bang for your buck.” After more than a year of planning, however, the new site’s mountain of data (171 megabytes, all told) is missing its most buzzed-about feature: rankings.
Head-to-head rankings, such as those compiled by U.S. News & World Report, which posted its 2015 installment just last week, tend to invite controversy. Critics accuse the magazine of sending parents, high-schoolers, and college deans into a frenzied keeping-up-with-the-Joneses contest each fall, an obsession over debatably superficial numbers that carries real consequences for student learning and debt.
But as most skeptics acknowledge, ratings themselves aren’t the problem; unhelpful metrics are. On most popular lists, the same small club of elite schools jostles for position, fighting over a one- or two-place difference, even though those schools serve a minuscule percentage of the country’s college students.
“Think of these top colleges as high-end luxury cars,” writes New America Foundation Senior Policy Analyst Ben Miller in a debate piece on university rankings (sponsored, interestingly enough, by U.S. News itself). “You’re already guaranteed a better vehicle than 95 percent of all drivers, so beyond price considerations, the relative differences are largely cosmetic.”
The White House has made it a priority to provide real solutions for the rest of America’s students, and has long planned to publish more “Consumer Reports-style” rankings, as NPR described them, focused on student outcomes like income and debt load. In addition, the White House has laid out an ambitious, $60 billion plan to make two-year community colleges or technical schools “as free and universal as high school,” a proposal that faces an uphill battle in Congress.
Although community colleges are “the workhorses of higher education,” enrolling nearly half of all post-high school students, their graduation rates are abysmal: Only 25 percent finish in three years, and, on average, those who do finish take five years to do so, according to a 2010 Christian Science Monitor editorial.
In light of these statistics, Obama hoped to create what Mr. Miller calls a “buyer beware” system. But universities, some of which already refuse to participate in the U.S. News rankings, immediately protested that measuring all of their diverse missions, student bodies, and resources with a single yardstick would not accurately capture each campus’s value, or lack thereof. While some schools’ objections seem to have students’ best interests at heart, observers like Paul Glastris, who oversees an alternative ranking at the Washington Monthly, point out that keeping some of this information in the dark plays to schools’ advantage as well.
College Scorecard won’t tell you whether Princeton is better than Yale, but it does highlight schools that stand out in particular categories: “23 four-year schools with low costs that lead to high incomes,” for example, with the selected 23 listed in alphabetical order rather than ranked.
Applicants who know what matters most to them – total cost, loans, income, size, etc. – can customize the site’s metrics to create a personalized de facto ranking system. So long as that’s the case, the Scorecard may well serve its original purpose, as Under Secretary of Education Ted Mitchell explained to The Chronicle of Higher Education: “public accountability.”
Expenses, selectivity rates, financial gain: these, and a bevy of other indicators, can all be boiled down into Scorecard statistics. What’s harder to capture is how much a student will, or won’t, learn.