This past year, criticism of the U.S. News and World Report rankings system reached a fever pitch, in part because of perceived flaws in the rankings methodology. Even normally demure bloggers like me joined the fray, "throttling" U.S. News for how it prioritizes statistics in its rankings. In short, I think neither the weight given to each criterion nor the criteria themselves are the best way to measure excellence, academic or otherwise.
Yet this year other, more practical concerns came into play. First, there have been highly publicized cases of colleges and law schools gaming the system. Some law schools changed the admissions processes for their full- and part-time programs; others declined to submit employment statistics because they knew the automatic minimum U.S. News inserts for nonreporting schools was higher than their actual numbers. Clemson University is perhaps the most scandalous example: at a conference presentation, a former admissions officer outlined the university's underhanded efforts to break into the top 25 universities. From the Inside Higher Ed report:
[Describing] Clemson's single-minded pursuit of its widely publicized goal of becoming a top 20 public research university,...Watts asserted--among other things--that Clemson officials...had systematically lowered class sizes below the U.S. News threshold of 20 while increasing class sizes where the additional students wouldn't hurt its standing in the rankings; and that they had regularly given low scores on the rankings' "reputational" survey to other colleges and universities in order to make Clemson look better.
Not only did she game the system; she did so with the explicit mandate of the administration. No good. Add to that the studies showing that the U.S. News rankings system hurts diversity efforts, and it becomes crystal clear that something needs to be done.
Maybe U.S. News buckled under the pressure; maybe they just saw the error of their ways. Either way, excuses have been made and change has been proposed. Here, excerpted and condensed from Bob Morse's blog Morse Code (see the full post here) is a list of what changes may impact the 2011 U.S. News & World Report school rankings:
- We may add high school counselors' rankings of colleges as part of the academic reputation component, which is now 25 percent of the America's Best Colleges rankings. To do this, we would reduce the weight of the regular peer assessment survey for the National Universities and Liberal Arts Colleges categories.
- We are considering adding the admit yield--the percentage of students that the school accepts that enroll at that school in the fall--back into the rankings...[as] part of the undergraduate academic reputation index variable.
- We may slightly increase the weight of the "predicted graduation rate,"...a well-received variable by some higher education researchers, because it measures outcome and rewards schools for graduating at-risk students...
- We are contemplating eliminating the Third Tier from all the National Universities, Liberal Arts Colleges, Master's Universities, and Baccalaureate Colleges rankings tables...
- We would extend the numerical rankings to the top 75 percent of all schools in each category, up from the top 50 percent now... We believe that the data is strong enough to numerically rank more schools, and the public is asking for more sequential rankings since it's less confusing than listing schools in tiers.
Morse also wrote a separate post, detailing some changes that may be made to how they categorize the schools (e.g., Baccalaureate Colleges to Regional Colleges, etc.) for the sake of clarity.
At the end of both posts, Morse solicits comments from his readers. He even responds to trends in some of those comments with a later post, "Your Thoughts--and Our Responses--on College Rankings Changes". Yet, it seems to me that many of the concerns that sparked these proposals in the first place haven't really been addressed.
Take the triple-pronged objection to the academic reputation component, also commonly referred to as "prestige rankings." The methodology is pretty simple: U.S. News provides deans and faculty members with a list of schools and asks them to rate each school on a scale of one to five. Detractors question the practice on two fronts: whether these are the right people to ask, and whether this is the right way to ask them. The potential change resolves only the first of these concerns; critics also point to a spotty response rate, which tends to hover somewhere between 65 and 70 percent. What's more, a third, more fundamental critique goes entirely unanswered: why are peer assessment rankings worth so much--a full quarter of a school's ranking--and, for that matter, should the metric be used in the first place? Morse's response is, well, unsatisfying:
U.S. News is not going to drop the concept of doing peer assessment or including it in the upcoming 2011 edition of the America's Best Colleges rankings... We feel that academic reputation of the school where a new graduate has obtained their degree is a very important factor and that a school's reputation can impact the ability of new college graduates to get that all-important first job or to get into a top graduate school.
Granted, the "this is important because other people think it's important" justification is reasonable enough for a rankings system that purports to find the "best" (a highly subjective category). Still, I find the explanation unsatisfying, both because prestige is worth so much compared to other very important measures of excellence, like selectivity and resources, and because Morse doesn't address the methodological issue at all.
Morse and U.S. News also haven't addressed the diversity issues or, for that matter, accounted for the several different ways schools can game, and have gamed, the system. In his article on law school gaming, Morse mentions only one step in that direction:
U.S. News is planning to significantly change its estimate for the at-graduation rate employment for nonresponding schools in order to create an incentive for more law schools to report their actual at-graduation employment rate data. This new estimating procedure will not be released publicly before we publish the rankings.
I could go through this same exercise for every proposed change, but let me cut to the chase: I don't think these changes go far enough. They deal only nominally or obliquely with some very serious critiques. And while the yield-rate and predicted-graduation-rate changes are positive steps, they don't make up for the fact that, in the end, the U.S. News rankings focus too much on prestige and standardized testing, and too little on what a school actually offers its students.
--Written by Madison Priest