Ratings, Not Rankings:
Why U.S. News & World Report Shouldn't Want
To Be Compared to Time and Newsweek -- or The New Yorker


Nancy B. Rapoport [FNa1]

Copyright 1999, Ohio State University, Nancy B. Rapoport

The annual ranking of law schools by U.S. News & World Report (U.S. News) and other publications has been accompanied by an annual battle between those in favor of law school ranking systems and those opposed to them. A significant problem with these systems, the author argues, is that both law schools and potential law students take the rankings for far more than they are worth. Ranking systems such as the one developed by U.S. News are flawed because they rest on factors that do not reflect law school quality. A more useful approach would be to present prospective law students with various indicia of each law school's quality and to allow each student to construct a personal ranking based on the factors that he or she values most. A second, more practical solution would be to use broad measures of quality to rate, rather than rank, the law schools.
It's the beginning of the academic year, and the latest U.S. News & World Report (U.S. News) ranking of law schools has been out for six months. In a few more months, it will be time for the annual battle of the ranking opponents versus the ranking proponents. The deans of most law schools will, once again, come out squarely against the methodology of the rankings and against the fact that U.S. News ranks virtually every law school in the country, rather than only the top fifty, as it does for other graduate programs. [FN1] The proponents will cite the need to winnow the complex choices among law schools with some objective and standardized measures. [FN2] And then we'll all go into an infinite loop on the rankings war for yet another year.
Is there a real problem with ranking law schools? In one sense, no. Assuming that competing avenues exist for developing and providing rankings, [FN3] then the multiplicity of ranking systems would tend to dilute the effect of any individual ranking system. And assuming that the rankings have some objective basis in reality, [FN4] what harm can they do?
The answer is: plenty. The current popularity of the U.S. News rankings causes law schools and potential law school applicants to overreact to them. [FN5] Some law schools use the rankings as an outside measure of how "good" they are--especially according to the "academic reputation" component of the rankings. Unfortunately, the academic reputation score is not in the least a reliable measure of quality. U.S. News bases this score on the answers to questionnaires that ask each respondent to rate every law school in the country, from "marginal" to "distinguished." [FN6] No one in legal academia has sufficient information about each law school in the country to provide realistic ratings for every school, [FN7] and yet this part of U.S. News's ranking system accounts for twenty-five percent of each school's overall ranking. Responses by judges and lawyers on the same "marginal" to "distinguished" scale account for an additional fifteen percent of the overall ranking. Any law school that really relies on the reputational scores as a way of measuring quality is relying on a glorified coin toss at best.
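To see how much these survey responses can drive the bottom line, consider a rough sketch of a U.S. News-style composite score. Only the two reputation weights (twenty-five and fifteen percent) come from the magazine's published methodology; the single "everything else" factor and the numbers plugged in below are invented placeholders, not the actual formula:

```python
# A simplified sketch of a U.S. News-style composite score.
# Only the two reputation weights (0.25 and 0.15) come from the
# published methodology; the rest is illustrative guesswork.

def composite_score(academic_rep, lawyer_judge_rep, other_factors):
    """Combine component scores, each normalized to a 0-1 scale."""
    reputation = 0.25 * academic_rep + 0.15 * lawyer_judge_rep
    remainder = 0.60 * other_factors  # selectivity, placement, resources...
    return reputation + remainder

# Two hypothetical schools identical on every measurable factor:
school_a = composite_score(academic_rep=0.9, lawyer_judge_rep=0.9,
                           other_factors=0.7)
school_b = composite_score(academic_rep=0.5, lawyer_judge_rep=0.5,
                           other_factors=0.7)
print(round(school_a, 2), round(school_b, 2))  # 0.78 vs. 0.62
```

In this toy example, the entire gap between two otherwise identical schools is supplied by the survey responses--the coin toss, in other words, does the sorting.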
It's bad enough that some law schools are measuring themselves against the U.S. News rankings, but potential law students are encouraged to take the rankings as gospel. I have great sympathy for the applicants who slog through the seemingly endless piles of glossy bulletins that law schools send to them. Naturally, these bulletins are going to paint the schools in the rosiest of glows, and naturally, applicants want some way of cutting through the hyperbole. But the rankings are not the best way to help applicants choose schools.
When U.S. News is not busy incorporating the "dartboard" approach of reputational scores into a school's overall ranking, it is using allegedly objective measures, such as the applicants' entering undergraduate grade point averages (UGPAs) and Law School Admission Test (LSAT) scores, along with each school's acceptance rate, placement rate, and amount of fiscal resources. [FN8] Although each of these numbers is measurable, [FN9] none of them is a good indicator of quality. These numbers don't reflect how well the law school teaches, how cutting-edge its research is, or whether the law school community is cutthroat or supportive. [FN10] For the students who choose to enroll in law school, only one of the U.S. News factors--placement rate--is likely to be of interest. [FN11] Why not come up with factors that mesh with what the consumers (the law school applicants) want to know?
While we're coming up with user-friendly factors, we should also do away with the rankings that go all the way from 1 to 181. [FN12] Rating schools on some relevant factors, rather than ranking them from top to bottom, would serve applicants better. In fact, the rankings aren't really necessary, because all of the raw information that might be useful to prospective students is available in an easily accessible format. [FN13] To the extent that law students value particular factors more than others, they can construct their own rating systems. Ranking all of the law schools on the same scale, based only on the measurable factors that U.S. News collects, makes about as much sense as ranking the weekly news magazines--or even the weekly magazines, period. Does U.S. News want to go head-to-head with Time and Newsweek, or The New Yorker, on such factors as "perception of erudition"?
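For the curious, here is a minimal sketch of what such a do-it-yourself rating system might look like. The schools, factors, scores, and weights are all hypothetical; the point is only that an applicant who cares most about, say, placement can encode that preference directly:

```python
# A hypothetical applicant's personal rating system.
# Schools and their factor scores (0-1 scale) are invented for illustration.
schools = {
    "School X": {"placement": 0.95, "cost": 0.40, "location": 0.80},
    "School Y": {"placement": 0.70, "cost": 0.90, "location": 0.60},
}

# This applicant weights placement most heavily; a different applicant
# would choose different weights -- which is exactly the point.
my_weights = {"placement": 0.6, "cost": 0.3, "location": 0.1}

def personal_rating(factor_scores, weights):
    return sum(weights[name] * score for name, score in factor_scores.items())

for name, factor_scores in schools.items():
    print(name, round(personal_rating(factor_scores, my_weights), 2))
```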
Of all of the justifications for its rankings, my favorite response by U.S. News to the "why rank" question is the snide "law schools do it, so we can, too":
We know that rankings and numerical assessments are an inescapable part of life. Law schools rely heavily on UGPA and LSAT scores when choosing students for admission. They rank, assess, and compare students continually between admission and graduation. There are occasional injustices in any ranking system. But just as law schools find test scores an inexact but useful tool for comparing students, so do students find our rankings an inexact but useful tool for comparing schools. [FN14]
The fact is, most law school admissions committees don't rank prospective applicants simply by their UGPAs and LSATs--they read the entire files. And when committees do use applicants' LSATs and UGPAs, they typically do so as part of an "index" number--a weighted combination of LSAT and UGPA, with the weights derived from a regression analysis of how well those credentials predict first-year grades at that particular law school. These index formulas are recalculated each year by the Law School Admission Council to make sure that the index bears some relation to reality.
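As a rough illustration--not the Law School Admission Council's actual procedure--the sketch below fits a least-squares regression of first-year grades on LSAT and UGPA for one hypothetical school, then uses the fitted weights as that school's index formula. All of the data are invented:

```python
import numpy as np

# Hypothetical past students at one school: LSAT, UGPA, first-year GPA.
# A real index study would use the school's own historical data.
lsat = np.array([155.0, 160.0, 165.0, 150.0, 170.0, 158.0])
ugpa = np.array([3.2, 3.5, 3.8, 3.0, 3.9, 3.4])
fygpa = np.array([2.8, 3.1, 3.5, 2.6, 3.7, 3.0])

# Least-squares fit: fygpa ~ a * lsat + b * ugpa + c
X = np.column_stack([lsat, ugpa, np.ones_like(lsat)])
(a, b, c), *_ = np.linalg.lstsq(X, fygpa, rcond=None)

def index(applicant_lsat, applicant_ugpa):
    """This school's index: predicted first-year GPA for an applicant."""
    return a * applicant_lsat + b * applicant_ugpa + c

print(round(index(162, 3.6), 2))  # predicted first-year GPA
```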
Law schools don't treat applicants just as numbers. The selection of a law school entering class depends on both the applicants' quantifiable indicia of academic success, such as their UGPA and LSAT numbers, and the non-quantifiable indicia of academic success (everything from the difficulty of the undergraduate major and the general quality of the undergraduate institution to prior public service or military experience, the length of time since earning the undergraduate degree, any subsequent advanced degrees, and the amount of time that the applicant worked while earning his or her degree). Using just the numbers to choose a class makes little sense. Numbers can always be manipulated, and they never tell the whole story. [FN15]
Even current law students are not treated just as numbers. Other indicia of ability, beyond bluebook exams, are (or should be) [FN16] used to determine whether a law student "gets it" and will be a good lawyer. If we look at qualitative and quantitative factors to see how good our law students are, shouldn't we be looking at qualitative factors, in addition to quantitative factors, to rate law schools?
The bottom line is that U.S. News is looking at the wrong things. If you want consumers (in this case, law school applicants) to make informed decisions based on someone else's idea of quality, at least don't pretend that quality is calculable by assigning numbers to some selected and irrelevant factors. I could rank law schools by the height of faculty members, [FN17] but assigning a number to faculty height doesn't make that ranking valid, either.
The best solution is to let consumers make their decisions based on their own weighting of those factors that mean the most to them. The second-best solution is to provide broad measures of programmatic quality and rate schools along those broad measures, instead of ranking them from first to last. The rank-ordering just creates a false presumption that there's a real difference between first and fifth, or even first and twenty-fifth. If the world can tolerate ambiguity as to whether Time or Newsweek is measurably better than U.S. News, then it can probably tolerate ambiguity as to who has the best law school.
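To make the rating idea concrete: instead of ordering 181 schools from first to last, each broad measure of quality could simply be sorted into a few bands. In this sketch the cutoffs and labels are invented; schools with nearly identical scores land in the same band rather than in a spurious 1-versus-5 ordering:

```python
# Rating instead of ranking: map a 0-1 quality measure into broad bands.
# The band cutoffs and labels are invented for illustration.
def rate(score):
    if score >= 0.8:
        return "strong"
    if score >= 0.5:
        return "solid"
    return "developing"

for name, score in [("School X", 0.84), ("School Y", 0.82), ("School Z", 0.47)]:
    print(name, rate(score))  # X and Y rate identically; no false ordering
```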

[FNa1]. Dean and Professor of Law, University of Nebraska College of Law. The views expressed in this Essay are hers alone, and not those of any other faculty member or administrator at the University of Nebraska. Many thanks to Assistant Dean Glenda Pierce, Professor Julia McQuillan, and Jeff Van Niel for their very helpful suggestions on this Essay.

[FN1]. See, e.g., Law School Admission Council, Letter from Law School Deans to Law School Applicants (last modified Oct. 26, 1999) <http://www.law-services.org/deansbrochure.htm>.

[FN2]. See, e.g., Michael C. Krauss, How I Rate Rankings: High in Consumer Value; Surveys Help Keep Service Institutions Competitive, Advertising Age, June 15, 1998, at 46, 46; Rankings Reflect How the World Works, U.S. News & World Rep., Mar. 2, 1998, at 7, 7 (noting that refusing to rank all of the law schools, from 1 to over 180, "would deny information to tens of thousands of potential students"). These proponents, though, do not tend to mention the boon to sales that comes from the magazines publishing the ranking issues. See Jan Hoffman, Judge Not, Law Schools Demand of a Magazine That Ranks Them, N.Y. Times, Feb. 19, 1998, at A1 ("'These rankings are a misleading and deceptive, profit-generating commercial enterprise that compromises U.S. News and World Report's journalistic integrity,' said Carl Monk, executive director of the Association of American Law Schools.").

[FN3]. For a few of these alternative ranking systems, see generally Ian Van Tuyl et al., The Princeton Review: The Best Law Schools (Gretchen Feder ed., 1999); Law School Rankings (last modified Nov. 7, 1996) <http://homepages.gs.net/~gentry/rankings.htm>; Jeffrey E. Stake, Indiana University School of Law Bloomington: The Ranking Game (visited Oct. 27, 1999) <http://monoborg.law.indiana.edu/LawRank/rankgame.html> (compilation of ranking sources). A recent national survey has been developed that seeks to measure an undergraduate university's use of "good" practices that encourage learning. See Ben Gose, A New Survey of 'Good Practices' Could Be an Alternative to Rankings, Chron. of Higher Educ., Oct. 22, 1999, at A65.

[FN4]. That's a big assumption. For a thorough critique of the U.S. News rankings, see Stephen P. Klein & Laura Hamilton, The Validity of the U.S. News and World Report Ranking of ABA Law Schools (last modified Feb. 18, 1998) <http://www.aals.org/validity.html>.

[FN5]. The U.S. News rankings create other problems as well. In this age of limited resources, universities are searching for ways to measure the quality of their academic units. There is always the risk that the universities of which the law schools are a part will give the rankings more credence than they deserve and will allocate resources in a way that strangles real quality or innovation.

[FN6]. U.S. News describes its methodology as follows:
Reputation for academic quality was measured through two surveys conducted in the fall of 1998. The dean and three faculty members at each law school were asked to rate the quality of schools from "marginal" (1) to "distinguished" (5). Sixty-two percent responded, and the resulting reputation score accounts for 25 percent of the school's rank. Practicing lawyers, hiring partners, and senior judges were also asked to rate each school. Thirty-nine percent responded, and their opinions account for 15 percent of the final rank.
See U.S. News Online, Law: Methodology (visited Oct. 27, 1999) <http://www.usnews.com/usnews/edu/beyond/gradrank/gblawmet.htm>.

[FN7]. Cf. Bruce Keith & Nicholas Babchuk, The Quest for Institutional Recognition: A Longitudinal Analysis of Scholarly Productivity and Academic Prestige Among Sociology Departments, 76 Soc. Forces 1495, 1495-1500 (1998) (noting that the same issues exist in rating the prestige of other academic units).

[FN8]. See U.S. News Online, supra note 6.

[FN9]. Not only are these numbers measurable, but they're manipulable as well. There are numerous ways to manipulate the rankings. For example, it would be possible to raise a given law school's median UGPA and LSAT scores by choosing half of an entering class solely on UGPAs and half solely on LSATs. But is that the best way to choose a law school class?
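A toy calculation, with invented applicant numbers, shows how the tactic props up both medians at once:

```python
import statistics

# An invented applicant pool, purely for illustration.
lsat_stars = [(174, 3.40), (173, 3.45)]   # high LSAT, ordinary UGPA
ugpa_stars = [(164, 3.95), (163, 3.90)]   # high UGPA, ordinary LSAT
balanced = [(166, 3.60), (165, 3.65), (162, 3.55), (161, 3.50)]

def medians(entering_class):
    lsats = [lsat for lsat, _ in entering_class]
    ugpas = [ugpa for _, ugpa in entering_class]
    return statistics.median(lsats), statistics.median(ugpas)

# The tactic: fill half the class from the top of each list.
gamed_class = lsat_stars + ugpa_stars
# Baseline: a class that whole-file review might favor for strengths
# (essays, work experience) that never show up in the numbers.
holistic_class = balanced

print(medians(gamed_class))     # medians near 168.5 and 3.68
print(medians(holistic_class))  # medians near 163.5 and 3.58 -- both lower
```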

[FN10]. See Klein & Hamilton, supra note 4 (citing some of these non-quantifiable, but important, measures of law school quality); see also Gose, supra note 3, at A65 (discussing a survey that does not use quantifiable factors to rank undergraduate colleges, but instead seeks to measure a college's use of educational practices that encourage learning). One interesting new study, which hit the legal trade papers last October, comes from Thomas M. Cooley Law School. That study ranks Creighton University School of Law as number one in the country--and Harvard as number ninety-three--based on a "value added" measurement: entering credentials versus bar passage rate. See New Law School Study Ranks Creighton at No. 1, Harvard No. 93, L. Wkly. U.S.A., Oct. 5, 1998, at B17. Although I'm not a fan of this study's decision to rank schools, I thought that its attempt to use a value-added approach was at least a slightly more realistic way of evaluating quality.
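A back-of-the-envelope version of such a value-added measure, with invented schools, invented numbers, and a crude stand-in for the study's actual method, might look like this:

```python
# Toy "value added": bar passage relative to what entering credentials
# alone would predict. All schools and numbers are invented.
schools = {
    # name: (median entering credentials on a 0-1 scale, actual bar passage)
    "School P": (0.90, 0.92),
    "School Q": (0.55, 0.78),
}

def value_added(credentials, pass_rate):
    # Crude expectation: predicted passage tracks entering credentials.
    # A real study would estimate this relationship from data.
    predicted = 0.50 + 0.45 * credentials
    return pass_rate - predicted

for name, (credentials, pass_rate) in schools.items():
    # School Q "adds" more value despite weaker entering numbers.
    print(name, round(value_added(credentials, pass_rate), 3))
```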

[FN11]. U.S. News used to include information about starting salaries. Apparently, though, the enormous regional differences in starting salaries made this factor less attractive.

[FN12]. Yes, I know that schools are listed alphabetically beyond the "first tier." But it's still a top-to-bottom ranking system.

[FN13]. See generally Law School Admission Council, The Official Guide to U.S. Law Schools (1999) (providing raw information for ABA-accredited law schools); Section of Legal Educ. & Admissions to the Bar & the Office of the Consultant on Legal Educ. to the American Bar Ass'n, The Official American Bar Association Guide to Approved Law Schools (Rick L. Morgan & Kurt Snyder eds., 2000 ed.) (same).

[FN14]. Rankings Reflect How the World Works, supra note 2, at 7.

[FN15]. For example, the bulletin of the University of Nebraska College of Law explicitly states that admission is not "a function of the numbers," and it lists a variety of non-quantifiable indicia of academic success that can be factored into the admissions decision. University of Neb.-Lincoln, College of Law Bulletin 10 (1998).

[FN16]. This is another hobbyhorse of mine. I fail to see how the ability to take bluebook exams predicts with great reliability whether someone will be a good lawyer. But I'll save that tirade for another essay.

[FN17]. Or, in my case, inverse height.