Monday, June 23, 2014

Alternative Rankings for Law Schools and Some Thoughts on the Impact of Scandals

Over at The Faculty Lounge, Alfred Brophy has a post where he discusses his paper, Ranking Law Schools with Lsats, Employment Outcomes, and Law Review Citations, which was recently posted on SSRN. Here is the abstract:

This paper returns to the perennially favorite topic of ranking law schools. Where U.S. News & World Report includes a wide variety of factors – some of which are criticized as irrelevant to what prospective students care about (or should care about) -- this paper looks to three variables. They are median LSAT score of entering students, which seeks to capture the quality of the student body; the percentage of the graduating students who are employed at 9 months following graduation at full-time, JD required jobs; and the number of citations to each school’s main law review. This paper rank orders each of those variables, then averages those ranks to obtain a new ranking; then it compares those new rankings to the U.S. News & World Report rankings.
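The mechanics of the method described in the abstract are simple: rank the schools on each variable separately, then average the three ranks. A minimal sketch in Python, using made-up figures for three hypothetical schools (not the paper's actual data):

```python
# Sketch of a rank-then-average approach like the one Brophy describes.
# All school names and numbers below are invented for illustration.

def rank(values, reverse=True):
    """Return the rank of each value (1 = best); higher is better by default."""
    order = sorted(values, reverse=reverse)
    return [order.index(v) + 1 for v in values]

schools = ["School A", "School B", "School C"]
lsat = [170, 168, 172]            # median LSAT of entering students
employment = [0.85, 0.90, 0.80]   # full-time, JD-required jobs at 9 months
citations = [1200, 900, 1500]     # citations to the school's main law review

# Rank each variable independently, then average the three ranks per school.
ranks = zip(rank(lsat), rank(employment), rank(citations))
avg_rank = {s: sum(r) / 3 for s, r in zip(schools, ranks)}

# Final ordering: the lowest average rank comes first.
for school in sorted(avg_rank, key=avg_rank.get):
    print(school, round(avg_rank[school], 2))
```

With these invented numbers, School C comes out first despite its weaker employment figure, because it leads on two of the three variables.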
Brophy includes a reordered list of schools' rankings and identifies the schools most affected by the new system. Because he measures gains and losses only by the raw number of positions a school shifts, the biggest movers under the new metrics do not include schools near the top of the US News rankings. Still, Brophy's approach produces a few noteworthy shifts among highly ranked schools. Under his system, Columbia jumps to first from a tie for fourth on the US News rankings; Georgetown ties for 8th rather than 13th; and Chicago drops notably, from a tie for 4th on the US News rankings to a tie for 10th.


Brophy's approach simplifies the ranking process by removing many of the factors that US News considers, including the heavily weighted peer-assessment scores, in which law school rankings are determined in part by schools' reputations among professors, judges, and practitioners. Brophy replaces the peer-assessment score with the number of citations to a school's flagship law journal, noting that this alternative tends to correlate with the peer-assessment score.

I'm not sure whether I agree with Brophy's use of schools' law review citation rates in his ranking system, but he advances some good arguments in its favor. And there is another advantage of looking to law journal citations rather than peer-assessment scores that Brophy does not discuss: the citation approach blunts the impact of scandals on law school rankings. Occasionally, law school administrators do some pretty bad things, and, if publicized, this bad behavior can have a significant, detrimental impact on the students who are stuck with the ensuing ranking decline.

One example of such a scandal involves the University of Illinois College of Law. In 2011, the school announced that the dean of admissions had been falsifying GPA and LSAT data for its students by reporting inflated numbers in an effort to boost the school's rankings. Once this came to light, the school dropped 12 places in the 2012 rankings, and then another 12 in 2013, going from 23rd to 47th on the US News rankings. The school's ranking has leveled off since then, and is now 40th. The initial drop was probably partially due to correct numbers finally being reported. But some of the decrease in 2012 and 2013 was almost certainly due to a lower peer-assessment rank.

In the spring of 2011 I was choosing among law schools, and the University of Illinois was among my top considerations. They had given me a very appealing scholarship offer, and even though I had a nice offer from the University of Iowa, the change of scenery and the higher ranking that Illinois offered led me to decide early in my search that if I were to remain in the Midwest, I would choose Illinois over Iowa. The prestige and sunshine of UCLA won me over in the end, but looking back, I realize that I came very close to enrolling in a school that would be ranked far lower upon my graduation than it had been when I was considering enrolling.

While the admissions dean's conduct was deplorable and while the drop in rankings was unsurprising, I think that the rankings drop that Illinois suffered was unfairly hard on students who chose to enroll in the fall of 2011. Speaking from the perspective of somebody who was choosing a law school at that time, I think that students who chose Illinois made a pretty good decision given the information available, and they would have been completely blindsided by the scandal. Once the news came out, first-year law students faced the difficult choice of transferring or remaining in a law school that would likely suffer a precipitous decline in prestige. Second-year law students were pretty much out of luck.

Brophy's rankings place Illinois at 30th rather than tied for 40th in the US News rankings. I suspect this difference is largely due to his omission of the peer-assessment factor. While administrators who try to fudge the numbers to increase their schools' rankings should be held accountable, I think that a ranking system that avoids disproportionate harm to those schools' students should be encouraged. For that reason, Brophy's rankings deserve consideration and discussion.

UPDATE - 7/3/2014

Brophy recently noted that he has expanded his paper to include all of the schools ranked by US News and is now using two measures of employment. I have not changed the link to his paper because the updated paper appears at the same location, but I thought that this update might be worth noting.
