Yesterday, I blogged about this year's decline in bar exam performance. There, I described the unpleasant situation of California bar examinees who have to read about the nationwide drop in bar exam performance before they have even received word of whether they passed.
Here is an excerpt from Moeser's memo where she addresses this decline:
Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of [Multistate Bar Exam, or] MBE test-takers. This year the number of MBE test-takers fell by five percent. This was not unanticipated: figures from the American Bar Association indicate that first-year law school enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class.) We have been expecting a dip in bar examination numbers as declining law school applications and enrollments worked their way to the law school graduation stage, but the question of the performance of the 2014 graduates was of course an unknown.
Amar argues that Moeser's claim that the July 2014 examinees were "less able" than other groups of students who took the exam was likely a mistake of wording and that a mere decline in law school enrollment cannot adequately explain the drop in exam performance. From Amar's article:
For that reason, Ms. Moeser probably erred (putting aside her choice of language) in trying to offer any explanation for the lower performance; her diagnosis of a “less able” group of takers seems to be, at most, a (limited) diagnosis of (partial) exclusion. In other words, what she knows—or should be able to know—is confined to the fact that the MBE test that was given in 2014 was reliable as compared to prior year tests. Even if this year’s test was no different in substance or administration, Ms. Moeser really has no way of accounting for the lower performance. Certainly her vague implication—that a decrease in the volume of law school applications and graduating students explained the lower score—is open to question. Indeed, a seven-percent reduction in the number of starting law students in the fall of 2011 might suggest that law schools shrank in size rather than lowered their admissions standards. And the comparison of seven-percent fewer incoming students and five-percent fewer MBE takers wouldn’t, without more data, say much. So Ms. Moeser should have said no more than that the test has been examined and validated, and that we need to look elsewhere for an explanation.
In this regard, [Brooklyn Law School] Dean [Nicholas] Allard is correct that the rest of us deserve to know more details about MBE’s “quality control” processes, to use Ms. Moeser’s term. It’s hard to see why more transparency about the internal test-validating data and techniques that the MBE-makers use would not be a good thing.

Amar is referring to this letter by Dean Allard that responded to Moeser's memo. In that letter, Dean Allard pointed out that graduating from an accredited law school takes a great deal of intelligence and work, that it "defies common sense" that students who have completed three years of law school need to spend even more money on a bar exam preparation course, and that "in short, it is not the students, it's the test."
I disagree with some of Amar's and Dean Allard's remarks. Regarding Amar's article, I don't think that Moeser's use of the "less able" label can be interpreted as simply denoting that students performed poorly on the Multistate Bar Exam. Moeser clearly offers the "less able" description as an explanation for the poor performance, so it would not make sense to interpret her explanation in this diplomatic way.
But unlike Amar and Dean Allard, I don't find the "less able" description to be necessarily offensive. Faced with a decline in performance of students across the nation, an obvious potential explanation is that those taking the July 2014 bar exam were less prepared for it than students in other years. This could mean a number of things -- for example: that they were less intelligent, that their law school education was poor preparation, or that they had less time or intellectual energy to devote to preparing for the bar exam (perhaps because they were simultaneously searching for jobs that are increasingly difficult to find).
Take my opinion with a grain of salt, since I am one of these "less able" students, but I think Dean Allard was overly hasty in taking offense. I agree with Amar and Dean Allard that more research into the issue is necessary, and that Moeser may not be able to conclusively claim that the students are the reason for the poor performance on the exam.
But I think that Dean Allard's response opens itself up to criticism -- especially when he points out that students spend thousands of dollars on bar exam prep courses after taking three years of law school courses. This remark highlights a problem with law school, rather than the bar exam: if law school adequately prepared students for the bar exam, then they would not need to take the bar exam prep course. While I have many good things to say about legal education, and while I acknowledge that the situation is indeed a complex one, I think that Dean Allard's response needed to be more measured in order for it to be more credible.
I agree with Amar that more research into the issue is necessary, but I don't think that Moeser was relying solely on the number of students taking the exam in reaching her conclusions. The NCBE tests its MBE questions by including unscored experimental questions on its exams in preparation for following years' exams. I imagine that the questions used on the July 2014 exam were tested and performed comparably to previous years' questions -- which is one reply the NCBE will certainly be able to make to those criticizing its methods.
There is, however, one feature of this year's exam that may count against Moeser's claim that students were the reason for their own poor performance. Moeser mentioned in her memo that civil procedure will be an MBE exam topic starting in February. During my bar exam preparation course, one of the instructors mentioned that the July 2014 MBE may include test questions for civil procedure in light of its inclusion in future exams. While these questions would be unscored, they would be clearly different from other MBE categories. And if students were not adequately warned about the inclusion of civil procedure test questions in a year where civil procedure was not an MBE topic, these test questions could confound students and lead them to waste time and energy on trying to figure out what MBE category the civil procedure questions fit into. I don't know if it is possible to examine how adequately students were prepared for the possibility of the MBE including civil procedure test questions. But if many students were not prepared, this could be one explanation for the July 2014 exam being an outlier.
One might point out that as somebody who took the July 2014 bar exam, I am in a position to potentially confirm whether civil procedure questions were on the MBE. But while you might think that, I couldn't possibly comment due to the dozens of confidentiality agreements I signed before taking that exam. As far as my own experiences with the MBE are concerned, I'm going to remain silent.
In any event, law school enrollment has continued to decline, so the next several years will bring more opportunities to see whether this declining enrollment is indeed the explanation for falling performance on bar exams. More research is necessary, and the data for that research will gradually become available.