A new study of more than 500 Nevada attorneys has found that bar exam scores have virtually no meaningful relationship with how effectively lawyers perform their jobs, raising fundamental questions about whether these high-stakes tests serve their intended purpose of protecting the public from incompetent attorneys.
The research, published in the Journal of Law and Empirical Analysis, represents the first comprehensive examination of whether bar exam performance actually predicts lawyering effectiveness. The findings suggest that the exams may function more as barriers to entry than as reliable indicators of professional competence.
“The bar exam’s use to determine who possesses the minimum competence to practice law should be questioned and subjected to further rigorous study,” the researchers concluded after analyzing the relationship between first-time bar exam scores and professional evaluations from supervisors, peers, and judges.
Minimal Predictive Power
The study, led by researchers from institutions including San Diego State University and UC Law San Francisco, evaluated 524 lawyers admitted to the Nevada Bar between 2014 and 2020. Each attorney’s bar exam performance was compared against ratings of their professional effectiveness using scientifically validated measurement tools developed over decades of research.
The results were striking in their consistency: across all components of the bar exam and all types of evaluators, the relationships between test scores and professional effectiveness were either nonexistent or so small as to lack practical significance.
The largest effect found was a mere 4% increase in effectiveness ratings from judges for attorneys who scored one standard deviation higher on the overall bar exam. Most relationships were even weaker, with some showing no connection at all.
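To put that largest effect in concrete terms, here is a small sketch with hypothetical numbers (the rating scale and baseline mean are assumptions for illustration, not figures from the study):

```python
# Illustrating the study's largest effect: a 4% increase in judge
# effectiveness ratings per one-standard-deviation increase in overall
# bar exam score. The baseline and scale below are assumed, not from
# the study itself.

baseline_rating = 4.0   # assumed mean judge rating on a hypothetical 1-5 scale
effect_per_sd = 0.04    # 4% relative increase per +1 SD on the bar exam

rating_plus_one_sd = baseline_rating * (1 + effect_per_sd)
print(rating_plus_one_sd)  # 4.16 -- a shift of 0.16 on a 5-point scale
```

Under these assumed numbers, scoring a full standard deviation higher on the exam corresponds to less than a fifth of a point on a five-point rating scale.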
Breaking Down the Components
The researchers examined each major component of the Nevada bar exam separately:
- Multistate Bar Examination (MBE): The multiple-choice component explained no more than 10% of variation in effectiveness ratings
- Essay components: Subject-specific essays showed minimal predictive value, explaining just 2% of variation in ratings
- Multistate Performance Test (MPT): This practical skills component explained only 3% of variation in effectiveness ratings
- Ethics essays: Even questions specifically focused on legal ethics showed no meaningful relationship with professional performance
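The "percent of variation explained" figures above are R-squared values, the square of the correlation coefficient. A quick sketch shows how modest the implied correlations are (the component-to-R-squared mapping below is back-calculated from the article's figures and is illustrative only):

```python
# "Variation explained" is R^2, the square of the correlation r.
# Back-calculating |r| from the article's R^2 figures shows how weak
# the underlying score-to-rating relationships are. These values are
# illustrative, taken from the article's summary rather than raw data.

import math

r_squared_values = {"MBE": 0.10, "essays": 0.02, "MPT": 0.03}

for component, r2 in r_squared_values.items():
    r = math.sqrt(r2)  # implied magnitude of the correlation
    print(f"{component}: R^2 = {r2:.2f} -> |r| ~ {r:.2f}")
```

Even the strongest component corresponds to a correlation magnitude of roughly 0.32, conventionally regarded as a weak-to-moderate association.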
“These findings suggest that the bar exam, as it was administered in Nevada, does not result in scores that serve as bases for valid inferences about predicted lawyering effectiveness,” the study authors wrote.
What Makes an Effective Lawyer?
The study used evaluation criteria developed by professors Marjorie Shultz and Sheldon Zedeck, who identified 26 factors crucial to lawyering effectiveness through extensive research involving thousands of attorneys. These factors include analytical reasoning, communication skills, problem-solving ability, and professional relationships.
Importantly, the study found that lawyers’ self-assessments were actually lower than evaluations from supervisors, peers, and judges, suggesting that self-evaluation responses were not inflated. The research also found that approximately half of those who failed the bar exam on their first attempt received effectiveness ratings at or above average, further undermining the exam’s predictive value.
Implications for Reform
The study’s timing is particularly relevant as the legal profession undergoes significant changes. The National Conference of Bar Examiners is developing a “NextGen Bar Examination” scheduled for release in 2026, though it will retain components similar to those of current exams.
Some jurisdictions are already exploring alternatives. Nevada’s Supreme Court has recently considered recommendations for a supervised practice requirement similar to medical residencies, which the study’s findings appear to support.
The research suggests that practical experience may be far more valuable than test performance. The study found that time elapsed since bar passage was positively associated with lawyering effectiveness, indicating that attorneys improve through practice rather than through their initial test scores.
Broader Questions
While the study focused specifically on Nevada, the researchers suggest their findings likely apply more broadly, since most states use similar exam components developed by the same national organization.
The study’s limitations include its focus only on attorneys who eventually passed the bar and entered practice, meaning it cannot assess the potential effectiveness of those who never achieved a passing score. The researchers also noted that effectiveness ratings were concentrated in the upper ranges of the scale, potentially limiting the ability to detect relationships.
Despite these limitations, the researchers argue their findings represent the strongest evidence to date that bar exams may not fulfill their stated purpose of identifying minimally competent attorneys.
“More research is needed, but this study finds that while the bar is serving as a significant barrier to the practice of law, there is little indication that it is a robust indicator of what it takes to be a ‘good’ lawyer,” the authors concluded.
The study “Putting the Bar to the Test: An Examination of the Predictive Validity of Bar Exam Outcomes on Lawyering Effectiveness” was conducted by Jason Scott, Stephen Goggin, Rick Trachok, Jenny Kwon, Sara Gordon, Dean Gould, Fletcher Hiigel, Leah Chan Grinvald, and David Faigman, and published in the Journal of Law and Empirical Analysis. The full study is available here.