Many who defend graduate legal education assume that it deserves deference as the traditional American approach. To the contrary, the requirement of an undergraduate degree before law school admission did not become standard until the 1960s.[1] And while much of the push to make law a graduate degree came from the belief that lawyers needed liberal arts training, more bigoted sentiments also found their way into the debates.
In 1929, Henry Drinker, later the author of a leading treatise on legal ethics, argued that the American Bar Association should require two years of college before law school. He explained why “a two year college course preliminary to the three year course of the law school [is] calculated to produce a better type of lawyer.” As “Chairman of the Committee of the Grievances of the Law Association of Philadelphia,” he had observed that lawyers “that came up out of the gutter and were catapulted into the law, have done the worst things and did not know they were doing wrong.” In particular, “of the men who came before us who had been guilty of professional abuses, an extraordinarily large proportion were Russian Jew boys.” What these lawyers lacked was what Drinker learned in college – not “book learning,” but “the American spirit of fair play.”[2]
Two Carnegie Reports provided the intellectual foundation for those who resisted efforts to require undergraduate education. In contrast to the 2007 Carnegie Report, which sought to reform graduate legal education, the 1921 and 1928 reports authored by Alfred Z. Reed recognized that different levels of legal training were appropriate for different segments of the market for legal services.
While this historical context is of course not dispositive of whether the basic law degree should become an undergraduate degree today, it does suggest that the status quo of graduate legal education should not automatically receive the benefit of the doubt.