Saturday, February 9, 2013

Law review publishing: In search of a useful ranking system

I’ve published ten articles in law reviews, with an eleventh on the way.  Basically, the system works like this: I write an article, submit it to 50–100 different law journals, and wait for offers of publication. Then, after a series of emails with the editors of some of the journals, I have to decide which offer to accept.  (After that, the article goes through a lengthy, and sometimes painful, editing process before it’s eventually published.)  My initial decision on where to publish has typically been guided by the US News rankings of law schools, which, in legal publication circles, are used as a proxy for the quality of a law school’s journal.  For example, the UCLA Law Review is published by the UCLA School of Law, which US News says is the fifteenth best law school in the land.  This means that authors would love to publish in the UCLA Law Review and, as a result, that journal may receive 2,000 or more annual submissions for about 12–15 available publication slots.  As we slide down the US News rankings—say, to the bottom fifty-or-so of our nation’s 200-plus law schools—the journals may receive only a couple hundred submissions for their 12–15 publication slots.

However, this method of choosing among publication offers is deeply flawed.  First, US News ranks law schools, not law journals.  And many argue that this “surviving rump of an otherwise defunct news magazine” doesn’t even do that properly.  Others have even argued that the US News rankings have played a large part in today’s legal education crisis.  But aside from all of that, after my fairly large sample size of ten published articles—by way of comparison, law professors have to publish about four articles to become tenured—I can state one thing with confidence: there is no correlation between a school’s US News rank and the quality of that school’s journal editors and editing process.

For example, I have published in the journals of law schools ranked in the 20s, 50s, 60s, and 70s, as well as a law school ranked #129 (Idaho).  Chief Articles Editor Allison Parker and the other editors and staff of the Idaho Law Review did an outstanding job in the editing and publishing process.  In fact, my experience with the Idaho Law Review may have been the best—or easily among the two or three best—of my ten journal publications.  (My experiences with other journals have ranged from excellent to poor; my biggest complaint is when editors actually insert their own errors—including punctuation errors, grammatical errors, and even more serious substantive errors—into my work.  When it comes to the act of editing, I wish law journal editors would follow this simple rule: less is more.)

To be sure, there are other means by which to choose among publication offers.  Washington & Lee University, for example, ranks journal impact, i.e., how often the journal is cited.  But while that’s interesting, it really doesn’t help me much.  First, it says nothing about the quality of the editing and publication process at that journal.  And second, just because a journal is cited more often, that doesn’t mean that if I publish in that journal, my article will be cited (or even read) more often.  Today, due to electronic research methods, few people sit down to read an issue of a law journal from cover to cover.  Instead, a researcher will search electronically for an article on a specific topic, regardless of the journal in which it is printed.  Therefore, to use a hypothetical example, an article in the Virginia Law Review might get cited a lot.  But that citation count is more a function of the article, not the journal, and would probably be the same had the author published in the Washington Law Review or any other journal.  Further, I suspect that article citation count may even be more a function of the author rather than the article.  That is, some law professors will agree to cite each other’s work in order to get their citation counts up.  It’s a case of “you scratch my back and I’ll scratch yours” in the hyper-competitive, ranking- and status-obsessed world of legal academia.

Given the major flaws in the two primary journal ranking systems, I would like to see a law professor develop a ranking methodology based on authors’ experiences with the publishing journals.  Law professors are already ranking nearly every imaginable thing under the sun—see, for example, here, here, here, here, and here.  And a “law review author ranking” would actually be meaningful.  I would love for a semi-mathematically inclined professor to run with this idea, and conduct an annual survey of authors (nearly all of whom will be his/her fellow law professors) in order to rank their law journal editing and publishing experiences. 

I’ll get the ball rolling.  The categories to be ranked could include: timeliness of the publication (on time = 10 points); time allowed for the author to review edits (two weeks = 10 points); deference to the author’s style (high deference = 10 points); creation of errors during the editing process (no editor-created errors = 10 points); responsiveness to the author’s edits (short response time = 10 points); and quality of the journal’s website (an up-to-date website posting the article = 10 points).  Of course, there are probably a dozen other categories that could be included, but the total number of categories ranked should be few, and the respondents should be guaranteed anonymity, in order to encourage authors to participate.

It is true that law review editors turn over every year, and a new batch takes their place.  This means that a great experience with “Journal A” could easily have been a bad experience had the article been published a year earlier or later.  It is further true that some law professors—especially those seeking tenure—will, by necessity, continue to be slaves to the US News rankings when selecting among publication offers.  However, ranking the journals on the quality of their editing process would still do two important things. 

First, by scoring certain categories, such as whether the editors were deferential to the author’s writing style, authors would be clearly communicating to journal editors what they value in the publication process.  And most editors would likely respond by improving their performance in those areas.  This, of course, would improve the publication experience for all authors, regardless of whether they chose their journal based on this “law review author ranking” or some other system, like the US News or citation impact rankings. 

And second, if a particular journal ranks high, it will likely be a source of pride, which will transfer to the next year’s editorial board.  Similarly, if a particular journal ranks low, that too will be passed on, and will give the next year’s board an incentive to do better than its predecessor.  Remember, rankings are powerful.  Law review editors are students, and some students do drastic, life-ruining things based on rankings, e.g., taking on $150,000 or more in debt to attend a law school ranked in the 20s instead of accepting a full scholarship at a school ranked in the 50s.  Further, in today’s hyper-competitive legal job market, law review editors would love another line on their resume, e.g., “named editor-in-chief of school’s law review and improved 'law review author ranking' from #125 to #10.”

I’ll be the first to contribute to the survey for the new ranking system: I score the Idaho Law Review a “10” in each of the above-named categories.  But until an ambitious professor runs with this idea and then publishes the “law review author ranking” on a fancy, well-funded website, I’m still left wondering how to choose among the offers (that I will hopefully receive) for my newest article, An Alternative to the Wrong-Person Defense.  True, the submission season is very young and I haven’t received my first offer yet, so there’s plenty of time to devise a ranking system.  However, I’ve been mulling something over: I think I’ll rank the journals by their school’s football team’s performance last year.  For my purposes, at least, this seems as good a method as any.  And while I absolutely love the Oregon Ducks, I have to remain dispassionate about this important decision.  Much like the final AP poll from last year, Oregon comes in second. The Alabama Law Review sits atop my ranking.  Roll Tide.


  1. Hi Michael, one point you might consider is that law reviews change yearly. A good review for one board does not really translate to the next year's board, and vice versa. I had two really bad experiences with two different journals because the particular boards had just really screwed it up that year, leaving a huge dog pile for the next board. So my current practice is to request the names of three authors currently in process and I contact those authors to ask the very relevant questions you identify.

    Regards, -bryan camp (Texas Tech)

  2. Bryan, thanks for reading and commenting.

    I did consider the board turnover issue. (You may have read my post on the Tax Prof Blog, which excluded that part in its abridged version of the post.) But I think that a ranking system would put future boards on notice regarding the things that authors value. (While some things like "timeliness" are obvious, many of the categories that I would like to see ranked are perhaps not so obvious to the editors.) Also, if the editors know they're being ranked, they'll want to do a better job in these areas. (Somewhat like new profs being ranked by their students?) I would love to do the survey and ranking myself, but I just don't have the resources.

    I might have to adopt your idea in the meantime. It's far easier to implement, and it gets more to the point by focusing on the SPECIFIC board that would be editing my specific article. (And, I have to confess, it MIGHT be better than my proposed "football performance ranking," above.)

  3. Some law reviews want an assignment of the copyright for the article being published. Often the law review will back down if the author refuses.

    Other law reviews simply want the exclusive right of first publication and a license to publish, reproduce, and distribute the article in any other media.

    Unlike the boards which change yearly, these are policies that usually remain unchanged.

    Law reviews that seek a copyright assignment should receive a scarlet letter, which would give those law reviews reason to change the policy.