The Law Journal Rankings Project helps legal scholars locate and evaluate law journals by subject, country, publication type, or ranking (where available); it also displays journal editorial information and facilitates an author's article submission to those journals.
The site currently ranks journals based on citation data from a 2009-2016 survey period and is updated annually. In April 2018 the site will be updated with data from a 2010-2017 survey period.
Most bar journals, magazines, and newsletters are excluded from this list. Also excluded are law journals that have few English-language articles (except for a few U.S. Spanish-language and Canadian French-language journals). Journals are displayed alphabetically by journal name. If a ranking is selected via the checkboxes in the main menu area, the list sorts in rank order. Clicking the column header of any numeric rank list re-sorts by that rank; clicking the "Journal" header above the journal-titles column re-sorts alphabetically. Journals that have ceased publication, or that have not published at any point during the current survey period, are removed.
General/Specialized, Country, and Student-edited/Peer-edited/Refereed Selections
By selecting the appropriate checkboxes the journal list may be limited to combinations of:
|General, specialized||"General" refers to the main "flagship" journal of an institution (usually a law school), with no subject specialty. In the pull-down Subject menu, a journal may have a "General" classification alongside one or more other subject classifications because the journal tends to publish in certain subject areas|
|Country (U.S., non-U.S., or another specific country)||The country of publication|
|Student Edited, Peer Edited, and Refereed journals||"Student edited" means a student-run journal that does not send articles out for peer review. "Peer edited" means a journal that is edited by professionals in the field. "Refereed" means a journal that routinely sends article submissions on for peer review by members of a diverse professional group. Student-edited or peer-edited journals may also be refereed, in which case the journal is listed as "refereed"|
The pull-down subject list allows the journal lists to be limited to journals that fall within broad subject areas.
Journal Name Search
Using the "Jnl-name words" box, the journal list can be limited to journals whose names contain the entered words. An "AND" connector is presumed between words. Alternate journal names, including name changes, are also searched.
Limiting to journals within a Range of Rankings
Using the "Rank" box, the number of displayed journals can be reduced to the selected range of ranks. A range of rank numbers can be entered using "," or "-", e.g. "31,33,35,37-43,46-50". It's best to look at a full ranking list before limiting it, because rankings are bunched: there may, for example, be four journals ranked 3rd, in which case the next journal is ranked 7th. A range limit of 4-6 would then match nothing in the ranking, and the rank request would be ignored.
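As a rough illustration, the "," and "-" range syntax described above could be parsed along these lines. This is a hypothetical sketch (the site's actual parser is not published, and `parse_rank_range` is an illustrative name):

```python
def parse_rank_range(spec):
    """Parse a rank-range string such as "31,33,35,37-43,46-50"
    into a sorted list of individual rank numbers."""
    ranks = set()
    for part in spec.split(","):
        part = part.strip()
        if "-" in part:
            # A "lo-hi" span expands to every rank in that span.
            lo, hi = part.split("-", 1)
            ranks.update(range(int(lo), int(hi) + 1))
        else:
            ranks.add(int(part))
    return sorted(ranks)
```

For example, `parse_rank_range("31,33,35,37-43,46-50")` expands the two spans and returns the fifteen individual rank numbers; ranks absent from the bunched ranking would simply match no journals.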
The impact-factor weighting determines the relative proportions of impact-factor and total cites in the combined-score calculation for each ranked journal.
Checkboxes next to Journal Names
The journal checkboxes are used to limit the list of journals. If none are checked, the checkboxes are ignored and the output list is governed by the boxes and buttons chosen in the menu. Any checked journal checkboxes further limit the menu-choice output. Note that inconsistent choices produce an empty list: if, for example, "yale" is entered in the menu to limit journals to those with "yale" in their name, and checkboxes are then checked (before clicking "Submit") for journals whose names do not include "yale", zero journals will be listed.
Exclude Online Journals
Checking the "Online-jnls" box will exclude online journals from the displayed list. "Online journal" here means a journal that publishes exclusively electronically and produces no printed issues.
Some journals in the list have not been ranked, i.e. no search has been run against Westlaw's JLR and ALLCASES databases. Some have recently begun publication and will eventually be ranked; others (mostly non-U.S. journals) are listed in order to include their editorial information. Nonranked entries can be excluded from listings by checking the "Nonranked" box.
Information About the Journal
Clicking on the journal name retrieves information about the journal such as its web address and article submission information. The record retrieved will also display links to "OpenURL" resolvers that can attempt to find full-text sources for the journal (whether the source can actually be accessed by the user will depend on licensing restrictions).
Counted citations are those which cite journal volumes published in the preceding eight years. This limit prevents a bias in favor of long-published journals; the study is concerned only with citations to current scholarship. The search results give only the number of citing documents: a citing article or case that cites two or more articles in a legal periodical is counted once. Sources for the citation counts are limited to documents in Westlaw's JLR database (primarily U.S. articles) and Westlaw's ALLCASES database (U.S. federal/state cases). The searches conducted in those databases generally look for the Bluebook format in use in the U.S. (volume journal [page] year), but are flexible in allowing the year to occur within 8 words of the journal name.
The list includes periodicals that began publication after the survey period began. Rank results based on total citation counts are unfair to those periodicals, so whenever a journal recently began publication a warning is supplied next to the periodical name in the form of a parenthetical date such as "(2001- )". Both impact-factor and combined-score rankings make an allowance for how recently the journal began publishing. Legal periodicals that appear to have ceased publication are excluded, even if they published during part of the survey period.
The "Journals" column(s) shows the number of articles that cite to each journal (within our date period) that were found in the full-text Westlaw journals database "Journals and Law Reviews (JLR)". To see what sources are included in the JLR Database see the Westlaw description at http://goo.gl/2G10s. The scope note in that Westlaw description describes the JLR content as, "The JLR database contains documents from law reviews, CLE course materials, and bar journals. A document is an article, a note, a symposium contribution, or other materials published in one of the available periodicals".
The "Cases" column(s) shows the number of cases that cite to each journal (within our date period) that were found in the full-text Westlaw state and federal case database "Federal & State Case Law (ALLCASES)". To see what sources are included in the ALLCASES Database see the Westlaw description at http://goo.gl/iqnkD. The scope note in that Westlaw description describes the ALLCASES content as, decisions from the "U.S. Supreme Court, courts of appeals, former circuit courts, district courts, bankruptcy courts, former Court of Claims, Court of Federal Claims, Tax Court, related federal and territorial courts, military courts, the state courts of all 50 states and the local courts of the District of Columbia."
Comparisons between older and newer annual surveys cannot be made precisely. Although each ranking column covers an identical 8 years, the JLR and ALLCASES databases grow by a few percent in each later period. Thus a small percentage increase in the number of documents citing a journal may be accounted for by an increase in the total number of documents in the database. The composition of the JLR database is also subject to change as full-text periodicals are added or dropped. For example, should Westlaw add a few new Canadian journals to the JLR database, that could have a strong impact on total citation counts to Canadian law journals.
Searches for citing documents (taking the 2002-2009 period as an example) usually look for citations within the full-text articles/cases that have one of the volume numbers published for the journal from 2002 onwards (i.e. where the journal has labeled the issue as 2002..2009), followed immediately by the journal abbreviation/name, and within 8 words a year designation of 2002-2009. A further condition is that any document (case or article) in which such a citation occurs must be dated (in Westlaw's 'date' field) as 2002-2009, and must have been added to the Westlaw JLR or ALLCASES database during the years 2002-2009. Adjust these dates accordingly for other rotated 8-year periods.
A typical looking search in Westlaw would be:
text(65 66 67 68 69 70 71 72 +1 "alb.l.rev." "albany.l.rev." (alb albany +2 "l.rev." "l.r." (l +1 rev review) (law +2 rev review)) /8 2002 2003 2004 2005 2006 2007 2008 2009) & DA(>2001 & <2010) & AD(>2001 & <2010)
Anyone wishing to see the text of the search run for a particular journal in the JLR and ALLCASES databases should send a request to email@example.com
As the searches are full-text searches, they are naturally prone to some error due to variant citation forms in the citing cases/articles. Searches for citation patterns roughly follow the Bluebook format, usually VOLUME JOURNAL ... YEAR. Citations that are not in the usual format for legal citations may not have been found. Effort was made to allow for different forms of journal name citations (e.g. allowing for the ALWD citation format), but not all can be retrieved. The citation counts for citations to non-U.S. periodicals are likely to be less accurate than those for U.S. periodicals because non-U.S. legal citation formats are often severely abbreviated.
Westlaw's treatment of periods and spaces within character strings is difficult to understand and is part of the reason why so many alternate cite forms were used.
Quotes are not used around strings with embedded periods; for example, in looking for "HARV. L. REV.", the search HARV +2 L +1 REV would be used.
After a "word" ending with a period, the word is followed by a +2 as in HARV. +2 L.
After letters with embedded periods or single letters there is no problem with using a +1, as in L +1 REV. A problem occurs, however, if the text incorrectly uses a period after a complete word, such as "Akron. L. Rev."; in that case AKRON +1 L +1 REV would not retrieve the cite. It is therefore better practice to use a +2 after words longer than one character.
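The connector rule above (+2 after words longer than one character, +1 after single letters) can be sketched as a small helper. This is an illustrative sketch only; `westlaw_pattern` is a hypothetical name, and the real searches (as the example query shows) also OR together many alternate cite forms:

```python
def westlaw_pattern(abbreviation):
    """Join the words of a journal abbreviation with Westlaw-style
    proximity connectors: +2 after a word longer than one character,
    +1 after a single letter. Trailing periods are stripped, since
    the connectors replace quoting of embedded-period strings."""
    words = [w.rstrip(".") for w in abbreviation.split()]
    parts = [words[0]]
    for prev, word in zip(words, words[1:]):
        # The connector depends on the length of the *preceding* word.
        parts.append("+2" if len(prev) > 1 else "+1")
        parts.append(word)
    return " ".join(parts)
```

For "Harv. L. Rev." this yields `Harv +2 L +1 Rev`, matching the pattern given above; the looser +2 after "Harv" also tolerates a stray period typed after the full word.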
The combined-score is a composite of each journal's impact-factor and total cites count. The combined-score is, by default, weighted with approximately a third of the weight given to impact-factor and two-thirds given to total cites. The resulting score is then normalized.
The formula for obtaining the combined-score is the addition of the weighted and normalized scores for each of impact-factor (IF) and total cites (TC):
((IF x weight x 100)/highest-IF) + ((TC x (1-weight) x 100)/highest-TC)
The scores are then displayed in the combined-score column as a percentage of the largest score that exists in the retrieved set of journals. So the displayed version of the combined-score is calculated as:
(combined-score/highest-combined-score) x 100
Thus the top-ranked journal(s) in a retrieved set of journals will always have a displayed value of 100 and other journals will have lower numbers in proportion to their ranking calculation score.
Users may alter the default weight (0.33) by entering any decimal number between 0 and 1. Entering "0" makes the combined-score ranking ignore impact-factor and produce a normalized ranking by total cites; entering "1" makes it ignore total cites and produce a normalized ranking by impact-factor. It's recommended that users not change the weighting value unless they want to see a normalized ranking for either impact-factor or total cites alone.
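The weighting and normalization steps above can be put together in a short sketch. This is an illustration of the stated formulas, not the site's actual code; the function name, the input shape, and the one-decimal display rounding are assumptions:

```python
def combined_scores(journals, weight=0.33):
    """Compute displayed combined-scores for a retrieved set of journals.

    `journals` maps journal name -> (impact_factor, total_cites).
    Each raw score weights IF and TC, normalized against the highest
    IF and TC in the set; raw scores are then rescaled so the
    top-ranked journal displays 100.
    """
    highest_if = max(if_ for if_, _ in journals.values())
    highest_tc = max(tc for _, tc in journals.values())
    raw = {
        name: (if_ * weight * 100) / highest_if
              + (tc * (1 - weight) * 100) / highest_tc
        for name, (if_, tc) in journals.items()
    }
    top = max(raw.values())
    return {name: round(score / top * 100, 1) for name, score in raw.items()}
```

With weight 0, the IF term vanishes and the result is a normalized total-cites ranking; with weight 1, it is a normalized impact-factor ranking, as described above.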
Combined-score ranking is based on the idea, proposed by Ronen Perry, that neither ranking by total cites nor ranking by impact-factor is in itself sufficient, and that the two need to be combined. See Ronen Perry, The Relative Value of American Law Reviews: Refinement and Implementation, 39 Conn. L. Rev. 1 (2006), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=897063. The problem in any combined ranking is what weight to give the underlying factors. Perry calculated a weight of 0.577 for impact-factor (and thus 0.423 for total cites) based on the idea that Harvard Law Review and Yale Law Journal have equal prestige: 0.577 is the weight that makes the combined impact-factor and total-cites counts equal for those journals over the 1998-2005 survey period. The default weighting used on this website, however, is 0.33. That value was chosen because it gives Harvard the highest combined rank over each of 13 ranking surveys (1988-1995...2000-2007), with the exception of the 1991-1998 period, when Harvard scored 98 against Yale's 100. The 0.33 weighting also aims to maximize Yale Law Journal's average rank during those same years.
Note that combined-score rankings prior to 1996-2003 (i.e. 1988-1995, ... , 1995-2002) that were used in calculating the 0.33 weighting were calculated only for Harvard Law Review and Yale Law Journal and solely for the purpose of determining this weighting. For anyone interested, the data can be seen at http://lawlib.wlu.edu/LJ/index1995-2008.aspx
In order to compare new journals more fairly with established journals, an adjustment to the combined-score is made for journals which, at the survey date, have been in existence for less than 8 years. For example, a journal that began in 2007 will, for the 2009 ranking, have its total cites multiplied by 7.3 (the extrapolated total does not display in the total-cites column; it is used only in the combined-score formula). The aim is to estimate, from the cites to a journal over its few years of life, how many cites the journal would likely have had if it had been in existence for at least 8 years. The multipliers are as follows (the digit before the parenthetical is the difference between the survey year and the year the journal began): 0(29) 1(29) 2(7.3) 3(3.4) 4(2.3) 5(1.6) 6(1.3).
These multipliers are based on a sample of 3 journals (American Law and Economics Review, Journal of Appellate Practice and Procedure, and Journal of Law and Family Studies), all of which began publication in 1999. The sample looked at how many cumulative cites occurred 2 years, 3 years, etc. after publication, and what multiplier each year would have predicted for the 2006 total. To stay well on the conservative side, the lowest multiplier from the three journals was used. The lowest multiplier in this sample for a journal in its second year of publication was actually 43, but this was felt to be too high a value for the volatile task of predicting an 8-year citation count from a 2-year total, and the value was arbitrarily reduced by 1/3 to a multiplier of 29. The same value is used for a journal in its first year of publication (quite often there are no published cites to new journals in their first year).
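The adjustment can be expressed as a simple lookup, as in the following sketch (the function and table names are illustrative; the multiplier values are those listed above):

```python
# Key: survey year minus the journal's first year of publication.
# Journals 7 or more years old receive no adjustment (multiplier 1).
NEW_JOURNAL_MULTIPLIERS = {0: 29, 1: 29, 2: 7.3, 3: 3.4, 4: 2.3, 5: 1.6, 6: 1.3}

def adjusted_total_cites(total_cites, survey_year, year_began):
    """Extrapolate total cites for journals younger than 8 years.
    The adjusted figure feeds only the combined-score formula and
    is never shown in the total-cites column."""
    age = survey_year - year_began
    return total_cites * NEW_JOURNAL_MULTIPLIERS.get(age, 1)
```

So for the 2009 ranking, a journal begun in 2007 (age 2) has its cites multiplied by 7.3, while a journal begun in 2000 is left unchanged.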
Impact-factor shows the average number of annual citations to articles in each journal (rounded to two decimal places). Impact-factor rankings should be used cautiously, as they are biased against journals that publish a larger number of shorter items, such as book reviews. Nevertheless, if two legal journals have a similar composition of articles, notes, and book reviews, then from an author's viewpoint it's reasonable to compare their impact-factors to see which is the better journal in which to publish. The implication of a similar ranking by total citations but a dissimilar ranking by impact-factor is that the journal ranked lower by impact-factor is publishing some articles of lesser quality, or of less general interest. In preference to using impact-factor alone, the combined-score ranking (a weighting of both impact-factor and total cites) offers a more balanced view of journal ranking.
Impact-factor is calculated by conducting each of the Westlaw searches for citing articles in 8 separate yearly slices, using the same search for each slice except that the added-date field changes: AD(2002), AD(2003), ..., AD(2009). The number of citing articles from each yearly slice of additions to Westlaw is divided by the cumulative number of items that slice is likely to have cited. For example, if the survey period is 2002-2009, the yearly slice of 2002 articles added to Westlaw will be citing 2002 articles, the 2003 slice will be citing 2002 and 2003 articles, and so on, until the 2009 slice, which will be citing articles from 2002 through 2009. Assuming a journal steadily publishes 20 items each year, if the number of citing articles in the 2003 slice is 30, then that year's impact-factor is 30/40=0.75; should the number of citing articles in the 2009 slice be 100, then that year's impact-factor is 100/160=0.62. Then, to throw out the less representative outliers, the median of those values is recorded as the journal's impact-factor (for an 8-year publication range that will usually be the average of the two impact-factors closest to mid-range). In other words, impact-factor is the median of the journal's annual impact-factors, where each annual IF is the number of citing articles added to the JLR database in that year, divided by the number of items the journal published in that year and every earlier year back to the beginning of the survey period.
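The yearly-slice calculation above can be sketched as follows. This is an illustration of the described method, not the site's code; the function name and the dictionary inputs are assumptions:

```python
from statistics import median

def impact_factor(citing_by_slice, published_by_year):
    """Median of a journal's annual impact-factors.

    citing_by_slice[y]   = citing articles added to JLR in year y
    published_by_year[y] = items the journal published in year y
    Years must run from the start of the survey period onward.
    Each annual IF divides that year's citing articles by the
    cumulative items published up to and including that year.
    """
    annual = []
    cumulative_items = 0
    for year in sorted(published_by_year):
        cumulative_items += published_by_year[year]
        annual.append(citing_by_slice[year] / cumulative_items)
    # For an even number of years, statistics.median averages the
    # two values closest to mid-range, as described above.
    return round(median(annual), 2)
```

Using the worked figures above (20 items published per year, 30 citing articles in the 2003 slice), the 2003 annual IF comes out to 30/40 = 0.75 before the median is taken.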
The basic methodological difficulty is determining the number of articles published by each journal for the date period, there being no completely satisfactory and automated method for doing this. Most of the article quantity data (at least for the higher-ranked journals) was obtained from the WilsonWeb Index to Legal Periodicals. "Articles" here means any entry that ILP indexes, including forewords, letters, notes, and book reviews as well as more traditional articles. Note that ILP's inclusion policies have varied over the years: up to 1999, items of fewer than 5 pages were not indexed; through 2002, items of fewer than 2 pages were not indexed; in subsequent years, items of less than half a page were not indexed. Where ILP was unavailable and it was necessary to check journals or volumes in other databases, the next preference was Westlaw (if Westlaw comprehensively added articles for the years needed), followed by Lexis, then Legal Resource Index, then Legal Journals Index (UK), then any other index in which the journal was indexed. Often a manual count was made by physically examining the tables of contents for the journal years needed. Where indexing was not available and a manual count was not feasible, an extrapolation was made from what was known. As these variant sources undoubtedly differ in what they treat as a countable entry, this introduces variability into the counts.
Currency-factor aims to compare journals on how rapidly their articles become cited. It examines a three-year interval looking at how many articles in Westlaw's JLR database, made available during those three years, cite items published by a journal and dated during those same three years. Taking the example of the 2002-2009 survey period; currency-factor is the number of articles added to Westlaw's JLR database in the three-year period of 2002-2004 that cite to volumes of a journal dated during those same three years, divided by the number of items published by the journal during those same years. It would have been desirable to create this index from the final three years of the survey period, but the data on which it's based, being automatically created from annual data collected to calculate impact-factor, is in a form requiring the use of the first three years of each survey period. For any journal that began publication after the beginning of the survey period the three years will be the first three years of the journal's existence.
The Cites per Cost ranking is the average yearly number of cites to the journal divided by the annual US$ cost to U.S. academic libraries. So e.g., a journal with 600 cites per annum and costing $60 would show '10' in the Cites per Cost column.
Journals that are free are ignored for the purpose of this ranking. Strictly they would have an infinite score (cites/0) and sit at the top of the ranking; however, the purpose of this ranking is to present a cost-effectiveness analysis for purchasing decisions, so ranking free journals would counter that purpose. "Average yearly number of cites" is the yearly average of cites as determined by these law journal rankings, and the inclusion methodology (described further above) should be kept in mind. The year the journal began publication is also considered for recently started journals. So the actual calculation is:
YearsPublished = Survey-Year - YearJournalBegan + 1 (if > 8 Then = 8)
Cites per Cost = (Total-Cites / (AnnualCost * YearsPublished)) (rounded to two decimal places)