Preliminary comparison of the performance of the National Library of Medicine’s systematic review publication type and the sensitive clinical queries filter for systematic reviews in PubMed


  • Tamara Navarro-Ruan Research Coordinator, Health Information Research Unit, Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, Ontario, Canada
  • R. Brian Haynes Professor Emeritus, Health Information Research Unit, Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, Ontario, Canada



Keywords: information retrieval, evidence-based medicine, systematic reviews


Objective: The National Library of Medicine (NLM) introduced a "publication type" for systematic reviews to facilitate searches for systematic reviews (SRs). Clinical queries (CQs), by contrast, are validated search strategies designed to retrieve scientifically sound, clinically relevant original and review articles from biomedical literature databases. We compared the retrieval performance of the SR publication type (SR[pt]) against the most sensitive CQ for systematic review articles (CQrs) in PubMed.

Methods: We ran date-limited searches with SR[pt] and CQrs and compared their relative yields of articles and SRs, focusing on articles retrieved by one filter but not the other (SR[pt] NOT CQrs, and CQrs NOT SR[pt]). Random samples of the articles retrieved in each comparison were examined for SRs until a consistent pattern became evident.
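The three comparison groups in the Methods correspond to simple set operations over the PMIDs each filter retrieves. A minimal sketch, using hypothetical PMIDs (not actual retrieval results from this study):

```python
# Hypothetical PMID sets standing in for the two filters' retrievals.
sr_pt = {101, 102, 103, 104}       # articles retrieved by SR[pt]
cqrs = {103, 104, 105, 106, 107}   # articles retrieved by CQrs

# The three groups compared in the Methods:
sr_pt_not_cqrs = sr_pt - cqrs      # SR[pt] NOT CQrs
cqrs_not_sr_pt = cqrs - sr_pt      # CQrs NOT SR[pt]
both = sr_pt & cqrs                # CQrs AND SR[pt]

print(sorted(sr_pt_not_cqrs))      # [101, 102]
print(sorted(cqrs_not_sr_pt))      # [105, 106, 107]
print(sorted(both))                # [103, 104]
```

In PubMed itself, the same partition is expressed with the Boolean operators NOT and AND between the two search strings.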

Results: For SR[pt] NOT CQrs, the yield was relatively low in quantity but rich in quality, with 79% of the articles being SRs. For CQrs NOT SR[pt], the yield was high in quantity but low in quality, with only 8% being SRs. For CQrs AND SR[pt], the quality was highest, with 92% being SRs.

Conclusions: We found that SR[pt] had high precision and specificity for SRs but low recall (sensitivity), whereas CQrs had much higher recall. Combining the filters (SR[pt] OR CQrs) added valid SRs to the CQrs yield at low cost (i.e., few additional non-SRs). For searches that are intended to be exhaustive for SRs, SR[pt] can therefore be added to existing sensitive search filters.
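The "low cost" claim can be illustrated with a back-of-envelope calculation using the proportion reported in the Results (79% of SR[pt] NOT CQrs articles were true SRs); the absolute yield below is hypothetical, chosen only for illustration:

```python
# Proportion of true SRs among SR[pt] NOT CQrs articles, as reported in Results.
p_sr = 0.79

# Hypothetical count of articles that SR[pt] adds beyond CQrs.
extra_yield = 1000

added_srs = round(extra_yield * p_sr)     # valid SRs gained by OR-ing in SR[pt]
added_non_srs = extra_yield - added_srs   # non-SRs added alongside them

print(added_srs, added_non_srs)           # 790 210
```

Because the SR[pt] NOT CQrs group is small relative to the CQrs yield, the non-SRs it contributes dilute the combined retrieval only slightly.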








Original Investigation