Case Report


SPI-Hub™: a gateway to scholarly publishing information


Taneya Y. Koonce, MSLS, MPH, Mallory N. Blasingame, MA, MSIS, Jerry Zhao, MS, MLIS, Annette M. Williams, MLS, Jing Su, MD, MS, Spencer J. DesAutels, MLIS, Dario A. Giuse, Dr. Ing., MS, FACMI, John D. Clark, MS, Zachary E. Fox, MSIS, Nunzia Bettinsoli Giuse, MD, MLS, FACMI, FMLA


doi: http://dx.doi.org/10.5195/jmla.2020.815

Received 01 August 2019; Accepted 01 October 2019


ABSTRACT

Background

Advances in the health sciences rely on sharing research and data through publication. Information professionals are often asked to contribute their knowledge to assist clinicians and researchers in selecting journals for publication. The authors recognized an opportunity to build a decision support tool, SPI-Hub: Scholarly Publishing Information Hub™, that captures the team’s collective knowledge of the publishing industry while carefully preserving the quality of service.

Case Presentation

SPI-Hub’s decision support functionality relies on a data framework that describes journal publication policies and practices through a newly designed metadata structure, the Knowledge Management Journal Record™. Metadata fields are populated through a semi-automated process that uses custom programming to access content from multiple sources. Each record includes 25 metadata fields representing best publishing practices. Currently, the database includes more than 24,000 health sciences journal records. To accurately estimate the resources needed for both completion and future maintenance of the project, the team conducted an internal study to assess the time required to complete records at different stages of automation.

Conclusions

The journal decision support tool, SPI-Hub, provides an opportunity to assess publication practices by compiling data from a variety of sources in a single location. Automated and semi-automated approaches have effectively reduced the time needed for data collection. Through a comprehensive knowledge management framework and the incorporation of multiple quality points specific to each journal, SPI-Hub provides prospective users with both recommendations for publication and holistic assessment of the trustworthiness of journals in which to publish research and acquire trusted knowledge.

BACKGROUND

Journal trustworthiness and rigor have been much discussed since the transition to electronic publishing in the 1990s [1, 2]. The broadening of access to the literature, concomitant with the ability to read full-text articles online and on demand, brought an impetus to remove additional barriers posed by licensing and permissions restrictions through open access [3]. A recent analysis has found that open access publications constitute “at least 28% of the scholarly literature” across all disciplines [4], and many sponsors now require that authors make grant-supported findings openly available as a condition of funding to ensure barrier-free dissemination of research [5–10]. While the benefits of open access are significant [4, 11, 12], the opening of the academic publishing market has also led to unintended consequences.

The shift in the payment model, whereby funding often derives from author-paid article processing charges, has created an opportunity for players who are less driven by standards to earn a profit, potentially using misleading or non-transparent practices [13, 14]. The exploitation of the pay-for-publication model threatens the integrity of the scientific communication process and confounds the ability to assess journal quality [15–17]. A 2019 study reported that more than 150 systematic reviews and meta-analyses used content from a biomedical publisher against whom the Federal Trade Commission took legal action for deceptive business practices [18], making it impossible to tell whether ethical standards were maintained or peer review was performed. Researchers must be aware of journal assessment methods, both to avoid using literature of unknown, and possibly low, quality and to avoid sending high-quality research to low-quality journals, which may result in reputational harm, low discoverability, and potential disappearance from the academic record [19, 20].

Long-established bibliographic databases, such as Ulrichsweb, OCLC’s WorldCat, and the National Library of Medicine (NLM) Catalog [21–23], offer objective, fact-based descriptive journal metadata. However, such databases represent a journal at a single point in time; obtaining unequivocal, current descriptive information requires consulting journal and/or publisher websites directly.

To assist authors in making decisions about where to publish, several organizations have issued guidance and checklists, including Think. Check. Submit. [24], the Directory of Open Access Journals [25], and the World Association of Medical Editors algorithm [26]. Consistent among these assessment tools is an emphasis on considering multiple journal characteristics to assess a publication’s approach to editorial practices and commitment to transparency. Weighing multiple journal characteristics is key, as “one-stop” approaches to identifying whether a journal is “legitimate” have proved elusive and controversial [27–29], and reliance on a single element, such as a journal’s inclusion in a specific citation or full-text database, can be misleading [30].

However, finding each component of information for multiple journals can be a time-consuming and daunting process. Several tools attempt to automate journal selection for potential authors and facilitate quick discovery of relevant journals for publication based on scope and other elements such as impact metrics [31–41]. In the authors’ experience, such tools often have one or more shortcomings: lack of details that specifically indicate a journal’s publication rigor and transparency as defined by the previously mentioned guidelines; journal coverage limited to a specific publisher; commercial affiliations with companies specializing in manuscript preparation and editing services; availability only via subscription; and subjective assessments of journal quality.

Evaluating quality and validity in the increasingly complex world of publishing is a clear challenge. In our experience at the Center for Knowledge Management at Vanderbilt University Medical Center, information specialists are often called upon to assist clinicians and researchers in identifying where to publish their research and, in the process, to steer them away from journals that lack rigor. As requests for assistance with journal investigation and selection have become more frequent, our team has recognized the need to scale the process through an innovative approach to automating its information specialists’ collective knowledge of the publishing industry. Making the process scalable allows us to provide this important service while freeing our team’s limited resources to address other challenges. Knowledge management approaches, such as knowledge curation, data organization, and content maintenance, can be directly applied to help guide clinicians and researchers through this dynamic landscape.

Our team’s experience applying knowledge management strategies to inform clinical decision support [42, 43] was foundational as we developed our decision support tool for identifying journals and evaluating their transparency and rigor: the SPI-Hub: Scholarly Publishing Information Hub™. The tool’s features and supporting metadata infrastructure, which we have coined the “Knowledge Management Journal Record™” [44], have facilitated the appraisal of journal transparency, informing authors’ decision-making about publication venues as well as assisting clinicians and researchers who need to critically evaluate journals [45].

CASE PRESENTATION

Knowledge Management Journal Record™

Our center’s approach to identifying where to publish manuscripts has traditionally involved manually capturing multiple data points to generate a snapshot of a journal’s scope and publishing policies. We have leveraged our manual process, combined with best practices from the aforementioned criteria and checklists, to define the metadata for the Knowledge Management Journal Record. The record focuses on multiple journal rigor and quality data points that can support informed decision-making for all journal types and avoids prescribing subjective assessments. The Knowledge Management Journal Record is intentionally designed to present users with impartial, current data about journals in as objective a manner as possible, allowing users to judge which elements of the journal record are most important to them.

The current fields in the Knowledge Management Journal Record have been selected based on a comprehensive review of publication guidelines and standards and the ease with which fields could be captured in a structured and/or semi-automated manner. The fields are organized in four sections: “General Information” (e.g., publication frequency, publication start year); “Metrics & Indexing” (e.g., MEDLINE indexing status); “Publication Policies” (e.g., Committee on Publication Ethics membership status, archiving status); and “Open Access” (e.g., Directory of Open Access Journals inclusion status, Creative Commons licenses offered). Supplemental Appendix A provides a detailed description, rationale, and data source for each of the twenty-five fields.
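
To make the record structure concrete, the following minimal Python sketch models the four sections as simple data classes. It is an illustration only: field names beyond the examples cited above are hypothetical, and the actual twenty-five-field schema, with rationale and data sources, is defined in supplemental Appendix A.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class GeneralInformation:
        title: str
        publication_frequency: Optional[str] = None  # e.g., "monthly"
        publication_start_year: Optional[int] = None

    @dataclass
    class MetricsAndIndexing:
        medline_indexed: Optional[bool] = None

    @dataclass
    class PublicationPolicies:
        cope_member: Optional[bool] = None  # Committee on Publication Ethics
        archiving_status: Optional[str] = None

    @dataclass
    class OpenAccess:
        doaj_included: Optional[bool] = None  # Directory of Open Access Journals
        cc_licenses: list = field(default_factory=list)  # Creative Commons licenses offered

    @dataclass
    class KnowledgeManagementJournalRecord:
        general: GeneralInformation
        metrics_and_indexing: MetricsAndIndexing
        publication_policies: PublicationPolicies
        open_access: OpenAccess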

SPI-Hub: Scholarly Publishing Information Hub™ features

Leveraging the Knowledge Management Journal Record as its core infrastructure, we created SPI-Hub as a decision support tool for journal identification and assessment. SPI-Hub currently includes three primary functionalities: “Search by Topic,” “Search by Journal,” and “Search by Author.” Links to publicly available journal suggestion and selection resources are also provided in a “Resources” tab for prospective users to consult as a complement to SPI-Hub’s information. A “Contact Us” page allows users to provide feedback and suggest journals to add to the SPI-Hub database. The “Contact Us” page also includes a Frequently Asked Questions tab (supplemental Appendix B), which answers questions received through user feedback, including suggestions we are working to address in the next version of SPI-Hub. This page will be updated on an ongoing basis as suggestions are implemented and we continue to expand and refine SPI-Hub.

Search by Topic

To help prospective authors find journals that publish works on a given topic, the tool offers a multifaceted search against PubMed. Using the Entrez Programming Utilities interface (E-utilities) provided by the National Center for Biotechnology Information (NCBI), we first recommend subject keywords from the PubMed autocompleter [46] for user search terms; free text keywords are also allowed. The NCBI E-utilities interface is used to retrieve PubMed citations and abstracts inclusive of all matching terms; search results are then aggregated at the journal level; and a ranked list of active journals publishing on the user’s submitted topic is generated.
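
A minimal Python sketch of this flow, using NCBI’s documented esearch and esummary E-utilities endpoints, is shown below. The function name journals_for_topic and the journal-level tally are illustrative simplifications of SPI-Hub’s actual pipeline, which additionally restricts results to active journals and applies the ranking described next.

    from collections import Counter
    import requests

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

    def journals_for_topic(terms, retmax=200):
        """Tally the journals of PubMed citations matching all search terms."""
        query = " AND ".join(terms)
        ids = requests.get(
            f"{EUTILS}/esearch.fcgi",
            params={"db": "pubmed", "term": query,
                    "retmax": retmax, "retmode": "json"},
        ).json()["esearchresult"]["idlist"]
        counts = Counter()
        if ids:
            result = requests.get(
                f"{EUTILS}/esummary.fcgi",
                params={"db": "pubmed", "id": ",".join(ids), "retmode": "json"},
            ).json()["result"]
            for pmid in result["uids"]:
                counts[result[pmid]["fulljournalname"]] += 1
        return counts

    print(journals_for_topic(["precision medicine", "machine learning"]).most_common(5))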

The ranking uses a weighted algorithm that is largely informed by the team’s expertise and knowledge of authorship rules and regulations, publication standards, and the publishing industry. Results are displayed in two columns: one column shows the default results ranking, and a second column adds citation-based impact metrics to the weighting, because impact metrics are often used by authors as a criterion for journal selection but have known limitations as indicators of journal quality [47–49]. Once results are returned, users can view the record of each journal or, alternatively, select up to three journals for a comparison view. Information buttons are included throughout the record to provide brief field definitions that aid users in assessing the information in any specific metadata field. Figures 1, 2, and 3 show an actual search, recently conducted using SPI-Hub for an informatics faculty member, to identify journals to which to submit a research article in the areas of “precision medicine,” “decision support,” “machine learning,” and “electronic health records.”
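
As an illustration only, the following sketch shows one way such a two-column ranking could be computed: the default column orders journals by matching-citation count, while the second column blends in a normalized citation-based impact metric. The function names and weights (w_count, w_impact) are hypothetical; the article states only that SPI-Hub’s actual algorithm is informed by the team’s expertise.

    def rank_default(counts):
        """Order journals by number of matching citations (descending)."""
        return sorted(counts, key=counts.get, reverse=True)

    def rank_with_impact(counts, impact, w_count=0.7, w_impact=0.3):
        """Blend normalized citation counts with a normalized impact metric.

        The weights here are hypothetical placeholders, not SPI-Hub's values.
        """
        max_count = max(counts.values(), default=1)
        max_impact = max(impact.values(), default=0.0) or 1.0

        def score(journal):
            return (w_count * counts[journal] / max_count
                    + w_impact * impact.get(journal, 0.0) / max_impact)

        return sorted(counts, key=score, reverse=True)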

 

Figure 1 Example Search by Topic

Figure 2 Journal results from Search by Topic option

Figure 3 Example journal comparison screen

Search by Journal Name

The journal name search allows users to find detailed information about a specific journal. This function searches across all names and alternate titles of a journal in our database and utilizes an autocompleter for convenience. Upon a successful match, SPI-Hub retrieves the Knowledge Management Journal Record entry for the journal (supplemental Appendix C).
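
A minimal sketch of this kind of lookup appears below: an in-memory index maps every title variant to its canonical journal name, and a prefix match drives the autocompleter. The index and sample data are illustrative stand-ins for SPI-Hub’s journal database.

    def build_title_index(journals):
        """Map each lowercased title variant to its canonical journal name."""
        index = []
        for canonical, alternates in journals.items():
            for title in [canonical, *alternates]:
                index.append((title.lower(), canonical))
        return index

    def autocomplete(index, prefix, limit=10):
        prefix = prefix.lower()
        matches = {canonical for title, canonical in index
                   if title.startswith(prefix)}
        return sorted(matches)[:limit]

    index = build_title_index({
        "Journal of the Medical Library Association": ["JMLA", "J Med Libr Assoc"],
    })
    print(autocomplete(index, "j med"))  # matches via the alternate title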

Search by Author

The author search functionality provides another mechanism for identifying potential publication venues by allowing the user to quickly view the scholarly journals in which colleagues working in similar areas of interest and research have published. The interface allows the user to input links to structured citation lists that may constitute an online curriculum vitae: an ORCID identifier, an NCBI My Bibliography link, or the public uniform resource locator (URL) of a Zotero personal library or group [50–52]. This selection was informed both by prevalence of use [53–55] and by the provision of citation information in structured formats that can be parsed for comparison to journals in SPI-Hub. Once journal matches are identified, the tool returns a list of hyperlinks to SPI-Hub records (supplemental Appendix D).
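
As one illustration, the sketch below handles the ORCID case: it pulls an author’s works from ORCID’s public v3.0 API and collects the journal titles, which SPI-Hub would then match against its records. Error handling is omitted, and the matching step is reduced to exact set intersection, which is far simpler than a production implementation would need.

    import requests

    def journals_from_orcid(orcid_id):
        """Collect journal titles from the works on a public ORCID record."""
        resp = requests.get(
            f"https://pub.orcid.org/v3.0/{orcid_id}/works",
            headers={"Accept": "application/json"},
        )
        resp.raise_for_status()
        titles = set()
        for group in resp.json().get("group", []):
            for summary in group.get("work-summary", []):
                journal = summary.get("journal-title")
                if journal and journal.get("value"):
                    titles.add(journal["value"])
        return titles

    def match_to_known_journals(titles, known_journals):
        # Exact matching for illustration; a production system would
        # normalize titles and search alternate names as well.
        return titles & known_journals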

Data sources, automation, and verification

To initially populate the database, we downloaded the “List of All Journals Cited in PubMed” [56] and used the NCBI E-utilities to access the NLM Catalog and extract multiple data elements for each Knowledge Management Journal Record. For more comprehensive coverage, additional biomedical journals that were not indexed in PubMed were added based on our team’s knowledge of publication venues, gathered from email solicitations, online news articles, and social media postings, with emphasis on the publishers with the largest number of journals [57]. We supplemented the information downloaded from NLM by extracting data from multiple external sources via application programming interfaces (APIs) and matching the data to SPI-Hub records (e.g., by comparing International Standard Serial Numbers [ISSNs]) so that fields could be populated accordingly.
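
The ISSN-matching step might look like the following minimal sketch, in which rows from an external source are folded into existing journal records keyed by normalized ISSN. The record shapes and field names are illustrative assumptions, not SPI-Hub’s actual schema.

    def normalize_issn(issn):
        """Normalize '15365050' and '1536-5050' to one comparable form."""
        digits = issn.replace("-", "").strip().upper()
        return f"{digits[:4]}-{digits[4:]}" if len(digits) == 8 else digits

    def merge_by_issn(records, external_rows):
        """Fold rows from an external source into records keyed by ISSN.

        Returns the number of records updated."""
        updated = 0
        for row in external_rows:
            key = normalize_issn(row.get("issn", ""))
            if key in records:
                # Copy every field except the join key into the record.
                records[key].update(
                    {k: v for k, v in row.items() if k != "issn"})
                updated += 1
        return updated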

Based on patterns observed through manual completion of over 2,000 records, some fields are populated using information from publisher websites through semi-automated methods and manual review. When possible, publisher-level policies are established that apply to multiple journals. For example, a publisher may state that all of their journals undergo double-blind peer review, which enables us to create a standardized message for every journal produced by that publisher. Automated data accuracy and integrity checks are run periodically and used to update the database as new information becomes available from data sources.
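
The publisher-level inheritance described above can be sketched as follows; the policy table and data shapes are hypothetical, shown only to illustrate how one publisher statement can populate a field across all of that publisher’s journals.

    # Hypothetical publisher-level policy table; values are illustrative.
    PUBLISHER_POLICIES = {
        "Example Publisher": {"peer_review": "double-blind"},
    }

    def apply_publisher_policies(journal_records):
        """Fill unset journal fields from the journal's publisher policy."""
        for record in journal_records:
            policy = PUBLISHER_POLICIES.get(record.get("publisher", ""), {})
            for field_name, value in policy.items():
                record.setdefault(field_name, value)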

Status of implementation

SPI-Hub includes approximately 24,000 currently active journal titles. Record completion is ongoing, with priority given to the journals in which our institution’s authors have most commonly published, as well as to journals from the largest publishers, because automation of publisher-specific data allows those records to be completed rapidly. An internal study conducted to assess resource allocation for project completion and future maintenance, covering 4 distinct stages of automation, showed that each successive stage reduced the average time needed to fully complete journal records (supplemental Appendix E).

The last stage of automation, which is applicable to journals from large publishers, requires manual intervention for only 3 of the 25 metadata fields and reduces the time to complete an entire record from 12.16 to 1.62 minutes, a reduction of approximately 87%. With the introduction of automation, variances and errors were also greatly reduced, resulting in less time needed for quality control during the data acquisition phase of the project. To date, more than 16,100 records are fully completed in our database, with the remainder at least 50% complete.

DISCUSSION

SPI-Hub is a decision support application that aids researchers both in identifying journals in which to publish and in reviewing journal transparency and rigor. A user can quickly review all information in a Knowledge Management Journal Record to gain a holistic view of a single journal. By comparing two or more records, a user can identify and evaluate differences. For example, the integration of data from different sources into the record allows users to distinguish between a journal’s self-reported impact metrics and a verifiable Journal Citation Reports (JCR) impact factor [58–60] and to identify any discrepancies, while also reporting, when applicable, the existence of other impact metrics such as SCImago Journal Rank or CiteScore. Other fields, such as peer review and open access policies, rely on a journal’s self-reported practices. While the tool accurately reflects a journal’s stated peer-review policy, other methods may be required to fully assess the quality of peer review [61]. Results of test searches (Figures 1–3) closely match the types of results and feedback that the team provides to users through its manual process, demonstrating the potential value of SPI-Hub in assisting information professionals.

Our automation study demonstrates the efficiency of the automated and semi-automated techniques applied to SPI-Hub. Importantly, it shows that in our current workflow, the time needed to complete Knowledge Management Journal Records can be considerably shorter than with a manual process. These findings have important implications, as the significant reduction in manual effort improves the feasibility of keeping the information current, which will be key to the long-term sustainability of SPI-Hub.

As we further develop SPI-Hub, a robust maintenance strategy is also being implemented; it will be key to the tool’s ongoing success and usefulness. This strategy includes periodic assessment of changes and updates to journal websites and third-party data sources alike. Maintenance, with all of its challenges, requires an ongoing, significant behind-the-scenes effort because journal details (e.g., publisher, publication frequency, indexing status in biomedical databases) change over time and data sources change their structure. For example, the Committee on Publication Ethics’ recent change to how its membership data are searched has already necessitated an update to our automated data gathering.

While SPI-Hub offers clear benefit as a decision support aid for journal identification and selection, there are two important limitations to report. First, while much progress has been made in the automation of data collection, more work is needed to fully automate the process. Second, at this stage, no effort has been made by the team to collect feedback from papers’ authors and peer reviewers about their experience working with specific journals or publishers, which could provide additional information about whether the journal or publisher actually adheres to its stated policies.

Our approach offers the benefit of an open and transparent process: SPI-Hub makes every effort to provide unbiased, factual information by which users can perform journal quality assessments. All stakeholders recognize how critical it is that researchers and individuals who are seeking journals in which to publish are able to assess their rigor and transparency. Through a comprehensive knowledge management framework and the incorporation of multiple quality points specific to each journal, SPI-Hub provides an opportunity for holistic assessment of the trustworthiness of journals in which to publish research and acquire trusted knowledge.

Through a series of planned rollouts, beta testing, and collection of anonymous user feedback (supplemental Appendix F), SPI-Hub has been under a process of constant improvement and refinement. Planned changes to the system, informed by this process, are being communicated to our users on an ongoing basis through a regularly updated Frequently Asked Questions page (supplemental Appendix B). At the conclusion of this phase of rapid refinement, SPI-Hub will be released to the general public and undergo a formal evaluation.

ACKNOWLEDGMENTS

The authors acknowledge Marcia Epelbaum, Elizabeth Frakes, AHIP, Sheila Kusnoor, Patricia Lee, and Helen Naylor for their valuable feedback and contributions to SPI-Hub data entry.

SUPPLEMENTAL FILES

Appendix A: Knowledge Management Journal Record™ rationale and data sources
Appendix B: SPI-Hub™ Frequently Asked Questions (FAQ) feature
Appendix C: Sample Knowledge Management Journal Record™
Appendix D: Search by Author functionality
Appendix E: Automation impact study
Appendix F: SPI-Hub™ user evaluation

REFERENCES

1 Kircz JG. New practices for electronic publishing: how to maintain quality and guarantee integrity. Proceedings of the Second International Council for Science (ICSU)-United Nations Educational, Scientific and Cultural Organization (UNESCO) International Conference on Electronic Publishing in Science; UNESCO House, Paris; 20–23 Feb 2001. Amsterdam: International Council for Science; 20 Jun 2001. 8 p. (Available from: <http://eos.wdcb.ru/eps2/unesco.tex/pdf/kirczfin.pdf>. [cited 7 Feb 2020].)

2 Smith R. Electronic publishing in science. BMJ. 2001 Mar 17;322(7287):627–9. DOI: http://dx.doi.org/10.1136/bmj.322.7287.627.

3 Suber P. Open access. Cambridge, MA: MIT Press; 2012. (Available from: <https://mitpress.mit.edu/books/open-access>. [cited 7 Feb 2020].)

4 Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, Farley A, West J, Haustein S. The state of OA: a large-scale analysis of the prevalence and impact of open access articles. PeerJ. 2018 Feb 13;6:e4375. DOI: http://dx.doi.org/10.7717/peerj.4375.

5 Science Europe. Plan S: making full and immediate open access a reality [Internet]. Brussels, Belgium: Science Europe; 2020 [cited 17 Oct 2019]. <https://www.coalition-s.org/>.

6 National Institutes of Health (NIH). NIH public access policy details [Internet]. Bethesda, MD: The Institutes [rev. 25 Mar 2016; cited 17 Oct 2019]. <https://publicaccess.nih.gov/policy.htm>.

7 Research Councils UK. RCUK policy on open access and supporting guidance [Internet]. Swindon, UK: The Councils; 8 Apr 2013 [cited 17 Oct 2019]. <https://www.ukri.org/files/legacy/documents/rcukopenaccesspolicy-pdf/>.

8 Wellcome Trust. Open access policy [Internet]. London, UK: The Trust; [cited 17 Oct 2019]. <https://wellcome.ac.uk/funding/guidance/open-access-policy>.

9 Bill & Melinda Gates Foundation. Bill & Melinda Gates Foundation open access policy [Internet]. Seattle, WA: The Foundation [cited 17 Oct 2019]. <https://www.gatesfoundation.org/How-We-Work/General-Information/Open-Access-Policy>.

10 National Science Foundation. Today’s data, tomorrow’s discoveries: increasing access to the results of research funded by the National Science Foundation [Internet]. Alexandria, VA: 18 Mar 2015 [cited 17 Oct 2019]. <https://www.nsf.gov/pubs/2015/nsf15052/nsf15052.pdf>.

11 Lawrence S. Free online availability substantially increases a paper’s impact. Nature. 2001 May 31;411(6837):521. DOI: http://dx.doi.org/10.1038/35079151.

12 McCabe MJ, Snyder CM. Identifying the effect of open access on citations using a panel of science journals. Econ Inq. 2014 Oct;52(4):1284–300. DOI: http://dx.doi.org/10.1111/ecin.12064.

13 Kolata G. The price for ‘predatory’ publishing? $50 million [Internet]. N Y Times. 2019 Apr 3:Science [cited 17 Oct 2019]. <https://www.nytimes.com/2019/04/03/science/predatory-journals-ftc-omics.html>.

14 Cobey KD, Grudniewicz A, Lalu MM, Rice DB, Raffoul H, Moher D. Knowledge and motivations of researchers publishing in presumed predatory journals: a survey. BMJ Open. 2019 Mar 23;9(3):e026516. DOI: http://dx.doi.org/10.1136/bmjopen-2018-026516.

15 Bohannon J. Who’s afraid of peer review? Science. 2013 Oct 4;342(6154):60–5. DOI: http://dx.doi.org/10.1126/science.342.6154.60.

16 Moher D, Shamseer L, Cobey KD, Lalu MM, Galipeau J, Avey MT, Ahmadzai N, Alabousi M, Barbeau P, Beck A, Daniel R, Frank R, Ghannad M, Hamel C, Hersi M, Hutton B, Isupov I, McGrath TA, McInnes MDF, Page MJ, Pratt M, Pussegoda K, Shea B, Srivastava A, Stevens A, Thavorn K, van Katwyk S, Ward R, Wolfe D, Yazdi F, Yu AM, Ziai H. Stop this waste of people, animals and money. Nature. 2017 Sep 7;549(7670):23–5. DOI: http://dx.doi.org/10.1038/549023a.

17 Cobey KD, Lalu MM, Skidmore B, Ahmadzai N, Grudniewicz A, Moher D. What is a predatory journal? a scoping review. F1000Res. 2018 Jul 4;7:1001 [rev. 2018 Jan 1]. DOI: http://dx.doi.org/10.12688/f1000research.15256.2.

18 Ross-White A, Godfrey CM, Sears KA, Wilson R. Predatory publications in evidence syntheses. J Med Libr Assoc. 2019 Jan;107(1):57–61. DOI: http://dx.doi.org/10.5195/jmla.2019.491.

19 Walters DJ, ed. Urgent action needed to preserve scholarly electronic journals [Internet]. New York, NY: Andrew W. Mellon Foundation and Association of Research Libraries; 13 Sep 2005 [cited 17 Oct 2019]. <https://www.arl.org/resources/urgent-action-needed-to-preserve-scholarly-electronic-journals/>.

20 Burnhill P. Tales from the Keepers Registry: serial issues about archiving & the web. Serials Rev. 2013 Dec;39(1):3–20. DOI: http://dx.doi.org/10.1080/00987913.2013.10765481.

21 ProQuest. Ulrichsweb global serials directory [Internet]. Ann Arbor, MI: ProQuest; 2019 [cited 17 Oct 2019]. <http://ulrichsweb.serialssolutions.com>.

22 OCLC. WorldCat [Internet]. Dublin, OH: OCLC; 2001–2020 [cited 17 Oct 2019]. <https://www.worldcat.org/>.

23 US National Library of Medicine. NLM catalog [Internet]. Bethesda, MD: National Center for Biotechnology Information, The Library [cited 17 Oct 2019]. <https://www.ncbi.nlm.nih.gov/nlmcatalog/>.

24 Think. Check. Submit. [Internet]. Think. Check. Submit.; 2019 [cited 17 Oct 2019]. <https://thinkchecksubmit.org/>.

25 Directory of Open Access Journals (DOAJ). Information for publishers [Internet]. The Directory [cited 17 Oct 2019]. <https://doaj.org>.

26 Laine C, Winker MA. Identifying predatory or pseudo-journals. Biochem Med (Zagreb). 2017 Jun 15;27(2):285–91. DOI: http://dx.doi.org/10.11613/BM.2017.031.

27 Basken P. Why Beall’s List died–and what it left unresolved about open access [Internet]. Chron Higher Educ. 12 Sep 2017 [updated 13 Sep 2017; cited 17 Oct 2019]. <https://www.chronicle.com/article/Why-Beall-s-List-Died-/241171>.

28 Beall J. What I learned from predatory publishers. Biochem Med (Zagreb). 2017 Jun 15;27(2):273–8. DOI: http://dx.doi.org/10.11613/BM.2017.029.

29 Malički M, Aalbersberg IJ, Bouter L, Ter Riet G. Journals’ instructions to authors: a cross-sectional study across scientific disciplines. PLoS One. 2019 Sep 5;14(9):e0222157. DOI: http://dx.doi.org/10.1371/journal.pone.0222157.

30 Manca A, Moher D, Cugusi L, Dvir Z, Deriu F. How predatory journals leak into PubMed. CMAJ. 2018 Sep 4;190(35):E1042–E1045. DOI: http://dx.doi.org/10.1503/cmaj.180154.

31 Cabells. Cabells scholarly analytics [Internet]. Beaumont, TX: Cabells International [cited 17 Oct 2019]. <https://www2.cabells.com/>.

32 Cofactor journal selector [Internet]. London, UK: Cofactor; 2019 [cited 17 Oct 2019]. <http://cofactorscience.com/journal-selector>.

33 Edanz Group. Edanz journal selector [Internet]. Fukuoka, Japan: Edanz Group Japan; 2019 [cited 17 Oct 2019]. <https://www.edanzediting.com/journal-selector>.

34 Elsevier. Journal finder [Internet]. Amsterdam, Netherlands: Elsevier; 2019 [cited 17 Oct 2019]. <https://journalfinder.elsevier.com/>.

35 IEEE. Publication recommender [Internet]. New York, NY: IEEE; 2018 [cited 17 Oct 2019]. <http://publication-recommender.ieee.org/home>.

36 Research Square. JournalGuide [Internet]. Durham, NC: Research Square; 2014 [cited 17 Oct 2019]. <https://www.journalguide.com/>.

37 Clarivate. Manuscript matcher [Internet]. Philadelphia, PA: Clarivate; 2019 [cited 17 Oct 2019]. <https://endnote.com/product-details/manuscript-matcher/>.

38 Clarivate Analytics. Master journal list beta [Internet]. Philadelphia, PA: Clarivate Analytics; 2019 [cited 17 Oct 2019]. <https://apps.clarivate.com/mjl-beta/home>.

39 University of Washington. DataLab publish & flourish [Internet]. Seattle, WA: The DataLab [updated 5 Sep 2017; cited 17 Oct 2019]. <http://flourishoa.org/>.

40 Schuemie MJ, Kors JA. JANE: suggesting journals, finding experts. Bioinformatics. 2008 Mar 1;24(5):727–8. DOI: http://dx.doi.org/10.1093/bioinformatics/btn006.

41 SpringerNature. Journal suggester [Internet]. New York, NY: SpringerNature; 2018 [cited 17 Oct 2019]. <https://journalsuggester.springer.com/>.

42 DesAutels SJ, Zutter MM, Williams AW, Giuse DA, Fox ZE, Youngblood M, Stead WW, Giuse NB. Integrating evidence to inform lab test selection into a knowledge management framework. Paper presented at: American Medical Informatics Association (AMIA) Clinical Informatics Conference; Scottsdale, AZ; May 2018.

43 DesAutels SJ, Fox ZE, Giuse DA, Williams AM, Kou QH, Weitkamp A, Patel NR, Giuse NB. Using best practices to extract, organize, and reuse embedded decision support content knowledge rules from mature clinical systems. AMIA Annu Symp Proc. 2017 Feb 10;2016:504–13. (Available from: <https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5333252/>. [cited 6 Feb 2020].)

44 Giuse, Nunzia B. (Center for Knowledge Management, Strategy & Innovation, Vanderbilt University Medical Center, Nashville, TN). Conversation with: Taneya Koonce and Annette Williams (Center for Knowledge Management, Strategy & Innovation, Vanderbilt University Medical Center, Nashville, TN). 7 Mar 2019.

45 Koonce TY, Blasingame MN, Williams AM, Zhao J, Fox ZE, Frakes ET, DesAutels SJ, Clark JD, Giuse DA, Giuse NB. Transparency in publishing: how to best inform the journal selection process. Presented at: MLA ’19, the 119th Medical Library Association Annual Meeting; Chicago, IL; 6 May 2019.

46 Lu Z, Wilbur WJ, McEntyre JR, Iskhakov A, Szilagyi L. Finding query suggestions for PubMed. AMIA Annu Symp Proc. 2009 Nov 14;2009:396–400. (Available from: <https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2815412>. [cited 6 Feb 2020].)

47 BMC. Find the right journal [Internet]. London, UK: BioMed Central [cited 17 Oct 2019]. <https://www.biomedcentral.com/getpublished/find-the-right-journal>.

48 Garfield E. The history and meaning of the journal impact factor. JAMA. 2006 Jan 4;295(1):90–3. DOI: http://dx.doi.org/10.1001/jama.295.1.90.

49 Roldan-Valadez E, Salazar-Ruiz SY, Ibarra-Contreras R, Rios C. Current concepts on bibliometrics: a brief review about impact factor, Eigenfactor score, CiteScore, SCImago journal rank, source-normalised impact per paper, H-index, and alternative metrics. Ir J Med Sci. 2019 Aug;188(3):939–51. DOI: http://dx.doi.org/10.1007/s11845-018-1936-5.

50 ORCID. Our mission [Internet]. ORCID [cited 17 Oct 2019]. <https://orcid.org/about/what-is-orcid/mission>.

51 National Center for Biotechnology Information. My NCBI help: my bibliography [Internet]. Bethesda, MD: The Center [updated May 2019; cited 17 Oct 2019]. <https://www.ncbi.nlm.nih.gov/books/NBK53595/>.

52 Zotero. Documentation [Internet]. Vienna, VA: Corporation for Digital Scholarship [updated 28 Aug 2018; cited 17 Oct 2019]. <https://www.zotero.org/support/start>.

53 Gasparyan AY, Nurmashev B, Yessirkepov M, Endovitskiy DA, Voronov AA, Kitas GD. Researcher and author profiles: opportunities, advantages, and limitations. J Korean Med Sci. 2017 Nov;32(11):1749–56. DOI: http://dx.doi.org/10.3346/jkms.2017.32.11.1749.

54 National Institutes of Health, Agency for Healthcare Research and Quality, Centers for Disease Control and Prevention. Requirement for ORCID iDs for individuals supported by research training, fellowship, research education, and career development awards beginning in FY 2020 [Internet]. Bethesda, MD: The Institutes, Office of Extramural Research; 10 Jul 2019 [cited 17 Oct 2019]. Notice no.: NOT-OD-19-109. <https://grants.nih.gov/grants/guide/notice-files/NOT-OD-19-109.html>.

55 National Center for Biotechnology Information. My NCBI help: SciENcv [Internet]. Bethesda, MD: The Center; 12 Aug 2013 [updated 26 Jun 2019; cited 17 Oct 2019]. <https://www.ncbi.nlm.nih.gov/books/NBK154494/>.

56 National Library of Medicine. List of all journals cited in PubMed [Internet]. Bethesda, MD: The Library [updated 3 May 2018; cited 17 Oct 2019]. <https://www.nlm.nih.gov/bsd/serfile_addedinfo.html>.

57 Norddeutscher Rundfunk (NDR), Westdeutscher Rundfunk (WDR), Süddeutsche Zeitung. More than 5000 German scientists have published papers in pseudo-scientific journals. Hamburg, Germany: Norddeutscher Rundfunk; 19 Jul 2018 [cited 17 Oct 2019]. <https://www.ndr.de/der_ndr/presse/More-than-5000-German-scientists-have-published-papers-in-pseudo-scientific-journals,fakescience178.html>.

58 Shamseer L, Moher D, Maduekwe O, Turner L, Barbour V, Burch R, Clark J, Galipeau J, Roberts J, Shea BJ. Potential predatory and legitimate biomedical journals: can you tell the difference? a cross-sectional comparison. BMC Med. 2017 Mar 16;15(1):28. DOI: http://dx.doi.org/10.1186/s12916-017-0785-9.

59 Beaubien S, Eckard M. Addressing faculty publishing concerns with open access journal quality indicators. J Librariansh Sch Commun. 2014 Apr 10;2(2):eP1133. DOI: http://dx.doi.org/10.7710/2162-3309.1133.

60 Shen C, Björk BC. ‘Predatory’ open access: a longitudinal study of article volumes and market characteristics. BMC Med. 2015 Oct 1;13:230. DOI: http://dx.doi.org/10.1186/s12916-015-0469-2.

61 Woolston C. Unravelling the mysteries of preprints and peer review. Nature. 18 Jun 2019. DOI: http://dx.doi.org/10.1038/d41586-019-01947-4. Correction: Nature. 24 Jun 2019.


Taneya Y. Koonce

Taneya Y. Koonce, MSLS, MPH, taneya.koonce@vumc.org, https://orcid.org/0000-0002-4014-467X, Associate Director for Research, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Mallory N. Blasingame, MA, MSIS, mallory.n.blasingame@vumc.org, https://orcid.org/0000-0003-0356-9481, Information Scientist, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Jerry Zhao, MS, MLIS, jerry.zhao@vumc.org, Senior Application Developer, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Annette M. Williams, MLS, annette.williams@vumc.org, https://orcid.org/0000-0002-2526-3857, Senior Information Scientist, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Jing Su, MD, MS, jing.su@vumc.org, Information Scientist, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Spencer J. DesAutels, MLIS, spencer.desautels@vumc.org, https://orcid.org/0000-0002-6120-2496, Information Scientist, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Dario A. Giuse, Dr. Ing., MS, FACMI, dario.giuse@vumc.org, Associate Professor, Department of Biomedical Informatics, Vanderbilt University School of Medicine and Vanderbilt University Medical Center, Nashville, TN

John D. Clark, MS, john.clark@vumc.org, Senior Application Developer, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Zachary E. Fox, MSIS, zachary.e.fox@vumc.org, Associate Director for Information Services, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN

Nunzia Bettinsoli Giuse, MD, MLS, FACMI, FMLA, nunzia.giuse@vanderbilt.edu, https://orcid.org/0000-0002-7644-9803, Professor of Biomedical Informatics and Professor of Medicine; Vice President for Knowledge Management; and Director, Center for Knowledge Management, Strategy and Innovation, Vanderbilt University Medical Center, Nashville, TN



Articles in this journal are licensed under a Creative Commons Attribution 4.0 International License.

This journal is published by the University Library System of the University of Pittsburgh as part of its D-Scribe Digital Publishing Program and is cosponsored by the University of Pittsburgh Press.


Journal of the Medical Library Association, VOLUME 108, NUMBER 2, April 2020