Jocelyn Huang Schiller, MD, Gary L. Beck Dallaghan, PhD, Terry Kind, MD, MPH, Heather McLauchlan, MD, Joseph Gigante, MD, Sherilyn Smith, MD
doi: http://dx.doi.org/10.5195/jmla.2017.134
Received December 2016; Accepted April 2017
Multi-institutional research increases the generalizability of research findings. However, little is known about characteristics of collaborations across institutions in health sciences education research. Using a systematic review process, the authors describe characteristics of published, peer-reviewed multi-institutional health sciences education research to inform educators who are considering such projects.
Two medical librarians searched MEDLINE, the Education Resources Information Center (ERIC), EMBASE, and CINAHL databases for English-language studies published between 2004 and 2013 using keyword terms related to multi-institutional systems and health sciences education. Teams of two authors reviewed each study and resolved coding discrepancies through consensus. Collected data points included funding, research network involvement, author characteristics, learner characteristics, and research methods. Data were analyzed using descriptive statistics.
One hundred eighteen of 310 articles met inclusion criteria. Sixty-three (53%) studies received external and/or internal financial support (87% listed external funding, 37% listed internal funding). Forty-five funded studies involved graduate medical education programs. Twenty (17%) studies involved a research or education network. Eighty-five (89%) publications listed an author with a master’s degree or doctoral degree. Ninety-two (78%) studies were descriptive, whereas 26 studies (22%) were experimental. The reported study outcomes were changes in student attitude (38%; n=44), knowledge (26%; n=31), or skill assessment (23%; n=27), as well as patient outcomes (9%; n=11).
Multi-institutional descriptive studies reporting knowledge or attitude outcomes are frequently published. Our findings indicate that funding resources are not essential to successfully undertake multi-institutional projects. Funded studies were more likely to originate from graduate medical or nursing programs.
Health sciences educators seek evidence-based teaching approaches to optimize learning outcomes. Stakeholders in education, however, have maintained that the quality of health sciences education research is inadequate [2, 3]. In response to this critique, journal editors and education researchers expect greater methodological rigor, larger sample sizes, and more meaningful outcomes [4–8]. Applicability is increased when studies are generalizable beyond a given teacher, learner, or setting. In health care, multi-institutional research is the cornerstone of clinical trials of treatment and diagnostic innovations for patient care. Because of the larger sample size and more diverse population in multi-institutional clinical research, results may be generalized to a broader population. Therefore, to enhance the generalizability of health sciences education studies and broaden impact, individuals and institutions should also collaborate and conduct multi-institutional health sciences education research [9, 10].
A few publications provide general tips for conducting collaborative research in medical education and include advice on planning, implementation, and dissemination of outcomes [9, 11, 12]. Although helpful, these suggestions are based on the authors’ experiences and are not linked to publication data. In addition, to our knowledge, the types of studies that are most amenable to multi-institutional education research, characteristics of the authors, and the level of support needed for multi-institutional health sciences education research have not been described.
In this systematic review, our initial objective was to collect data on published multi-institutional medical education research to identify common characteristics of these collaborative projects. After consultation with library experts, we used additional search terms, which broadened the scope to capture other health sciences professions publications. Through this review, we sought to inform educators about attributes of published peer-reviewed, multi-institutional health sciences education research as they undertake such projects.
This review was planned and conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.
We included English-language empirical studies from 2004 to 2013 with participants who were in undergraduate or graduate health sciences training programs. Studies were included if they reported educational outcomes (i.e., attitudes, knowledge, or skills) or changes in patient outcomes and were conducted at more than one institution. Publications were excluded if they solely involved faculty development or continuing medical education for practicing professionals. Publications were also excluded if they involved a single training program, even if the trainees rotated at multiple hospital systems.
MEDLINE, the Education Resources Information Center (ERIC), EMBASE, and CINAHL databases were searched.
We searched for studies using search terms and key words related to (1) multi-institutional; (2) medical education, medical students, graduate medical education, allied health, health occupations, or nursing students; and (3) teaching, education, curriculum, competency, or simulation. Two research librarians independently developed the search criteria with similar results. The search was conducted at Vanderbilt University. The supplemental appendix provides the complete search strategies for each database.
“Health science education research” was defined as any original research study pertaining to health professional students or postgraduate residents and fellows in medicine, nursing, dentistry, or pharmacy. We defined “multi-institutional” as any project that included participants from more than one school or institution. “Original research” was defined as an educational intervention or trial, curriculum evaluation with subjective or objective outcomes, or evaluation of an educational instrument or tool. We included studies that were qualitative and/or quantitative with descriptive and/or experimental research methodologies.
Initially, 469 records were identified (Figure 1). Duplicates were removed, and results were limited to the years 2004–2013, yielding 310 remaining records. These 310 studies were divided among pairs of researchers who independently reviewed the studies’ abstracts to determine whether the publication met our definition of multi-institutional health sciences educational research. This resulted in 131 studies for full review.
Figure 1 Flowchart of literature search and study selection for multi-institutional health sciences education research
We developed and piloted a standardized data abstraction form in Microsoft Excel to document the number of learners in the study, learner level of training, learner field of study, number of institutions of authors and learners, number of authors, author degrees, institutional nationality of learners, external and internal research funding, research methods, and research or educational network involvement. In addition, author affiliations were examined to identify whether a researcher in a department of education participated in the study. To pilot the data abstraction form, each investigator reviewed five publications. A conference call was held to reach consensus on the results as well as to better define categories for consistency of data extraction.
Three pairs of authors then reviewed the remaining 131 articles (divided per pair). Discrepancies in coding were resolved by each pair of authors or brought to the larger research team for consensus. After full review, an additional 13 articles did not meet the initial inclusion criteria, leaving 118 publications for final analysis. One pair of authors reviewed all publications to determine study type (experimental or descriptive) and outcomes. If multiple outcomes were examined, studies were categorized according to the “highest” domain of educational activity, assessed using a modification of Kirkpatrick’s model: (1) learner reaction and attitude, (2) acquisition of knowledge, (3) demonstration of skill, and (4) changes in patient care [15, 16].
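As an illustration only, the "highest domain" rule described above can be expressed as a small lookup: each reported outcome maps to a level of the modified Kirkpatrick model, and the study is coded at the maximum level it reports. The function and level names below are our own hypothetical sketch, not the authors' actual coding instrument.

```python
# Hypothetical sketch of coding a study by its "highest" Kirkpatrick domain.
# Levels follow the modification described in the text [15, 16].
KIRKPATRICK_LEVELS = {
    "attitude": 1,   # learner reaction and attitude
    "knowledge": 2,  # acquisition of knowledge
    "skill": 3,      # demonstration of skill
    "patient": 4,    # changes in patient care
}

def highest_domain(outcomes):
    """Return the highest Kirkpatrick level among a study's reported outcomes."""
    return max(KIRKPATRICK_LEVELS[o] for o in outcomes)

# A study reporting both knowledge and skill outcomes is coded at level 3.
print(highest_domain(["knowledge", "skill"]))  # 3
```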
Descriptive and correlational statistics were used to summarize the findings of the systematic review using IBM SPSS, version 22, software.
The average number of learners across studies was 379 (median 188; range 4–4,300) from an average of 8.4 institutions (median 5; range 2–73). Most studies included participants in graduate medical education programs (60%; n=71), followed by medical students (38%; n=45) and nursing students (11%; n=13). Sixteen percent (n=19) of studies included multiple levels of learners, but only 2 were interdisciplinary, with the remainder involving a combination of medical students and residents or undergraduate and graduate nursing students. Of the 89 studies involving medical students or residents, 44% (n=39) involved surgical and 22% (n=20) involved internal medicine departments.
Most learners were from institutions in the United States (69%; n=82), followed by Canada (7%; n=8). The publications in this analysis included learners from 26 countries; 8% (n=9) included learners from more than 1 country.
The median number of authors was 7 (range 1–20). Of studies published in journals noting author degrees (81%; n=96), the majority listed an author with a master’s degree or doctorate (PhD) (89%; n=85). A minority of publications noted an author in a Department of Medical Education (18%; n=21) or Department of Biostatistics or Epidemiology (25%; n=29). Seventeen percent (n=20) of studies acknowledged an affiliation with a network, association, registry, or study group.
Study types were heterogeneous. Twenty-six (22%) studies used an experimental design, and 16 of these studies randomized learners to different conditions (14% of all studies). Most studies were descriptive (78%; n=92). The study outcomes reported were changes in student attitude (38%; n=44), knowledge (26%; n=31), and skill assessment (23%; n=27), as well as patient outcomes (9%; n=11). Figure 2 shows the distribution of study types and outcome measures of the reviewed publications. Most studies (82%; n=97) used quantitative methods, with the remainder using qualitative methods (10%; n=12) or a mixed methods approach (8%; n=9).
Figure 2 Distribution of study type and outcome measures of the reviewed publications (%)
Fifty-three percent (n=63) of publications acknowledged funding, with a steady increase in the frequency of funding over time (Figure 3). As the number of multi-institutional health sciences education publications increased, there was a correlating rise in the number of funded studies (r=0.919; p<0.001). Of these funded projects, 87% (n=55) received external funding, and 37% (n=23) received internal funding. Of the funded studies, 71% (n=45) were from graduate medical education programs, 41% (n=26) of which were conducted in surgical specialties and 36% (n=22) of which were conducted in primary care specialties (internal medicine, pediatrics, psychiatry, family medicine).
Figure 3 Funding trends of the reviewed publications
We conducted a structured review of multi-institutional undergraduate and graduate health sciences literature from the past decade to identify characteristics of collaborative projects. Our results indicate that multi-institutional educational research can be successfully carried out and published with limited infrastructure support, but they also point out important opportunities for future work. In our analysis, just over half the studies reported funding, and fewer than 20% reported involvement in a network or collaborative organization.
Funding is thought to enhance health sciences education research by facilitating support of rigorous study designs through multi-institutional collaboration. Multi-institutional collaboration necessitates deliberate, prospective research designs in order to compare interventions across settings. Although rigor based on Reed and colleagues’ recommendations can lead to improved funding rates, our findings indicate that studies using a variety of study methodologies were also funded. Fifty-three percent of multi-institutional health sciences education research studies in our review reported funding, similar to rates reported in prior studies [18–20].
We found that 45 (71%) funded studies involved residency training programs, 26 of which were from surgical subspecialties. This might be the result of efforts by the American College of Surgeons to support regional simulation-based education, which helped residency training programs undertake multi-institutional research related to skills development. With funding and technical training as common bridges across institutions, residency programs were more likely to be able to support multi-institutional studies due to common requirements set by the Accreditation Council for Graduate Medical Education. The American Association of Colleges of Nursing has similar “essentials” guidelines for all nursing programs, which might explain why multiple nursing studies appeared in our search. Because medical student education can differ starkly across institutions in an era of curricular innovation, the lack of a congruent disciplinary focus may limit the ability to conduct a rigorous, multi-institutional research study.
Transforming educational activities into high-quality scholarship that advances the field requires methodological skills and resources [9, 11, 25, 26], but most health sciences faculty are not trained in educational or other social scientific research . Investigators with educational research expertise can provide valuable resources to support educational scholarship. The majority of publications included in our study listed an author with a master’s degree or PhD, suggesting advanced training in research methodology. Of funded studies, 64% had authors with such degrees. Due to differences in the publication style of various journals, it was unclear how many authors were from a Department of Medical Education or Department of Biostatistics or Epidemiology or were involved with a research network. Because some journals did not note the credentials of their authors or affiliations, it was possible that the true numbers of authors with advanced degrees or in these departments were higher. Because lack of research expertise has been identified as a major barrier to health sciences education research [26, 28, 29], multi-institutional collaborations and research networks may provide the support needed to overcome these obstacles.
Faculty undertaking future educational research should identify potential resources in their institutions and effectively leverage national programs that support skills development and collaboration [30, 31]. National organizations should continue to invest in infrastructure to support research networks and anticipate the financial needs of their ongoing maintenance and growth. We did find an increase in the number of studies reporting funding over the ten-year span of this project, perhaps reflecting prior calls for increased funding for health sciences education research [18, 19, 33, 34].
A minority of the reviewed studies employed experimental methods, consistent with previous findings [19, 35], and fewer than 10% of studies measured patient outcomes despite repeated calls for this focus. Accountability, safety, and quality are pressing needs in health care. Health sciences education research, and more specifically medical education research, must develop rigorous, generalizable outcome measures that guide curricular change to improve the health of patients. Guidance exists for educational researchers to address these quality gaps and can provide a foundation for designing future studies. Multi-institutional studies, while resource intensive, can add to this effort by improving the generalizability of findings. In addition, collaborative research may help health sciences education researchers and patient outcomes researchers leverage one another’s skills.
Several limitations of this study should be considered. Although our analysis was limited to a ten-year period and to studies available in English, it covered the most recent available decade and included studies from around the world. In using “AND” as well as “health occupations” in the search strategy, we may have excluded some studies in professions outside of medicine. However, as our initial objective was to study medical education, we believe the inclusion of other health sciences gives this study broader appeal. Future investigations should specifically include other health professions by name to draw an even more comprehensive picture. We defined success as publication and described characteristics of published studies but did not include or describe characteristics of unpublished multi-institutional studies. We also did not examine single-institution studies for comparison.
In this systematic review, we describe the current state of multi-institutional health sciences education publications to assist educators who are planning to undertake such collaborative projects. Collaboration can assist in planning for resources, developing research networks, and creating infrastructure whether regionally, nationally, or internationally within or across specialty societies. Most study teams collaborated with a team member with a PhD or master’s degree, and more than half had funding for their research. Our results indicate that multi-institutional educational research can be successfully carried out and published with limited infrastructure support. Financial and educational resources to foster and support collaborative educational research may be helpful to promote future high-quality multi-institutional medical education research.
Appendix Complete search strategies for each database
The authors thank Katy Justiss, librarian at Vanderbilt Medical School, and Alissa Fial, education and research services librarian at the McGoogan Library of Medicine, University of Nebraska Medical Center, for their assistance with the literature search. The authors also thank the University of Nebraska Medical Center Department of Pediatrics Writing Group for their thoughtful review and recommendations for this manuscript.
The abstract was presented at the Council on Medical Student Education in Pediatrics Annual Meeting; New Orleans, Louisiana; 12 March 2015.
1. Archer J, McManus C, Woolf K, Monrouxe L, Illing J, Bullock A, Roberts T. Without proper research funding, how can medical education be evidence based? BMJ. 2015 Jun 26;350:h3445.
2. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007 Sep 5;298(9):1002–9.
3. Broome ME, Ironside PM, McNelis AM. Research in nursing education: state of the science. J Nurs Educ. 2012 Sep;51(9):521–4.
4. Dauphinee WD, Wood-Dauphinee S. The need for evidence in medical education: the development of best evidence medical education as an opportunity to inform, guide, and sustain medical education research. Acad Med. 2004 Oct;79(10):925–30.
5. Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med. 2004 Oct;79(10):955–60.
6. Wartman SA. Research in medical education: the challenge for the next decade. Acad Med. 1994 Aug;69(8):608–14.
7. Carney PA, Nierenberg DW, Pipas CF, Brooks W, Stukel TA, Keller AM. Educational epidemiology: applying population-based design and analytic approaches to study medical education. JAMA. 2004 Sep 1;292(9):1044–50.
8. Shea JA, Arnold L, Mann KV. A RIME perspective on the quality and relevance of current and future medical education research. Acad Med. 2004 Oct;79(10):931–8.
9. Society of Directors of Research in Medical Education. Guidelines for multi-institutional/collaborative research. Acad Med. 2015 Mar;90(3):394.
10. O’Sullivan PS, Stoddard HA, Kalishman S. Collaborative research in medical education: a discussion of theory and practice. Med Educ. 2010 Dec;44(12):1175–84.
11. Huggett KN, Gusic ME, Greenberg R, Ketterer JM. Twelve tips for conducting collaborative research in medical education. Med Teach. 2011;33(9):713–8.
12. Beischel KP, Hart J, Turkelson SL. Conducting a multisite education research project: strategies to overcome the barriers to achieve the benefits. Nurse Educ. 2016 Jul–Aug;41(4):204–7.
13. North S, Giddens J. Lessons learned in multisite, nursing education research while studying a technology learning innovation. J Nurs Educ. 2013 Oct;52(10):567–73.
14. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009 Aug 18;151(4):W65–94.
15. Kirkpatrick DL. Evaluation of training. In: Craig RL, Bittel LF, eds. Training and development handbook. New York, NY: McGraw-Hill; 1967. p. 87–112.
16. Onyura B, Baker L, Cameron B, Friesen F, Leslie K. Evidence for curricular and instructional design approaches in undergraduate medical education: an umbrella review. Med Teach. 2016;38(2):150–61.
17. Gruppen LD. Improving medical education research. Teach Learn Med. 2007 Fall;19(4):331–5.
18. Reed DA, Kern DE, Levine RB, Wright SM. Costs and funding for published medical education research. JAMA. 2005 Sep 7;294(9):1052–7.
19. Carline JD. Funding medical education research: opportunities and issues. Acad Med. 2004 Oct;79(10):918–24.
20. Baernstein A, Liss HK, Carney PA, Elmore JG. Trends in study methods used in undergraduate medical education research, 1969–2007. JAMA. 2007 Sep 5;298(9):1038–45.
21. Sachdeva AK, Pellegrini CA, Johnson KA. Support for simulation-based surgical education through American College of Surgeons–accredited education institutes. World J Surg. 2008 Feb;32(2):196–207.
22. Nielsen-Pincus M, Morse WC, Force JE, Wulfhorst JD. Bridges and barriers to developing and conducting interdisciplinary graduate-student team research. Ecol Soc. 2007;12(2):1–14.
23. Accreditation Council for Graduate Medical Education. Common program requirements [Internet]. The Council; 2007 [cited 21 Dec 2016]. <http://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements>.
24. American Association of Colleges of Nursing. Curriculum guidelines [Internet]. The Association [cited 5 Mar 2017]. <http://www.aacn.nche.edu/faculty/curriculum-guidelines>.
25. Gruppen LD. The department of medical education at the University of Michigan Medical School: a case study in medical education research productivity. Acad Med. 2004 Oct;79(10):997–1002.
26. Oermann MH, Hallmark BF, Haus C, Kardong-Edgren SE, McColgan JK, Rogers N. Conducting multisite research studies in nursing education: brief practice of CPR skills as an exemplar. J Nurs Educ. 2012 Jan;51(1):23–8.
27. Coates WC, Love JN, Santen SA, Hobgood CD, Mavis BE, Maggio LA, Farrell SE. Faculty development in medical education research: a cooperative model. Acad Med. 2010 May;85(5):829–36.
28. Yarris LM, Juve AM, Artino AR Jr, Sullivan GM, Rougas S, Joyce B, Eva K. Expertise, time, money, mentoring, and reward: systemic barriers that limit education researcher productivity: proceedings from the AAMC GEA workshop. J Grad Med Educ. 2014 Sep;6(3):430–6.
29. Smith S, Kind T, Beck G, Schiller J, McLauchlan H, Harris M, Gigante J. Further dissemination of medical education projects after presentation at a pediatric national meeting (1998–2008). Teach Learn Med. 2014;26(1):3–8.
30. Thompson BM, Searle NS, Gruppen LD, Hatem CJ, Nelson E. A national survey of medical education fellowships. Med Educ Online. 2011;16(1):5642. DOI: http://dx.doi.org/10.3402/meo.v16i0.5642.
31. Gruppen LD, Yoder E, Frye A, Perkowski LC, Mavis B. Supporting medical education research quality: the Association of American Medical Colleges’ Medical Education Research Certificate Program. Acad Med. 2011 Jan;86(1):122–6.
32. Schwartz A, Young R, Hicks PJ. Medical education practice-based research networks: facilitating collaborative research. Med Teach. 2016;38(1):64–74.
33. Price EG, Beach MC, Gary TL, Robinson KA, Gozu A, Palacio A, Smarth C, Jenckes M, Feuerstein C, Bass EB, Powe NR, Cooper LA. A systematic review of the methodological rigor of studies evaluating cultural competence training of health professionals. Acad Med. 2005 Jun;80(6):578–86.
34. Wartman SA. Revisiting the idea of a national center for health professions education research. Acad Med. 2004 Oct;79(10):990–6.
35. Todres M, Stephenson A, Jones R. Medical education research remains the poor relation. BMJ. 2007 Aug 18;335(7615):333–5.
36. Prystowsky JB, Bordage G. An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ. 2001 Apr;35(4):331–6.
37. Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ. 2007 Aug;41(8):737–45.
Jocelyn Huang Schiller, MD, email@example.com, Associate Professor, Department of Pediatrics and Communicable Diseases, University of Michigan Medical School, Ann Arbor, MI
Gary L. Beck Dallaghan, PhD (corresponding author), firstname.lastname@example.org, Assistant Dean for Medical Education, University of Nebraska College of Medicine, Omaha, NE
Terry Kind, MD, MPH, email@example.com, Associate Professor, Department of Pediatrics, Children’s National and George Washington University School of Medicine, Washington, DC
Heather McLauchlan, MD, firstname.lastname@example.org, Associate Professor, Department of Pediatrics, University of Illinois College of Medicine at Peoria, Peoria, IL
Joseph Gigante, MD, email@example.com, Associate Professor, Department of Pediatrics, Vanderbilt University School of Medicine, Nashville, TN
Sherilyn Smith, MD, firstname.lastname@example.org, Professor, Department of Pediatrics, University of Washington, Seattle, WA
Articles in this journal are licensed under a Creative Commons Attribution 4.0 International License.
This journal is published by the University Library System of the University of Pittsburgh as part of its D-Scribe Digital Publishing Program and is cosponsored by the University of Pittsburgh Press.
Journal of the Medical Library Association, VOLUME 105, NUMBER 4, October 2017