Case Report

Video killed the multiple-choice quiz: capturing pharmacy students' literature searching skills using a screencast video assignment


Emily P. Jones1, Christopher S. Wisniewski2


doi: http://dx.doi.org/10.5195/jmla.2021.1270

Volume 109, Number 4: 672-676
Received April 2021; Accepted July 2021

ABSTRACT

Background:

In a flipped, required first-year drug information course, students were taught the systematic approach to answering drug information questions, commonly utilized resources, and literature searching. As co-coordinator, a librarian taught three weeks of the course focused on mobile applications, development of literature searching skills, and practicing in PubMed. Course assignments were redesigned in 2019 based on assessment best practices and replaced weekly multiple-choice quizzes used in prior iterations of the course.

Case Presentation:

Following two weeks of literature searching instruction, students were assigned a drug information question that would serve as the impetus for the search they conducted. Students (n=66) had one week to practice and record a screencast video of their search in PubMed. Students narrated their video with an explanation of the actions being performed and were assessed using a twenty-point rubric created by the course coordinator and librarian. The librarian also created general feedback videos for each question by recording screencasts while performing the literature searches and clarifying troublesome aspects for students. The librarian spent about twenty-four hours grading and six hours writing scripts, recording, and editing feedback videos.

Conclusion:

Most students performed well on the assignment, and few experienced technical difficulties. Instructors will use this assignment and feedback method in the future. Screencast videos proved an innovative way to assess student knowledge and to provide feedback on literature searching assignments. This method is transferable to any medical education setting and could be used across all health professions to improve information literacy skills.

Keywords: screencast videos; competency-based assignment; pharmacy students; drug information.

BACKGROUND

Adult learning theory and instructional design best practices have long established that effective assessment techniques are linked to desired learning outcomes [1-7]. Additionally, assessments should be constructed so that they actually measure student learning of the material. Backward design is a particularly effective instructional design technique in which one begins with the end goal and works backward to develop appropriate assessment methodologies, learning activities, and course content [8]. This approach helps ensure that planned learning activities and assessments align with desired learning outcomes. Despite the guidance of these frameworks and learning theories, multiple-choice question assessments remain common in higher education, particularly in medical education, even when other methodologies are better suited to testing skill acquisition or competencies [9]. As such, real-life examples of educational activities developed using backward design may prove helpful to educators seeking to create assignments with this technique. This report presents one example of how librarians can move beyond multiple-choice question assessments to develop and implement innovative assignments that better evaluate student learning.

The educational activity was first implemented in Introduction to Drug Information, a course required in the first year of a traditional four-year Doctor of Pharmacy curriculum at the Medical University of South Carolina in fall 2019. The focus of this course was to teach the systematic approach to answering drug information questions [10] and to simulate real-world experiences for students to practice utilizing various tertiary and secondary resources. Course content was divided into four main components: identifying genuine need and categorization (i.e., designating questions to predefined drug information categories [11] like adverse effects, drug interactions, and pregnancy/lactation), tertiary resources, secondary resources, and analysis and synthesis.

The course was delivered in a flipped format: students watched instructor-created videos and engaged with the material before attending a live class session focused on active learning and practical application of previously taught content. Student grades comprised six components (Table 1), and final grades were reported on a pass-fail scale; a passing grade required an overall course score of 70% or greater (a worked illustration of this weighting follows Table 1). The librarian was responsible for teaching approximately one-third of course content (specifically, weeks 6, 8, and 9), covering mobile applications, secondary databases, and literature searching (Table 1).

Table 1

Course schedule

Systematic approach component and weekly schedule

Identifying genuine need and categorization
  Week 1—Background & ultimate question development & categorization
  Week 2—Systematic approach quiz due*

Tertiary
  Week 3—Tertiary resources characteristics & utilization
  Week 4—Online tertiary compendia
  Week 5—Internet resources
  Week 6—Mobile applications
  Week 7—Tertiary resources quiz due*

Secondary
  Week 8—Literature searching basics
  Week 9—Advanced literature searching techniques & practice
  Week 10—Literature searching video due*

Analysis and synthesis
  Week 11—Question #1: Drug information center case
  Week 12—Question #2: Community case
  Week 13—Question #3: Hospital case
  Week 14—Question #4: Clinical case
  Week 15—Drug information question response due*

*With the exception of the drug information question response (50%), all assignments listed, plus participation and attendance, were each worth 10%.
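
To make the weighting concrete, below is a minimal sketch of the grading arithmetic described above. It is our illustration, not course material: the component names and scores are invented, and only the weights (10% each for five components, 50% for the drug information question response) and the 70% pass threshold come from the course description.

# Hypothetical sketch of the grading scheme in Table 1. Component scores
# (0-100) are invented for demonstration purposes.
WEIGHTS = {
    "systematic_approach_quiz": 0.10,
    "tertiary_resources_quiz": 0.10,
    "literature_searching_video": 0.10,
    "participation": 0.10,
    "attendance": 0.10,
    "drug_information_response": 0.50,
}

scores = {
    "systematic_approach_quiz": 90,
    "tertiary_resources_quiz": 85,
    "literature_searching_video": 87,  # e.g., 17.35/20 is about 86.75%
    "participation": 100,
    "attendance": 100,
    "drug_information_response": 78,
}

# Weighted sum of all six components, then the pass-fail cut at 70%.
final = sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)
print(f"Final grade: {final:.1f}% -> {'Pass' if final >= 70 else 'Fail'}")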

The librarian and course coordinator revised the summative assessment of literature searching ability, replacing the multiple-choice quizzes used in previous years with a skills-based competency. This change was prompted by reflection on how to better assess student knowledge of literature searching skills and was guided by a backward design approach. The objective of this report is to describe the development of this innovative assessment methodology and to provide information on student performance and feedback for the competency-based assignment.

CASE PRESENTATION

During two weeks of instruction covering literature searching using PubMed, students were taught a broad range of topics, including controlled vocabulary and text terms, Boolean operators, expanding or narrowing search results, and phrase searching and truncation. Because this was an introductory course, the goal of these sessions was to give students a broad overview of literature searching mechanics that they could refine toward expertise over the remainder of their education.
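
As a concrete illustration of these mechanics, the short sketch below runs a Boolean PubMed search through NCBI's public E-utilities esearch endpoint using the Python requests library. The query itself is our hypothetical example (loosely based on the warfarin feeding-tube question shown in Table 2 below), not one of the course's assigned searches; it combines a MeSH term, text terms, quoted phrase searching, and the AND/OR operators taught in class.

# Minimal sketch (not from the course): a Boolean PubMed search via NCBI's
# E-utilities. The query mixes controlled vocabulary ([MeSH Terms]) with
# text terms ([Title/Abstract]), phrase searching, and Boolean AND/OR.
import requests

query = (
    '("warfarin"[MeSH Terms] OR warfarin[Title/Abstract]) '
    'AND ("enteral nutrition"[MeSH Terms] OR "feeding tube"[Title/Abstract])'
)

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": query, "retmode": "json", "retmax": 10},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["esearchresult"]

# Total count of matching records and the first few PubMed IDs (PMIDs).
print(f"{result['count']} records found")
print("Sample PMIDs:", result["idlist"])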

Following the introduction of these concepts, each student was assigned a drug information question that served as the impetus for the literature searching assignment. Eight questions in total, two from each of four categories, were divided among the students. The categories and an example question from each are presented in Table 2.

Table 2

Drug information categories and questions used in assignment

Compounding: Does cinacalcet come as a suspension? If not, can you make it?
Drug-lab interaction: What drugs can cause a false positive benzodiazepine result on a urine drug screen?
Herbal product: Can kava be used to treat insomnia?
Method of administration: Can warfarin be given through a feeding tube?

A twenty-point rubric for this assignment was created to encourage students to develop, but not master, these skills (Appendix A). The rubric was peer reviewed by two other librarians for clarity. The rubric was then provided to students in the assignment instructions and integrated into the learning management system (LMS) gradebook.

Students (n=66) had one week to practice and record a screencast video of their finalized literature search in PubMed utilizing an institutional subscription to Panopto, an online video platform. Assignment instructions (Appendix B) indicated the video should be five minutes or less and that students should narrate their video with an explanation of the actions being performed. Students were provided the grading rubric prior to completing the assignment.

All students completed the literature searching video assignment, and most performed well (mean score 17.35/20, or 86.75%; IQR 80%-90%). Few experienced technical difficulties, and those who did were given additional information on how to record or upload their videos.

The librarian also created general feedback videos for each drug information question (eight total) utilizing Panopto. Videos depicted the librarian performing searches while following the rubric and clarifying troublesome aspects for students. Feedback videos were available to students via the LMS after grades were released. In total, the librarian spent about twenty-four hours grading and six hours writing scripts, recording, and editing feedback videos.

DISCUSSION

Overall, this assignment was well received: in a course evaluation survey, 91% of students stated that they felt their literature searching skills had improved, a notable increase from 83% the prior year. Instructors had anticipated more technical issues with completing the assignment than actually occurred. To the best of the authors' knowledge, this was the first assignment in the curriculum that required students to record a screencast video. Despite this lack of prior exposure, students required minimal instruction on how to create their screencast videos using Panopto or how to upload them to the LMS; these instructions were provided in writing in the assignment handout (Appendix B). The few issues that did arise were easily resolved.

While librarian-created screencasts to teach database searching are well established in the literature, particularly in the form of tutorial videos [12-20], literature on librarians' use of screencast assignments to assess student learning is sparse. The only publication identified is by Kuban and Mulligan [21], who described a database searching screencast assignment in a journalism research course taught mainly to first-year undergraduates. The focus of their assignment was to teach introductory information literacy skills. Their rubric consisted of five elements, only one of which was specific to database searching mechanics; the others focused on background information about the database, when and why to search it, exporting results, and the credibility of an identified source.

To the best of the authors' knowledge, there are no publications on student-created screencasts as literature searching assignments in any health professions educational setting. This screencast method of assessment is therefore novel in health care education and can easily be used to evaluate the literature searching abilities of students in any discipline. While similar assignments may have existed in the past, students can now readily create screencasts using freely available software, institutionally subscribed resources, or their personal smartphones or tablets. Screencast assignments are documented in the literature of other disciplines such as accounting [22-24] and education [25, 26]. Faculty use of screencasts to deliver feedback on assignments is also demonstrated in the literature and was perceived by students in the published studies as more effective than, and preferable to, written feedback [27-30].

Utilizing this skills-based competency instead of multiple-choice quizzes did place a moderate time burden on the librarian, most of it devoted to grading student videos. This was the first year a competency-based assignment was implemented, and grading may take less time in future iterations as the librarian becomes more familiar with the process.

Instructors felt that implementing this assignment was well worth the time investment compared with the multiple-choice quizzes used in previous iterations of the course, for several reasons. First, while students were instructed to work individually on the multiple-choice quizzes used in prior years, there was no guarantee that they did so. The screencast assignment ensured that each student completed the work individually and provided instructors with more useful information about student learning: the videos allowed instructors to see exactly where students struggled, whether in the actions performed or in the narration of the video. In contrast, student performance on multiple-choice quizzes can be misleading; students can guess correctly on questions to which they do not actually know the answer, providing inaccurate feedback about their knowledge. The screencast assignment ensured that students' knowledge of literature searching was assessed on their ability to perform a search, not on rote memorization or guesswork.

Writing scripts for the feedback videos and then recording and editing them also required significant time. The librarian was very familiar with this process, having produced numerous videos using Panopto as part of the course's flipped classroom design in the three prior years. It was important to the instructors to provide students with clear guidance on this assignment, and both educators felt that seeing and hearing the explanations behind the search actions would be more effective than written feedback alone. Useful feedback was especially important because the final assignment, a written drug information question response, applied to the same case used for the literature searching assignment. In keeping with scaffolding learning theory [31], students continued with their assigned drug information question and had to draw on skills learned throughout the semester to find applicable information and provide a written response.

After grades were released, instructors noted sections of the rubric that students reported as unclear. For example, students thought they should receive full credit for merely showing the use of Boolean operators in the search, not understanding that they needed to use the operators optimally to earn full points. In addition to clarifying how advanced searching is defined, an incentive for keeping the video under five minutes was added by removing one point from the credit for article submission. The initial rubric, used in fall 2019, can be found in Appendix A; the revised version, used in fall 2020 and after, is in Appendix B.

In conclusion, screencast videos proved an innovative way to assess student knowledge and to provide feedback on literature searching assignments. Instructors feel this competency-based assignment was a better gauge of student knowledge than multiple-choice quizzes. Additionally, students received individualized written feedback via the rubric as well as general feedback, presented visually and auditorily, via librarian-created screencasts. Delivery of feedback through screencasts may have been more meaningful to students because it was presented in the same manner in which they completed the assignment. Instructors will continue to use this method of assignment and feedback, with minor changes to the rubric planned before the next course offering to reduce student confusion. This method of assessing literature searching skills can readily be extended to the education of students training in other medical fields.

DATA AVAILABILITY STATEMENT

There are no data associated with this article.

SUPPLEMENTAL FILES

Appendix A. Literature Searching Video Assignment Fall 2019

Appendix B. Literature Searching Video Assignment Fall 2020

REFERENCES

1. Biggs JB. Aligning teaching and assessing to course objectives. Teaching and Learning in Higher Education: New Trends and Innovations. 2003 Apr.

2. Branch RM. Design. In: Instructional design: the ADDIE approach. Boston, MA: Springer; 2009. p. 58–81.

3. Crespo RM, Najjar J, Derntl M, Leony D, Neumann S, Oberhuemer P, Totschnig M, Simon B, Gutierrez I, Kloos CD. Aligning assessment with learning outcomes in outcome-based education. Presented at: IEEE EDUCON 2010 Conference; Madrid, Spain; April 14–16, 2010. p. 1239–46.

4. Sewagegn AA. Learning objective and assessment linkage: its contribution to meaningful student learning. Univers J Edu Res. 2020 Nov;8(11):5044–52. DOI: https://doi.org/10.13189/ujer.2020.081104

5. Pape-Zambito DA, Mostrom AM. Improving teaching through triadic course alignment. J Microbiol Biol Educ. 2019 Jan;19(3):1642. DOI: https://doi.org/10.1128/jmbe.v19i3.1642

6. Guerrero-Roldán AE, Noguera I. A model for aligning assessment with competences and learning activities in online courses. Internet Higher Educ. 2018 Jul;38:36–46. DOI: https://doi.org/10.1016/j.iheduc.2018.04.005

7. Shepard LA. The role of assessment in a learning culture. Educ Res. 2000 Oct;29(7):4–14. DOI: https://doi.org/10.3102/0013189X029007004

8. Wiggins G, McTighe J. Backward design. In: Understanding by design. 2nd ed. Alexandria, VA: Association for Supervision and Curriculum Development (ASCD); 2005. p. 13–34.

9. Brown GTL, Abdulnabi HHA. Evaluating the quality of higher education instructor-constructed multiple-choice tests: impact on student grades. Front Educ. 2017 Jun;2. DOI: https://doi.org/10.3389/feduc.2017.00024

10. Watanabe AS, McCart G, Shimomura S, Kayser S. Systematic approach to drug information requests. Am J Health Syst Pharm. 1975 Dec;32(12):1282–5.

11. Sheehan AH, Jordan JK. Formulating an effective response: a structured approach. In: Malone PM, Malone MJ, Park SK, eds. Drug information: a guide for pharmacists. 6th ed. United States: McGraw-Hill; 2018.

12. Brown-Sica M, Sobel K, Pan D. Learning for all: teaching students, faculty, and staff with screencasting. Public Services Quarterly. 2009 May;5(2):81–97. DOI: https://doi.org/10.1080/15228950902805282

13. Baker A. Students' preferences regarding four characteristics of information literacy screencasts. Journal of Library & Information Services in Distance Learning. 2014 Jan;8(1-2):67–80. DOI: https://doi.org/10.1080/1533290X.2014.916247

14. Bailey J. Informal screencasting: results of a customer-satisfaction survey with a convenience sample. New Library World. 2012 Jan;113(1-2):7–26. DOI: https://doi.org/10.1108/03074801211199013

15. Ergood A, Padron K, Rebar L. Making library screencast tutorials: factors and processes. Internet Reference Services Quarterly. 2012 Apr;17(2):95–107. DOI: https://doi.org/10.1080/10875301.2012.725705

16. Mestre LS. Student preference for tutorial design: a usability study. Ref Serv Rev. 2012 May;40(2):258–76. DOI: https://doi.org/10.1108/00907321211228318

17. Murphy J, Liew CL. Reflecting the science of instruction? Screencasting in Australian and New Zealand academic libraries: a content analysis. The Journal of Academic Librarianship. 2016 May;42(3):259–72. DOI: https://doi.org/10.1016/j.acalib.2015.12.010

18. Visser N. Did we captivate them? Perceptions of second-year students about the library's information literacy online tutorials. Mousaion. 2013;31(2):78–91.

19. Gamtso CW, Halpin PA. Tailoring library instruction for non-science majors taking hybrid and online science classes: student perceptions of information literacy in the virtual environment. Public Services Quarterly. 2018 Apr;14(2):99–118. DOI: https://doi.org/10.1080/15228959.2017.1372729

20. Jacklin ML, Robinson K. Evolution of various library instruction strategies: using student feedback to create and enhance online active learning assignments. Partnership: The Canadian Journal of Library & Information Practice & Research. 2013 Jun;8(1):1–21. DOI: https://doi.org/10.21083/partnership.v8i1.2499

21. Kuban AJ, Mulligan LM. Screencasts and standards: connecting an introductory journalism research course with information literacy. Commun Teach. 2014 Jul;28(3):188–95. DOI: https://doi.org/10.1080/17404622.2014.911335

22. Wakefield J, Tyler J, Dyson LE, Frawley JK. Implications of student-generated screencasts on final examination performance. Account Finance. 2019 Jun;59(2):1415–46. DOI: https://doi.org/10.1111/acfi.12256

23. Frawley JK, Dyson LE, Wakefield J, Tyler J. Supporting graduate attribute development in introductory accounting with student-generated screencasts. International Journal of Mobile and Blended Learning. 2016 Jul;8(3):65–82. DOI: https://doi.org/10.4018/IJMBL.2016070105

24. Sudhakar A, Tyler J, Wakefield J. Enhancing student experience and performance through peer-assisted learning. Issues in Accounting Education. 2016 Aug;31(3):321–36. DOI: https://doi.org/10.2308/iace-51249

25. Ranellucci J, Bergey BW. Using motivation design principles to teach screencasting in online teacher education courses. Journal of Technology and Teacher Education. 2020 Jan; 28(2):393–401.

26. Galligan L, Hobohm C, Mathematics Education Research Group of Australasia. Students using digital technologies to produce screencasts that support learning in mathematics. Presented at: The Annual Meeting of the Mathematics Education Research Group of Australia (MERGA); Victoria, Australia; 2013.

27. Bush JC. Using screencasting to give feedback for academic writing. Innovation in Language Learning and Teaching. 2020 Nov;1–14. DOI: https://doi.org/10.1080/17501229.2020.1840571

28. Cunningham KJ. Student perceptions and use of technology-mediated text and screencast feedback in ESL writing. Comput Compos. 2019 Jun;52:222–41. DOI: https://doi.org/10.1016/j.compcom.2019.02.003

29. Phillips M, Ryan T, Henderson M. A cross-disciplinary evaluation of digitally recorded feedback in higher education. Presented at: ASCILITE 2017, the 34th International Conference of Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education; Toowoomba, Australia; December 4–6, 2017.

30. Teoh L, Marriott P. Delivering screen-cast feedback on assessments to large cohorts. Presented at: IADIS International Conference e-Learning 2011, Part of the IADIS Multi Conference on Computer Science and Information Systems; Rome, Italy; July 20–23, 2011.

31. Wood D, Bruner JS, Ross G. The role of tutoring in problem solving. J Child Psychol & Psychiat. 1976 Apr;17(2):89–100. DOI: https://doi.org/10.1111/j.1469-7610.1976.tb00381.x


Emily P. Jones 1, epjones3@email.unc.edu, Health Sciences Librarian, Health Sciences Library, University of North Carolina at Chapel Hill, Chapel Hill, NC

Christopher S. Wisniewski 2, wisniews@musc.edu, Professor, Department of Clinical Pharmacy and Outcome Sciences, Medical University of South Carolina, Charleston, SC


Copyright © 2021 Emily P. Jones, Christopher S. Wisniewski

This work is licensed under a Creative Commons Attribution 4.0 International License.


