Jefferson Digital Commons


Department of Medical Education Posters

 
  • “Coherent Nonsense”: Lessons Learned from Utilizing ChatGPT for USMLE-Style Anatomy and Pathology Questions by Alexander Macnow, MD, MHPE

    Undergraduate medical education (UME) encounters several challenges, including the need for adequate practice items replicating standardized medical exams like the United States Medical Licensing Examination (USMLE).1 This demand suggests a role for innovative, efficient approaches to item generation. Artificial intelligence (AI) large language models (LLMs), like those employed by ChatGPT, present an attractive solution.

    Previous authors have investigated ChatGPT’s ability to “pass” high-stakes assessments, such as the USMLE,2–4 the ophthalmology and radiology board examinations,5,6 and other nations’ certification examinations.7 Less has been published on ChatGPT’s ability to construct vignette-based, single-best-answer multiple-choice items similar to those used on these assessments,8–10 and the existing studies rely on broad categories of item flaws and offer little comparative psychometric analysis of item performance.

    This study investigated the utility and feasibility of ChatGPT as an author of USMLE-style questions, with the following research questions (an illustrative generation sketch follows the list):

    1. Once fine-tuned, can ChatGPT successfully generate factually accurate questions that adhere to predetermined style and content guidelines?
    2. How efficient is ChatGPT at writing questions, compared to human subject matter experts?
    3. Do the psychometric characteristics of ChatGPT’s items differ from human-written ones?
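
    The poster does not state how the authors interfaced with ChatGPT (web interface vs. API), nor the prompts or model version used. As a minimal, purely illustrative sketch, assuming the OpenAI Python SDK (v1+) and a hypothetical style-guideline prompt, item generation might look like this:

        # Illustrative sketch only: the model name, prompt wording, and use of the
        # OpenAI Python SDK are assumptions, not the study's documented method.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        STYLE_GUIDELINES = (
            "Write one USMLE-style, vignette-based, single-best-answer multiple-choice "
            "question with five options (A-E). Include a clinical vignette, a clear "
            "lead-in, one unambiguously correct answer, and plausible distractors. "
            "End with the answer key and a brief explanation."
        )

        def generate_item(topic: str) -> str:
            """Request one practice item on a given anatomy or pathology topic."""
            response = client.chat.completions.create(
                model="gpt-4",  # placeholder; the poster does not name a model version
                messages=[
                    {"role": "system", "content": STYLE_GUIDELINES},
                    {"role": "user", "content": f"Topic: {topic}"},
                ],
            )
            return response.choices[0].message.content

        print(generate_item("anatomy of the brachial plexus"))

    Generated items would then still require the kind of content and style review, and the comparative psychometric analysis, that the research questions above describe.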

  • 3D Printed Coronary Arteries: A Useful Tool in Anatomical Education by Zachary Pang; Shreyas Chandragiri; Alexander Hajduczok, MD; Zachary Mace; Jesse Ottaway; Cannon Greco Hiranaka; Michael Wong; Tammy Yoshioka; Noah Haroian, MD, PharmD; and Guiyun Zhang, MD, PhD

    Introduction

    Learning 3D anatomy from 2D figures can be challenging. The use of 3D printing in medical education has been shown to improve knowledge and skill in surgical training (1). We investigated the efficacy of 3D printed coronary artery models in teaching coronary anatomy to medical students.