Differential diagnosis in primary care is a cornerstone of effective medical practice. For medical students, grasping the complexities of diagnosing a wide array of conditions presenting with similar symptoms is crucial. Traditional medical education often exposes students to a vast spectrum of diseases, including rare conditions, which may not be representative of the daily challenges in primary care settings. Recognizing this gap, an innovative elective seminar was designed to focus on the most common underlying causes of prevalent symptoms encountered in primary care, enhancing diagnostic skills for future physicians. This approach emphasizes diagnostic accuracy concerning frequent clinical presentations like chest pain, dyspnea, abdominal pain, and vertigo, directly relevant to primary care practice and preparing students for real-world scenarios.
This specialized seminar, totaling 42 hours of face-to-face instruction, adopts an interdisciplinary approach, presenting clinical scenarios reflective of everyday primary care. Unlike conventional medical curricula that might delve into the esoteric, this course deliberately concentrates on the common threads of illnesses presenting with typical lead symptoms. The core objective is to sharpen the diagnostic acumen of medical students, specifically concerning the accuracy of symptoms and signs in differentiating between various common underlying diseases. Held at the interactive skills lab of Marburg University Hospital, the seminar leverages trained simulation patients and diverse medical models to create an immersive learning environment. This hands-on approach is complemented by a concluding Objective Structured Clinical Examination (OSCE) to assess the students’ acquired competencies in differential diagnosis.
Recognizing the evolving landscape of medical pedagogy, the seminar underwent a significant redesign one year prior, embracing the inverted classroom model. This pedagogical shift positions the seminar as a pioneering initiative within the medical faculty, being the sole medical course utilizing this modern educational design. Furthermore, to our knowledge, this method remains uncommon in primary care departments across Germany and has not yet been widely adopted in medical education at German universities.
The seminar’s e-learning components are delivered through the University of Marburg’s web-based learning platform, ‘K-Med,’ ensuring accessibility and flexibility for students. Each session is thoughtfully structured into three distinct phases to maximize learning efficacy:
- Preparation: Before each face-to-face session, students engage with introductory video and audio lectures available online. These modules lay the foundational knowledge and present key concepts essential for the upcoming interactive session.
- Face-to-face teaching: The interactive skills lab becomes the hub for active learning. Here, a variety of didactic strategies are employed, including interactions with simulation patients, hands-on training with medical models, collaborative small group work, and engaging quiz exercises. This phase is designed to solidify understanding and apply theoretical knowledge in a practical context.
- Follow-up: Post-session, students access supplementary video and audio lectures that delve deeper into specific leading symptoms and their diverse underlying etiologies. To further enrich their learning, optional reading materials, such as original research articles focusing on the diagnostic accuracy of symptoms and signs for particular diseases, are made available via the online platform. This ensures continuous learning and allows students to explore topics in greater depth based on their interests.
To rigorously evaluate the effectiveness of this redesigned seminar, a mixed-methods study design was implemented to address key research questions. Student satisfaction, a primary focus, was assessed through multiple channels. Firstly, the University of Marburg’s standardized seminar evaluation questionnaire was utilized. This tool employs a 5-point rating scale, ranging from “agree not at all” to “completely agree,” across four core domains: seminar concept and presentation, student interaction, interest/relevance level, and difficulty/quantity/speed. The anonymous feedback collected through this questionnaire is analyzed by the university’s evaluation department, providing quantitative data on student perceptions.
Complementing the standardized questionnaire, a more in-depth evaluation was conducted after the 8th of the 14 course sessions. This involved a focus group discussion followed immediately by a short questionnaire. Both explored students’ perceptions of the seminar content, their critical appraisal of the inverted classroom approach (specifically the utility of the video and audio sessions and their integration with the face-to-face sessions), and their individual learning experiences. Emphasis was placed on understanding the student learning journey and, in particular, how the inverted classroom methodology was received and experienced.
For the objective measurement of skill and knowledge acquisition, a questionnaire was specifically designed, incorporating extended matching questions (13 items) and key-feature tests (20 items). These formats are recognized for their ability to evaluate clinical reasoning and clinical decision-making. They are well suited to gauging competence at the “Knows how” level of Miller’s Pyramid, focusing on the practical application of knowledge in clinical scenarios. Extended matching questions were chosen for their efficacy in determining the diagnostic value of clinical findings derived from patient history and physical examination. Key-feature tests further probe clinical decision-making in context-rich scenarios.
Collectively, these question formats comprehensively covered the seminar’s core content, ensuring a robust assessment of learning outcomes. Prior to implementation, the questions underwent pre-testing with a cohort of students possessing comparable baseline knowledge. Based on pre-test results, adjustments were made to enhance question clarity and difficulty, ensuring the assessments were appropriately challenging and effectively measured the intended learning objectives. Students completed the pre-test at the seminar’s outset and the post-test following the final session, just before commencing preparations for the OSCE. Unique identifiers facilitated anonymous matching of pre- and post-tests at the individual student level, enabling a precise analysis of learning gains. Example questions from the assessment questionnaire are available in Additional file 1, providing further insight into the evaluation instruments used.
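As a minimal, purely illustrative sketch of this matching step (not the study’s actual tooling), the following Python example pairs anonymous pre- and post-test records on a pseudonymous student code before paired analysis; all identifiers, column names, and values are hypothetical.

```python
# Illustrative sketch only: pair anonymous pre- and post-test records on a
# pseudonymous student code so that learning gains can be analysed per student.
import pandas as pd

# Hypothetical records: one row per completed test, keyed by a self-chosen code
pre = pd.DataFrame({"code": ["A7x", "Qm2", "Zt9"], "pre_pct": [45.0, 52.5, 60.0]})
post = pd.DataFrame({"code": ["Qm2", "A7x", "Zt9"], "post_pct": [63.5, 58.0, 71.0]})

# Inner join keeps only students who completed both tests, preserving anonymity
matched = pre.merge(post, on="code", how="inner")
print(matched)
```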
Data analysis involved calculating means and standard deviations for the four variables from the standardized student satisfaction questionnaire, which were then presented graphically. To assess the statistical significance of differences between pre- and post-test results (average percentage of the maximal test score), the Wilcoxon matched-pairs signed-ranks test was employed. A p-value of less than 0.05 was considered statistically significant, and statistical analyses were performed using GraphPad Prism (version 6, GraphPad Software, Inc.). The focus group discussion was transcribed verbatim, and data analysis followed a deductive approach guided by the focus group and short questionnaire questions. Analysis was conducted by SB, with results discussed and refined among all authors. Responses from the focus group and free-text answers from the questionnaires were thematically categorized to identify recurring patterns and insights. The data presented form part of the routine course evaluation at our faculty; therefore, ethical approval was not deemed necessary, in line with institutional guidelines for course improvement and quality assurance.
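The paired comparison itself was run in GraphPad Prism; as a hedged illustration of the same procedure, the Python sketch below applies the Wilcoxon matched-pairs signed-ranks test to hypothetical matched score vectors using scipy.stats.wilcoxon and checks the p < 0.05 threshold described above.

```python
# Illustrative sketch only: the study used GraphPad Prism; this reproduces the
# same paired comparison with SciPy on hypothetical, per-student matched scores
# (percentage of the maximal test score).
import numpy as np
from scipy.stats import wilcoxon

pre_scores = np.array([42.0, 55.0, 48.5, 60.0, 51.0, 47.5, 58.0, 44.0])
post_scores = np.array([61.0, 68.5, 57.0, 72.0, 66.0, 59.5, 70.0, 63.0])

# Descriptive statistics for each test occasion
print(f"pre:  mean={pre_scores.mean():.1f}, sd={pre_scores.std(ddof=1):.1f}")
print(f"post: mean={post_scores.mean():.1f}, sd={post_scores.std(ddof=1):.1f}")

# Wilcoxon matched-pairs signed-ranks test, two-sided by default
stat, p_value = wilcoxon(pre_scores, post_scores)
print(f"Wilcoxon W={stat:.1f}, p={p_value:.4f}")
print("significant at p < 0.05" if p_value < 0.05 else "not significant")
```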