Acta Scientific Medical Sciences (ASMS) (ISSN: 2582-0931)

Review Article, Volume 8, Issue 5

The Medical Residency Examination in the Era of the Overhaul of Medical Study Programs, a Call for Reform. Reflections for New Docimological Orientations

Mohamed Ridha Guedjati1*, Jalaleddinne Omar Bouhidel1, Hanane Benaldjia1, Chaouki Derdous1

1Faculty of Medicine, Batna, Algeria
2University of Batna 2, Batna, Algeria

*Corresponding Author: Mohamed Ridha Guedjati, Faculty of Medicine, Batna, Algeria.

Received: March 11, 2024; Published: April 19, 2024

Abstract

The overhaul of undergraduate medical study programs is part of an innovative, quality-driven approach. Medical educators continually seek to close the gap between the needs of medical practice and the growing expectations of their country's communities. The new programs introduced in 2018 as part of this overhaul incorporate conceptual curricular elements such as the organization of studies into three cycles, a competency-based teaching approach, explicit training-program objectives, simulation-based learning, and objective structured clinical examinations (OSCEs). With the first and second cycles now in place, discussions are under way to design the third cycle. In these discussions, access to the various specialties remains conditional on passing the residency examination. In this paper, we propose ideas on the docimological orientations that could be envisaged to extend the innovative framework of the reorganization of medical studies in Algeria. This article was therefore written to help improve the efficiency, equity, and integrity of the resident selection process. We aim to offer educational leaders some reflections, a clear framework, and consistent language to facilitate national discussions.

Keywords: Residency Examination; MCQ; KFP; Validity; Fidelity

Citation

Citation: Mohamed Ridha Guedjati, et al. “The Medical Residency Examination in the Era of the Overhaul of Medical Study Programs, a Call for Reform. Reflections for New Docimological Orientations”. Acta Scientific Medical Sciences 8.5 (2024): 63-68.

Copyright

Copyright: © 2024 Mohamed Ridha Guedjati, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.



