
What we measure … and what we should measure in medical education.

John R Boulet, Steven J Durning.

Abstract

CONTEXT: As the practice of medicine evolves, the knowledge, skills and attitudes required to provide patient care will continue to change. These competency-based changes will necessitate the restructuring of assessment systems. High-quality assessment programmes are needed to fulfil health professions education's contract with society.
OBJECTIVES: We discuss several issues that are important to consider when developing assessments in health professions education. We organise the discussion along the continuum of medical education, outlining the tension between what has been deemed important to measure and what should be measured. We also attempt to alleviate some of the apprehension associated with measuring evolving competencies by discussing how emerging technologies, including simulation and artificial intelligence, can play a role.
METHODS: We focus our thoughts on the assessment of competencies that, at least historically, have been difficult to measure. We highlight several assessment challenges, discuss some of the important issues concerning the validity of assessment scores, and argue that medical educators must do a better job of justifying their use of specific assessment strategies.
DISCUSSION: As in most professions, there are clear tensions in medicine in relation to what should be assessed, who should be responsible for administering assessment content, and how much evidence should be gathered to support the evaluation process. Although there have been advances in assessment practices, there is still room for improvement. From the student's, resident's and practising physician's perspectives, assessments need to be relevant. Knowledge is certainly required, but there are other qualities and attributes that are important, and perhaps far more important. Research efforts spent now on delineating what makes a good physician, and on aligning new and upcoming assessment tools with the relevant competencies, will ensure that assessment practices, whether aimed at establishing competence or at fostering learning, are effective with respect to their primary goal: to produce qualified physicians.
© 2018 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

Year:  2018        PMID: 30216508     DOI: 10.1111/medu.13652

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 16 in total

1.  Legitimizing Continuous Quality Improvement (CQI): Navigating Rationality in Undergraduate Medical Education.

Authors:  Terry D Stratton
Journal:  J Gen Intern Med       Date:  2019-05       Impact factor: 5.128

2.  Assessing the Validity and Reliability of the Pharmacist Interprofessional Competencies Tool.

Authors:  Lisa A Salvati; Lisa M Meny; Margaret C de Voest; David R Bright; Kari L Vavra-Janes; Mark A Young; Shelby E Kelsh; Michelle J Sahr; Greg S Wellman
Journal:  Am J Pharm Educ       Date:  2020-07       Impact factor: 2.047

3.  Four ways to get a grip on making robust decisions from workplace-based assessments.

Authors:  Tim J Wilkinson
Journal:  Can Med Educ J       Date:  2022-07-06

4.  The clinical learning environment in anaesthesiology in Kerala - Is it good enough? - A web-based survey.

Authors:  Priyanka Pavithran; Suvarna Kaniyil; M C Rajesh; Vijish Venugopal; T N Jitin; Azeem Davul
Journal:  Indian J Anaesth       Date:  2021-03-13

5.  Artificial Intelligence Education and Tools for Medical and Health Informatics Students: Systematic Review.

Authors:  A Hasan Sapci; H Aylin Sapci
Journal:  JMIR Med Educ       Date:  2020-06-30

6.  Students' self-assessment of achievement of terminal competency and 4-year trend of student evaluation on outcome-based education.

Authors:  Sanghee Yeo; Bong Hyun Chang
Journal:  Korean J Med Educ       Date:  2019-03-01

7.  Implementation of a large-scale simulation-based cardiovascular clinical examination course for undergraduate medical students - a pilot study.

Authors:  Dimitri Arangalage; Jérémie Abtan; Jean Gaschignard; Pierre-François Ceccaldi; Sid-Ahmed Remini; Isabelle Etienne; Philippe Ruszniewski; Patrick Plaisance; Victoire De Lastours; Agnès Lefort; Albert Faye
Journal:  BMC Med Educ       Date:  2019-09-18       Impact factor: 2.463

8.  Evaluating the effectiveness of undergraduate clinical education programs.

Authors:  John W Ragsdale; Andrea Berry; Jennifer W Gibson; Christiane R Herber-Valdez; Lauren J Germain; Deborah L Engle
Journal:  Med Educ Online       Date:  2020-12

9.  Cognitive levels in testing knowledge in evidence-based medicine: a cross sectional study.

Authors:  Ivan Buljan; Matko Marušić; Ružica Tokalić; Marin Viđak; Tina Poklepović Peričić; Darko Hren; Ana Marušić
Journal:  BMC Med Educ       Date:  2021-01-07       Impact factor: 2.463

10.  Attitudes towards medical artificial intelligence talent cultivation: an online survey study.

Authors:  Dongyuan Yun; Yifan Xiang; Zhenzhen Liu; Duoru Lin; Lanqin Zhao; Chong Guo; Peichen Xie; Haotian Lin; Yizhi Liu; Yuxian Zou; Xiaohang Wu
Journal:  Ann Transl Med       Date:  2020-06
