
Barriers to obtaining reliable results from evaluations of teaching quality in undergraduate medical education.

Zemiao Zhang1,2, Qi Wu1,2, Xinping Zhang1,2, Juyang Xiong1,2, Lan Zhang1,2, Hong Le3,4.   

Abstract

BACKGROUND: Medical education is characterized by numerous features that are different from other higher education programmes, and evaluations of teaching quality are an integral part of medical education. Although scholars have made extensive efforts to enhance the quality of teaching, various issues unrelated to teaching that interfere with the accuracy of evaluation results remain. The purpose of this study is to identify the barriers that prevent objective and reliable results from being obtained during the evaluation process.
METHODS: This study used mixed methods (3 data sources) to collect opinions from different stakeholders. Based on purposive sampling, 16 experts familiar with teaching management and 12 second- and third-year students were invited to participate in interviews and discussions, respectively. Additionally, based on systematic random sampling, 74 teachers were invited to complete a questionnaire survey. All qualitative data were imported into NVivo software and analysed using thematic analysis in chronological order and based on grounded theory. Statistical analyses of the questionnaire results were conducted using SPSS software.
RESULTS: Sixty-nine valid questionnaires (93.24%) were recovered. A total of 29 open codes were extracted, and 14 axial codes were summarized and divided into four selective codes: evaluation preparation, the index system, the operation process, and the consequences of evaluation. The main barriers to obtaining reliable evaluation results included inadequate attention, unreasonable weighting, poor teaching facilities, indices lacking pertinence and appropriate descriptions, poorly chosen time points, incomplete information in the system, delayed feedback, and poor application of results. Almost all participants suggested lowering the weight of students as evaluation subjects, with a weight of 50-60% considered appropriate. Students expressed dissatisfaction with the evaluation software, and participants disagreed over the definition of good teaching and the management of student attendance.
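As a minimal sanity check (not part of the original record), the response rate reported above can be reproduced from the figures in the METHODS and RESULTS sections — 69 valid questionnaires out of 74 distributed:

```python
# Reproduce the questionnaire response rate reported in the abstract.
distributed = 74   # teachers invited (METHODS)
valid = 69         # valid questionnaires recovered (RESULTS)

response_rate = valid / distributed * 100
print(f"{response_rate:.2f}%")  # 93.24%
```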
CONCLUSIONS: This study reveals the difficulties and problems in current evaluations of teaching in medical education. Collecting data from multiple stakeholders helps in better understanding the evaluation process. Educators need to be aware of various issues that may affect the final results when designing the evaluation system and interpreting the results. More research on solutions to these problems and the development of a reasonable evaluation system is warranted.

Keywords:  Evaluation of teaching quality; Interviews; Issues unrelated to teaching; Medical education; Undergraduate

Year:  2020        PMID: 32993627      PMCID: PMC7523339          DOI: 10.1186/s12909-020-02227-w

Source DB:  PubMed          Journal:  BMC Med Educ        ISSN: 1472-6920            Impact factor:   2.463


Related articles: 28 in total (10 shown)

1.  Are online student evaluations of faculty influenced by the timing of evaluations?

Authors:  John A McNulty; Gregory Gruener; Arcot Chandrasekhar; Baltazar Espiritu; Amy Hoyt; David Ensminger
Journal:  Adv Physiol Educ       Date:  2010-12       Impact factor: 2.288

2.  Teaching diversity to medical undergraduates: Curriculum development, delivery and assessment. AMEE GUIDE No. 103.

Authors:  Nisha Dogra; Farah Bhatti; Candan Ertubey; Moira Kelly; Angela Rowlands; Davinder Singh; Margot Turner
Journal:  Med Teach       Date:  2015-12-07       Impact factor: 3.650

3.  Making sense of grounded theory in medical education.

Authors:  Tara J T Kennedy; Lorelei A Lingard
Journal:  Med Educ       Date:  2006-02       Impact factor: 6.251

4.  Evaluation of Faculty Mentoring Practices in Seven U.S. Dental Schools.

Authors:  Thikriat Al-Jewair; Amy Kristina Herbert; V Leroy Leggitt; Tawana Lee Ware; Maritzabel Hogge; Cynthia Senior; Rebecca K Carr; John D Da Silva
Journal:  J Dent Educ       Date:  2019-08-12       Impact factor: 2.264

5.  Student-centered, modernized graduate STEM education.

Authors:  Alan I Leshner
Journal:  Science       Date:  2018-06-01       Impact factor: 47.728

6.  The quality of quality criteria: Replicating the development of the Consolidated Criteria for Reporting Qualitative Research (COREQ).

Authors:  Niels Buus; Amelie Perron
Journal:  Int J Nurs Stud       Date:  2019-10-24       Impact factor: 5.837

7.  Faculty development for the evaluation system: a dual agenda.

Authors:  Kellee L Oller; Cuc T Mai; Robert J Ledford; Kevin E O'Brien
Journal:  Adv Med Educ Pract       Date:  2017-03-08

8.  Correction to: Interactional skills training in undergraduate medical education: ten principles for guiding future research.

Authors:  Rob Sanson-Fisher; Breanne Hobden; Mariko Carey; Lisa Mackenzie; Lisa Hyde; Jan Shepherd
Journal:  BMC Med Educ       Date:  2019-07-23       Impact factor: 2.463

9.  Students' perceptions towards self-directed learning in Ethiopian medical schools with new innovative curriculum: a mixed-method study.

Authors:  Haftom Hadush Kidane; Herma Roebertsen; Cees P M van der Vleuten
Journal:  BMC Med Educ       Date:  2020-01-08       Impact factor: 2.463

10.  Contextual adaptation of the Personnel Evaluation Standards for assessing faculty evaluation systems in developing countries: the case of Iran.

Authors:  Soleiman Ahmady; Tahereh Changiz; Mats Brommels; F Andrew Gaffney; Johan Thor; Italo Masiello
Journal:  BMC Med Educ       Date:  2009-04-28       Impact factor: 2.463

