
Web-assisted assessment of professional behaviour in problem-based learning: more feedback, yet no qualitative improvement?

Walther N K A van Mook, Arno M M Muijtjens, Simone L Gorter, Jan Harm Zwaveling, Lambert W Schuwirth, Cees P M van der Vleuten.

Abstract

Although other web-based approaches to the assessment of professional behaviour have been studied, no publications to date have compared the potential advantages of a web-based instrument with those of a classic, paper-based method. This study has two research goals: on the one hand, it examines the quantity and quality of comments provided by students and their peers (two researchers independently scored comments as correct or incorrect against five commonly used feedback rules, yielding an aggregated score across the five rules); on the other hand, it examines the feasibility, acceptability and perceived usefulness of the two approaches (using a survey). The amount of feedback was significantly higher in the web-based group than in the paper-based group for all three categories (dealing with work, dealing with others, and dealing with oneself). Regarding the quality of feedback, the aggregated score for each of the three categories did not differ significantly between the two groups, either for the interim or for the final assessment. Some trends, though not statistically significant, were nevertheless noteworthy. Feedback in the web-based group was more often unrelated to observed behaviour in several categories, for both the interim and the final assessment. Furthermore, most comments in the category 'Dealing with oneself' described a student's attendance, neglecting other aspects of personal functioning. The survey identified significant differences between the groups, in favour of the paper-based form, for all questionnaire items regarding feasibility, acceptability and perceived usefulness. The use of a web-based instrument for professional behaviour assessment yielded a significantly higher number of comments than the traditional paper-based assessment.
Unfortunately, the quality of the feedback obtained with the web-based instrument, as measured by several generally accepted feedback criteria, did not parallel this increase.


Year:  2011        PMID: 21533755      PMCID: PMC3274685          DOI: 10.1007/s10459-011-9297-0

Source DB:  PubMed          Journal:  Adv Health Sci Educ Theory Pract        ISSN: 1382-4996            Impact factor:   3.853


References:  27 in total

1.  Web-based peer evaluation by medical students.

Authors:  J A Freedman; H P Lehmann; C J Ogborn
Journal:  Acad Med       Date:  2000-05       Impact factor: 6.893

2.  How can I know what I don't know? Poor self assessment in a well-defined domain.

Authors:  Kevin W Eva; John P W Cunnington; Harold I Reiter; David R Keane; Geoffrey R Norman
Journal:  Adv Health Sci Educ Theory Pract       Date:  2004       Impact factor: 3.853

3.  Promoting professional behaviour in undergraduate medical, dental and veterinary curricula in the Netherlands: evaluation of a joint effort.

Authors:  Scheltus J van Luijk; Ronald C Gorter; Walther N K A van Mook
Journal:  Med Teach       Date:  2010       Impact factor: 3.650

4.  Medical students' views on peer assessment of professionalism.

Authors:  Louise Arnold; Carolyn K Shue; Barbara Kritt; Shiphra Ginsburg; David T Stern
Journal:  J Gen Intern Med       Date:  2005-09       Impact factor: 5.128

5.  The Professionalism Mini-evaluation Exercise: a preliminary investigation.

Authors:  Richard Cruess; Jodi Herold McIlroy; Sylvia Cruess; Shiphra Ginsburg; Yvonne Steinert
Journal:  Acad Med       Date:  2006-10       Impact factor: 6.893

6.  Evaluation of missing data in an assessment of professional behaviors.

Authors:  Kathleen Mazor; Brian E Clauser; Matthew Holtman; Melissa J Margolis
Journal:  Acad Med       Date:  2007-10       Impact factor: 6.893

7.  Factors inhibiting assessment of students' professional behaviour in the tutorial group during problem-based learning.

Authors:  Walther N K A van Mook; Willem S de Grave; Elise Huijssen-Huisman; Marianne de Witt-Luth; Diana H J M Dolmans; Arno M M Muijtjens; Lambert W Schuwirth; Cees P M van der Vleuten
Journal:  Med Educ       Date:  2007-09       Impact factor: 6.251

8.  Performance during internal medicine residency training and subsequent disciplinary action by state licensing boards.

Authors:  Maxine A Papadakis; Gerald K Arnold; Linda L Blank; Eric S Holmboe; Rebecca S Lipner
Journal:  Ann Intern Med       Date:  2008-06-03       Impact factor: 25.391

9.  Effects of rater selection on peer assessment among medical students.

Authors:  Stephen J Lurie; Anne C Nofziger; Sean Meldrum; Christopher Mooney; Ronald M Epstein
Journal:  Med Educ       Date:  2006-11       Impact factor: 6.251

10.  Use of peer ratings to evaluate physician performance.

Authors:  P G Ramsey; M D Wenrich; J D Carline; T S Inui; E B Larson; J P LoGerfo
Journal:  JAMA       Date:  1993-04-07       Impact factor: 56.272

Reviews:  3 in total

1.  Failure of faculty to fail failing medical students: Fiction or an actual erosion of professional standards?

Authors:  Salman Y Guraya; Walther N K A van Mook; Khalid I Khoshhal
Journal:  J Taibah Univ Med Sci       Date:  2019-02-01

2.  The utilization of peer feedback during collaborative learning in undergraduate medical education: a systematic review.

Authors:  Sarah Lerchenfeldt; Misa Mi; Marty Eng
Journal:  BMC Med Educ       Date:  2019-08-23       Impact factor: 2.463

3.  Explicit feedback to enhance the effect of an interim assessment: a cross-over study on learning effect and gender difference.

Authors:  Marleen Olde Bekkink; Rogier Donders; Goos N P van Muijen; Rob M W de Waal; Dirk J Ruiter
Journal:  Perspect Med Educ       Date:  2012-09-27
