Agatha M Hettinga, Eddie Denessen, Cornelis T Postma.
Affiliations: Academic Educational Institute, Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands; Behavioural Science Institute, Radboud University Nijmegen, Nijmegen, the Netherlands; Department of General Internal Medicine and Academic Educational Institute, Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands.
Abstract
OBJECTIVES: Research on objective structured clinical examinations (OSCEs) is extensive; however, relatively little has been written on the development of case-specific checklists for history taking and physical examination. Background information on the development of these checklists is a key element in assessing their content validity. Usually, expert panels are involved in checklist development. The objective of this study was to compare expert-based items on OSCE checklists with evidence-based items identified in the literature. METHODS: Evidence-based items covering both history taking and physical examination for specific clinical problems and diseases were identified in the literature. Items on nine expert-based checklists for OSCE examination stations were evaluated by comparing them with the items identified in the literature. The data were grouped into three categories: (i) expert-based items; (ii) evidence-based items; and (iii) evidence-based items with a specific measure of their relevance. RESULTS: Of 227 expert-based items, 58 (26%) were not found in the literature. Of 388 evidence-based items found in the literature, 219 (56%) were not included in the expert-based checklists. Of these 219 items, 82 (37%) had a specific measure of importance, such as an odds ratio for a diagnosis, making that diagnosis more or less probable. CONCLUSIONS: Expert-based, case-specific checklist items developed for OSCE stations do not coincide with evidence-based items identified in the literature. Further research is needed to ascertain what this inconsistency means for test validity.
Authors: Susan Glover Takahashi; Arthur Rothman; Marla Nayer; Murray B Urowitz; Anne Marie Crescenzi. Journal: Can Fam Physician. Date: 2012-07. Impact factor: 3.275.