OBJECTIVE: The purposes of this study were to describe the questionnaire development process for evaluating elements of an evidence-based practice (EBP) curriculum in a chiropractic program and to report on initial reliability and validity testing for the EBP knowledge examination component of the questionnaire. METHODS: The EBP knowledge test was evaluated with students enrolled in the doctor of chiropractic program at the University of Western States. The initial version was tested with a sample of 374 students and a revised version with a sample of 196 students. Item performance and reliability were assessed using item difficulty, item discrimination, and internal consistency. An expert panel assessed face and content validity. RESULTS: The first version of the knowledge examination demonstrated low internal consistency (Kuder-Richardson 20 = 0.55), and several items had poor item difficulty and discrimination. The number of items was therefore expanded from 20 to 40, and the poorly performing items from the initial version were revised. The Kuder-Richardson 20 of the second version was 0.68; 32 items had item difficulties between 0.20 and 0.80, and 26 items had item discrimination values of 0.20 or greater. CONCLUSIONS: A questionnaire for evaluating a revised EBP-integrated curriculum was developed and evaluated. Psychometric testing of the EBP knowledge component provided initial evidence of acceptable reliability and validity.
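The abstract's three item-analysis statistics are standard and easy to compute from a binary score matrix. The sketch below (not the authors' code; variable names and the corrected item-total correlation used for discrimination are assumptions) shows one common way to obtain item difficulty, item discrimination, and the Kuder-Richardson 20 coefficient:

```python
import numpy as np

def item_difficulty(responses):
    """Proportion of examinees answering each item correctly.
    responses: 2D array, rows = examinees, columns = items (1 = correct, 0 = incorrect)."""
    return np.asarray(responses, dtype=float).mean(axis=0)

def item_discrimination(responses):
    """Corrected item-total point-biserial: each item correlated with the
    total score computed from the remaining items."""
    r = np.asarray(responses, dtype=float)
    totals = r.sum(axis=1)
    return np.array([
        np.corrcoef(r[:, j], totals - r[:, j])[0, 1]
        for j in range(r.shape[1])
    ])

def kr20(responses):
    """Kuder-Richardson 20 internal-consistency coefficient for dichotomous items:
    KR-20 = k/(k-1) * (1 - sum(p*q) / var(total scores))."""
    r = np.asarray(responses, dtype=float)
    k = r.shape[1]
    p = r.mean(axis=0)                       # per-item difficulty
    q = 1.0 - p
    total_var = r.sum(axis=1).var(ddof=1)    # sample variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)
```

With these definitions, the paper's retention criteria (difficulty between 0.20 and 0.80, discrimination of 0.20 or greater) can be checked item by item against the returned arrays.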
Authors: Ronald P Lefebvre; David H Peterson; Mitchell Haas; Richard G Gillette; Charles W Novak; Janet Tapper; John P Muench Journal: J Chiropr Educ Date: 2011