Boris Volkov (1), Goldie MacDonald (2), Dionisio Herrera (3), Donna Jones (2), Mahomed Patel (4). 1. Centers for Disease Control and Prevention and Oak Ridge Institute for Science and Education. 2. Centers for Disease Control and Prevention. 3. Training Programs in Epidemiology and Public Health Interventions Network. 4. Australian National University.
Abstract
BACKGROUND: Evaluations of training programs are often limited, with many focusing on aspects that are easy to measure (e.g., trainee reactions) without addressing important outcomes of training, such as how trainees applied their new knowledge, skills, and attitudes. Numerous evaluations fail to measure training's effect on job performance because few effective methods are available to do so. Evaluating multisite training programs that vary considerably in structure and implementation from one site to another is particularly difficult. PURPOSE: NA. SETTING: NA. INTERVENTION: NA. RESEARCH DESIGN: We devised a consensus expert-review method to evaluate the quality of conference abstracts submitted by participants in Field Epidemiology Training Programs. This approach can provide useful information on how well trainees apply the knowledge and skills gained in training, complementing data obtained from other sources and methods. The method is practical, minimally intrusive, and resource-efficient, and it may prove useful for evaluation practice in diverse fields that require training. DATA COLLECTION AND ANALYSIS: NA. FINDINGS: NA.