Forough Farrokhyar1, Deepak Dath2, Nalin Amin2, Mohit Bhandari3, Stephen Kelly2, Ann Kolkin2, Catherine Gill Pottruff2, Susan Reid2. 1. Department of Surgery, McMaster University, 39 Charlton Ave. E., Hamilton, ON L8N 1Y3, Canada; Department of Clinical Epidemiology and Biostatistics, McMaster University, 39 Charlton Ave. E., Hamilton, ON L8N 1Y3, Canada. Electronic address: farrokh@mcmaster.ca. 2. Department of Surgery, McMaster University, 39 Charlton Ave. E., Hamilton, ON L8N 1Y3, Canada. 3. Department of Surgery, McMaster University, 39 Charlton Ave. E., Hamilton, ON L8N 1Y3, Canada; Department of Clinical Epidemiology and Biostatistics, McMaster University, 39 Charlton Ave. E., Hamilton, ON L8N 1Y3, Canada.
Abstract
BACKGROUND: There are currently no validated guidelines for assessing the quality of the content and delivery style of scientific podium surgical presentations. We developed a simple, short, and reliable instrument to objectively assess the overall quality of scientific podium presentations.

METHODS: A simple and efficient rating instrument was developed to assess the scientific content and presentation style/skills of surgical residents' presentations from 1996 to 2013. Absolute and consistency agreement for the different sections of the instrument was determined and assessed over time, by project stage, and by study design. Intraclass correlation coefficients with 95% confidence intervals were calculated and reported using a mixed-effects model.

RESULTS: Inter-rater reliability for both absolute and consistency agreement was substantial for the total score and for each of the 3 sections of the instrument. For the 2012 and 2013 institutional research presentations, respectively, absolute agreement for the overall rating was .87 (.63 to .98) and .78 (.50 to .95), and consistency agreement was .90 (.70 to .99) and .87 (.67 to .97). Rater agreement across project stages and study designs ranged from .70 to .81 and was consistent over the years. Consistency agreement in rating the presentations was .77 for both faculty and resident raters.

CONCLUSIONS: The Standardized Methodological Assessment of Research Presentations (SHARP) instrument rates both the scientific quality of the research and the style of the delivered presentation. It is highly reliable in scoring the quality of all study designs, regardless of project stage. We recommend that researchers focus on presenting the key concepts and significant elements of their evidence using visually simple slides, delivered in a professionally engaging manner, for effective communication of their research to the audience.