Analysis of reporting quality for oral presentations of observational studies at 19th National Surgical Congress: Proposal for a national evaluation system.

Mustafa Hasbahçeci1, Fatih Başak2, Aylin Acar2, Abdullah Şişik2.   

Abstract

OBJECTIVE: To assess the quality of oral presentations presented at the 19th National Surgical Congress using a national evaluation system, with respect to the applicability of the evaluation systems and the consistency between systems and reviewers.
MATERIAL AND METHODS: Fifty randomly selected observational studies, blinded for author and institute information, were evaluated by two reviewers using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement, the Timmer Score, and the National Evaluation System. Abstract scores, evaluation periods, and compatibility between reviewers were compared for each evaluation system. Abstract scores from the three evaluation systems were regarded as the main outcome. The Wilcoxon matched-pairs signed-rank and Friedman tests were used to compare scores and evaluation times, kappa analysis to assess compatibility between reviewers, and Spearman correlation to analyze reviewers' scores across pairs of evaluation systems.
RESULTS: There was no significant difference between abstract scores across the systems (p>0.05). A significant difference in evaluation period between reviewers was detected for each system (p<0.05). Compatibility between reviewers was highest for the Timmer Score (moderate agreement, κ=0.523), while compatibility for STROBE and the National Evaluation System was regarded as acceptable (κ=0.394 and κ=0.354, respectively). Assessment of reviewers' scores for pairs of evaluation systems showed significant positive correlations (p<0.05).
CONCLUSION: The National Evaluation System is an appropriate method for evaluating conference abstracts: its consistency between referees is comparable to that of current international evaluation systems, and it is easy to apply with regard to evaluation period.
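The inter-reviewer compatibility values reported above (κ) are of the kind produced by Cohen's kappa, which corrects raw agreement between two raters for agreement expected by chance. A minimal sketch in Python; the reviewer labels and ratings below are hypothetical illustrations, not data from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels on the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the raters labeled independently at their marginal rates
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two reviewers rating 10 abstracts as accept/reject
a = ["accept", "accept", "reject", "accept", "reject",
     "accept", "reject", "reject", "accept", "accept"]
b = ["accept", "reject", "reject", "accept", "reject",
     "accept", "accept", "reject", "accept", "accept"]
print(round(cohens_kappa(a, b), 3))  # 8/10 raw agreement corrects to κ ≈ 0.583
```

By a common convention, κ between 0.41 and 0.60 (as for the Timmer Score here) indicates moderate agreement, and 0.21 to 0.40 (STROBE and the National Evaluation System) fair agreement.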

Keywords:  Abstract; congress; reporting quality

Year:  2016        PMID: 28149124      PMCID: PMC5245719          DOI: 10.5152/UCD.2016.3195

Source DB:  PubMed          Journal:  Ulus Cerrahi Derg        ISSN: 1300-0705


Related articles: 13 in total

1.  Reproducibility of peer review in clinical neuroscience. Is agreement between reviewers any greater than would be expected by chance alone?

Authors:  P M Rothwell; C N Martyn
Journal:  Brain       Date:  2000-09       Impact factor: 13.501

2.  Assessment of abstracts submitted to the annual scientific meeting of the Royal Australian and New Zealand College of Radiologists.

Authors:  S Bydder; K Marion; M Taylor; J Semmens
Journal:  Australas Radiol       Date:  2006-08

3.  The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies.

Authors:  Erik von Elm; Douglas G Altman; Matthias Egger; Stuart J Pocock; Peter C Gøtzsche; Jan P Vandenbroucke
Journal:  J Clin Epidemiol       Date:  2008-04       Impact factor: 6.437

4.  Evaluation of reporting quality of the 2010 and 2012 National Surgical Congress oral presentations by CONSORT, STROBE and Timmer criteria.

Authors:  Mustafa Hasbahçeci; Fatih Başak; Ömer Uysal
Journal:  Ulus Cerrahi Derg       Date:  2014-09-01

5.  Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): explanation and elaboration.

Authors:  Jan P Vandenbroucke; Erik von Elm; Douglas G Altman; Peter C Gøtzsche; Cynthia D Mulrow; Stuart J Pocock; Charles Poole; James J Schlesselman; Matthias Egger
Journal:  Int J Surg       Date:  2014-07-18       Impact factor: 6.071

6.  Assessment of reporting quality of conference abstracts in sports injury prevention according to CONSORT and STROBE criteria and their subsequent publication rate as full papers.

Authors:  Uzung Yoon; Karsten Knobloch
Journal:  BMC Med Res Methodol       Date:  2012-04-11       Impact factor: 4.615

7.  Quality of reporting according to the CONSORT, STROBE and Timmer instrument at the American Burn Association (ABA) annual meetings 2000 and 2008.

Authors:  Karsten Knobloch; Uzung Yoon; Hans O Rennekampff; Peter M Vogt
Journal:  BMC Med Res Methodol       Date:  2011-11-29       Impact factor: 4.615

8.  Reviewer agreement trends from four years of electronic submissions of conference abstracts.

Authors:  Brian H Rowe; Trevor L Strome; Carol Spooner; Sandra Blitz; Eric Grafstein; Andrew Worster
Journal:  BMC Med Res Methodol       Date:  2006-03-19       Impact factor: 4.615

9.  Inter-rater agreement in the scoring of abstracts submitted to a primary care research conference.

Authors:  Alan A Montgomery; Anna Graham; Philip H Evans; Tom Fahey
Journal:  BMC Health Serv Res       Date:  2002-03-26       Impact factor: 2.655

10.  Development and evaluation of a quality score for abstracts.

Authors:  Antje Timmer; Lloyd R Sutherland; Robert J Hilsden
Journal:  BMC Med Res Methodol       Date:  2003-02-11       Impact factor: 4.615

