Andrew Murphy, Ernest Ekpo, Thomas Steffens, Michael J Neep.
Abstract
INTRODUCTION: Radiographer image evaluation methods such as the preliminary image evaluation (PIE), a formal comment describing radiographers' findings in radiological images, are embedded in the contemporary radiographer role within Australia. However, perceptions surrounding both the capacity for Australian radiographers to adopt PIE and the barriers to its implementation are highly variable and seldom evidence-based. This paper systematically reviews the literature to examine radiographic image interpretation by Australian radiographers and the barriers to implementation.
Keywords: General radiography; image interpretation; radiographer commenting; radiography; systematic review
Year: 2019 PMID: 31545009 PMCID: PMC6920699 DOI: 10.1002/jmrs.356
Source DB: PubMed Journal: J Med Radiat Sci ISSN: 2051-3895
Radiographic interpretation terms as defined in the literature.
| Term | Definition |
|---|---|
| Red dot system | Red sticker affixed to a radiographic film to flag a potential abnormality |
| Radiographer abnormality detection system (RADS) | A ‘flagging’ system in which the radiographer digitally affixes an indicator to a radiographic image to indicate a potential abnormality |
| Preliminary clinical evaluation (PCE), preliminary image evaluation (PIE), radiographer commenting | A brief written comment by a radiographer to communicate what they believe could be an abnormality. Not a definitive radiological report |
| Radiographer report | A definitive radiological report performed by a trained radiographer |
| Radiologist report | A definitive radiological report |
Figure 1. Preferred Reporting Items for Systematic Reviews and Meta‐Analyses (PRISMA) flow chart.
Study characteristics, results, and critical appraisal score of the reviewed literature
| Authors | Study type and aim | Participants | Setting & design | Result | Critical appraisal score (out of 15) |
|---|---|---|---|---|---|
| Orames, 1997 | Cohort study. Explore general X‐ray interpretation ability and accuracy of radiographers compared with ED physicians | Radiographers, differing experience; ED physicians, differing experience | Single site (public hospital). Radiographers and ED physicians completed a survey over a 1‐month period for each suspected abnormal radiograph; 541 emergency radiographs; compared to single radiology report | | 7 |
| Jane, Hall & Egan, 1999 | Cohort study. Investigate radiographers' ability to detect normal and abnormal radiographs and to provide a provisional diagnosis in conjunction with the RDS | Radiographers, differing experience | Single site (public hospital). Conducted over three 1‐month periods in 1992, 1994 and 1997; radiographers recorded normal, abnormal or ‘don't know’ on a form for each emergency radiological investigation, with the 1997 trial including a written ‘provisional diagnosis’; 940 examinations (including CT and ultrasound); compared to single radiology report | | 6 |
| Younger & Smith, 2002 | Cohort study. Measure the ability of radiographers to provide a provisional diagnosis alongside the RDS using an opinion form | 26 radiographers, differing experience | Single site (public hospital). Participants filled out a pre‐designed opinion form indicating radiological impression; 820 emergency examinations over a 3‐month period; compared to single radiology report | | 10 |
| Cook, Oliver, & Ramsay, 2004 | Cohort study. Investigate the accuracy and effectiveness of two radiographers in reporting appendicular MSK radiographs in the adult trauma setting | 2 radiographers (8 and 14 years of experience), one undertaking a master's degree in radiographic image interpretation | Single site (public hospital). Both participants undertook image interpretation training at a departmental level; participants shadow‐reported 527 MSK appendicular radiographs referred from ED; compared to single radiology report | | 7 |
| Smith, Traise, & Cook, 2009 | Experimental cohort study. Evaluate the accuracy of radiographers in interpreting MSK radiographs and assess the impact of a short continuous education model on accuracy | 16 radiographers, differing experience | Multisite. 400 abnormal radiographs from the axial and appendicular skeleton, categorised into three grades of complexity and grouped into test banks of 25; participants interpreted 25 images using a pre‐designed opinion form to record opinion, observation and an open comment; following an intervention of self‐guided education, the cohort was retested after 4 months using the same 25 images; ratings of opinions vs the radiologist report were subjectively assessed for clinical significance; compared to single radiology report | Opinion, observation and open comment each reported pre‐ and post‐intervention | 8 |
| Hardy, Poulos, Emanuel, & Reed, 2010 | Multisite survey. Investigate advanced practice carried out by radiographers in NSW | 69 medical imaging supervisors (response rate 60%) | Multisite survey (public and private practices). Participants completed a questionnaire exploring the use of advanced practice (triage systems, formal or informal reporting, research, cannulation, IV contrast administration) in their respective departments | 39.7% of medical imaging directors stated their sites utilised a RADS. Barrier identified: a perception that radiographers do not possess the relevant knowledge for radiographic image interpretation | 7 |
| Brown & Leschke, 2012 | Retrospective analysis. Evaluate the accuracy and clinical value of the RDS | Non‐voluntary cohort | Single site (public hospital). Retrospective audit of 3638 appendicular MSK radiographs from ED over a 4‐month period, focusing on fracture detection; images assessed for the presence of a ‘red dot’, with images without a red dot found to be obvious fractures considered true positives; compared to single radiology report | Identifying appendicular fractures; subgroup analysis of non‐displaced fractures (<1 mm displacement) | 3 |
| McConnell, Devaney, Gordon, Goodwin, Strahan, & Baird, 2012 | Experimental cohort study. Investigate the effect of a pilot educational intervention on radiographers' ability to describe radiographic abnormalities | 10 radiographers, differing experience | Multisite (public hospital). 102 adult appendicular MSK trauma radiographs; test bank reflected population injury incidence as well as incidence of body region, with diagnoses validated by four radiologists; participants filled out a worksheet with tick‐box and comment components, plus a separate field to indicate level of certainty; images assessed before, immediately following, and 8–10 weeks after a tailored education programme | Assessed pre‐ and immediately post‐education programme, and 8–10 weeks following the education programme | 12 |
| McConnell, Devaney, & Gordon, 2013 | Experimental cohort study. Investigate the effect of an educational programme on radiographers' ability to describe radiographic abnormalities | 10 radiographers, 2 with postgraduate image interpretation education; ED physicians, differing experience | Multisite (public hospital). 655 MSK trauma radiographs obtained from multiple EDs; radiographers underwent an education programme on interpretation of MSK trauma radiographs, then interpreted images at a real‐time workload in their respective EDs, filling out a worksheet with tick‐box and comment components; compared to ED physician notes and radiologist reports | No statistically significant difference between radiographers and ED physicians (ED physicians: ACC 89.5%) | 9 |
| Neep, Steffens, Owen, & McPhail, 2014 | Multisite survey. Investigate radiographer participation in RADS and the perception of radiographer image interpretation in the emergency department; explore perceived barriers, benefits and enablers to radiographer commenting | 73 radiographers, differing experience (response rate 68%) | Multisite (public hospitals). Cross‐sectional multisite survey consisting of closed‐ and open‐ended questions | Barriers identified: education access; lack of time; low radiographer confidence; inconsistency of guidelines; radiographer resistance; radiologist perception that this is a threat to their role. Varying levels of RADS participation identified | 10 |
| Neep, Steffens, Owen, & McPhail, 2014 | Multisite survey. Investigate radiographers' perception of both their readiness to participate in a radiographer commenting system and their preferred image interpretation education method | 73 radiographers, differing experience (response rate 68%) | Multisite (public hospital). Cross‐sectional multisite survey of four major metropolitan hospitals; questionnaire consisted of four sections covering demographics, self‐perceived confidence in interpreting radiographs, perceived level of accuracy in interpreting radiographs, and preference for educational delivery of either long term (eight 90‐min sessions) or short term (2‐day intensive course) | Barrier identified: radiographers are not confident in describing radiographic MSK abnormalities | 10 |
| Page, Bernoth, & Davidson, 2014 | Interpretative phenomenological study. Identify the factors hindering or promoting the progression of advanced practice in the Australian radiography profession | 7 participants with a radiography or radiation therapy background, working in senior positions (response rate 70%) | Multisite (public hospitals). 25 questions were sent to participants for examination before an in‐depth verbal interview; questions explored themes relating to barriers and enablers of advanced practice in Australia | Barriers identified: radiologist perception that this is a threat to their role; education; lack of direction from professional bodies; lack of engagement with other professions | 8 |
| Squibb, Bull, Smith, & Dalton, 2015 | Exploratory interpretive study. Explore rural Australian radiographers' perspectives on disclosing opinions regarding radiographic imaging | 185 radiographers from rural workplaces across Australia (response rate 32.4%) | Multisite (rural centres). Two‐phase study: a multisite postal survey distributed to rural NSW, WA and TAS collecting descriptive data on respondent demographics, and nine face‐to‐face or phone interviews exploring demographics and attitudes regarding radiographic image interpretation | Barrier identified: lack of understanding around the legality of disclosing opinions | 9 |
| Squibb, Smith, Dalton, & Bull, 2016 | Two‐phase multisite survey with an interview component. Investigate how rural radiographers communicate and disclose their radiographic opinion to referring physicians in their respective departments | 185 radiographers from rural workplaces across Australia (response rate 32.4%) | Multisite (rural centres). Two‐phase study: a multisite postal survey distributed to rural NSW, WA and TAS collecting descriptive data on respondent demographics, and nine semistructured interviews to obtain qualitative data on interprofessional communication | Barriers identified: lack of formal education; lack of communication skills to convey abnormal findings; interprofessional boundaries due to a historical hierarchy | 9 |
| McConnell & Baird, 2017 | Experimental cohort study. Measure and compare the ability of final‐year medical students and radiographers in interpreting MSK trauma plain radiographic images | 16 radiographers with at least 2 years of experience working in an ED setting; 16 final‐year medical students | Volunteer participants. 650 MSK trauma radiographs selected for assessment, with 209 radiographs chosen in total to reflect injury prevalence (adult and paediatric), validated by four radiologists; images sent to participants via a USB device; responses provided via an electronic worksheet | | 15 |
| Neep, Steffens, Riley, Eastgate, & McPhail, 2017 | Experimental cohort study. Develop and examine a valid and reliable test to assess radiographers' ability to interpret trauma radiographs | 41 radiographers, differing experience (minimum 12 months) | Volunteer participants. Two‐phase study: a typical anatomical region case‐mix of trauma radiographs was established from 14,159 cases and developed into an image interpretation examination, followed by a prospective investigation of its validity and reliability | Participant confidence and test score were positively associated (coefficient = 1.52) | 15 |
| Murphy & Neep, 2018 | Multisite survey. Investigate the use of RADS in QLD public hospitals | 25 medical imaging directors (response rate 89%) | Multisite (public hospitals). Cross‐sectional web‐based questionnaire exploring hospital demographics, the use of RADS in their respective departments and potential factors preventing implementation | 16% of sites had a RADS in place. Barriers identified: education; lack of resources; perception of inadequate staff education | 9 |
| Williams, Baird, Pearce, & Schneider, 2019 | Cohort study. Investigate radiographer performance in interpreting appendicular skeletal radiographs after the intervention of two short education modules | 8 radiographers, differing experience | Volunteer participants. Participants were tested via a random assortment of 25 images (normal to abnormal ratio 25:75) before and after two learning modules, one covering the upper limb and shoulder girdle and one covering the lower limb and pelvic girdle; participants were tested again 6 months post‐intervention to assess retention | Pre‐test module 1 to post‐test module 2 results; 6 months post‐intervention | 10 |
| Neep, Steffens, Eastgate, & McPhail, 2019 | Randomised controlled trial. Compare two methods of image interpretation education and their effects on radiographers' ability to interpret radiographs | 42 radiographers with no formal education in image interpretation (minimum 12 months of experience) | Volunteer participants. 60 images from a validated test bank (mix of cases, injury prevalence accounted for), assessed by two radiologists and one reporting radiographer; participants completed a baseline image interpretation examination before randomisation into two cohorts: intensive education (2‐day, 13.5‐h delivery) or non‐intensive education (9 weeks, 90 min per week); participants were tested again 1 week and 12 weeks post‐intervention to assess retention | Median test scores at baseline, 1 week and 12 weeks post‐intervention | 15 |
ACC, accuracy; CT, computed tomography; ED, emergency department; IV, intravenous; MSK, musculoskeletal; NPV, negative predictive value; NSW, New South Wales; PPV, positive predictive value; QLD, Queensland; RADS, radiographer abnormality detection system; RDS, red dot system; ROC, receiver operating characteristic; SENS, sensitivity; SPEC, specificity; TAS, Tasmania; USB, universal serial bus; WA, Western Australia.