
Image rejects in general direct digital radiography.

Bjørn Hofmann1, Tine Blomberg Rosanowsky2, Camilla Jensen2, Kenneth Hong Ching Wah2.   

Abstract

BACKGROUND: The number of rejected images is an indicator of image quality and unnecessary imaging at a radiology department. Image reject analysis was frequent in the film era, but comparably few and small studies have been published after converting to digital radiography. One reason may be a belief that rejects have been eliminated with digitalization.
PURPOSE: To measure the extent of image deletion in direct digital radiography (DR), in order to assess the rates of rejects and unnecessary imaging, and to analyze the reasons for deletions with the aim of improving radiological services.
MATERIAL AND METHODS: All exposed images at two direct digital laboratories at a hospital in Norway were reviewed in January 2014. Type of examination, number of exposed images, and number of deleted images were registered. Each deleted image was analyzed separately and the reason for deleting the image was recorded.
RESULTS: Out of 5417 exposed images, 596 were deleted, giving a deletion rate of 11%. A total of 51.3% were deleted due to positioning errors and 31.0% due to error in centering. The examinations with the highest percentage of deleted images were the knee, hip, and ankle, 20.6%, 18.5%, and 13.8% respectively.
CONCLUSION: The reject rate is at least as high as the deletion rate and is comparable with previous film-based imaging systems. The reasons for rejection are quite different in digital systems. This falsifies the hypothesis that digitalization would eliminate rejects. A deleted image does not contribute to diagnostics and is therefore an unnecessary image. Hence, the high rates of deleted images have implications for management, training, and education, as well as for quality.


Keywords:  Digital radiography; ethics; radiation safety; technology assessments

Year:  2015        PMID: 26500784      PMCID: PMC4601124          DOI: 10.1177/2058460115604339

Source DB:  PubMed          Journal:  Acta Radiol Open


Introduction

Rejects, deletions, and subsequent retakes of diagnostic X-ray images pose professional and ethical challenges within radiological imaging (1); they occupy processing and personnel resources unnecessarily (2–5), indicate suboptimal quality management (6–8), and expose patients to unnecessary ionizing radiation and added inconvenience (9). Traditionally, reject/deletion/retake rates for film-based departments have been documented in the range of 10–15% (8,10–17), and their main cause has been attributed to incorrect exposures due to the limited dynamic range of screen/film systems. Accordingly, the digitalization of medical imaging raised expectations that the problem of image rejects, deletions, and retakes would disappear (5–7,17,18). A series of research papers have reported reject/deletion/retake rates in digital departments of around 5% (6–8,15,17,19,20), and some even at the same rate as with film systems (4). This raises the questions of whether reject rates really are as high as with film systems and why the problem did not vanish with the digital revolution, as presumed. Image digitalization significantly changed the causes of rejects: while rejects in screen/film systems were mostly exposure-related, in digital systems they are now mainly reported to relate to patient positioning errors. Although there are some studies reporting reject rates for computed radiography (CR) systems (6,8,17,19,21–23), there are few studies for direct digital radiography (DR) systems (4). Although one would expect the reject rates of DR systems to be below film reject rates, initial studies indicate that this is not so (4,22,24,25). More studies are needed in order to assess whether the high reject rates with DR are only incidental findings or represent a real challenge in digital imaging. The reject rate in this study was defined as images deleted on modality-specific workstations or in the PACS.
Accordingly, the research questions of this article are: How high is the deletion rate for DR systems, and what are the reasons for deletions?

Material and Methods

This study was registered and conducted as a Quality Assurance Project of the hospital and, as such, is not subject to informed patient consent under the Norwegian Patient Rights Act. Employees at the Radiology Department were informed about the study in advance. Access to images and systems was supervised by the Radiology Department, and a confidentiality statement covered the data collection. Data were collected at two laboratories for general X-ray examinations at the radiological department of a local public hospital in the central southern part of Norway. The department performs about 25,000 general X-ray examinations per year. The two DR laboratories are part of the same department, and the department’s radiographers are shared between the two laboratories. Data included all exposed images during January 2014. A registration form was developed on the basis of the existing literature (5,10,24–26), with some adjustments resulting from a pilot study. The registered categories were as follows:

- Positioning error (other than centering errors)
- Incorrect collimation
- Centering error
- Wrong exposure
- Artifacts
- Other reasons

Centering errors were differentiated from other positioning errors in order to be able to tailor education and improvement strategies. A “centering error” occurs when the object of interest is not in the center of the image, while other errors of position, such as rotation errors, are categorized as “positioning error”. Images can be deleted either at the workstation of the modality or in PACS. Images deleted on the workstations can be counted directly, as these are tagged. However, in order to collect data on additional deletions in PACS, the number of images in PACS and on the workstations was compared for each examination.
In this study, image rejects are defined as images that do not contribute diagnostic information with regard to the relevant clinical indication due to poor image quality (5). They are measured as deleted images, since a deleted image is per se not used for diagnostic purposes and thus has no diagnostic value. Accordingly, a deleted image is defined as an image that is deleted from the data registry either at the workstation of the modality or from the PACS (after being transferred from the workstation). A more detailed description of the relationship between deleted images, image rejects, image retakes, and unnecessary imaging can be found in the Appendix. Data collection was performed in the evening in order not to influence the workflow or the deletion rate. Deleted images were categorized by two persons, or by three persons when there was doubt. Descriptive statistics were computed with Microsoft Excel 2010 to calculate the deletion rate and confidence intervals. A detailed description of the X-ray equipment and PACS is given in the Appendix.

Results

In total, 1911 examinations with 5417 images were registered during January 2014. Of these, 596 images were deleted during this period. Accordingly, the deletion rate was 11.0% (95% CI, 10.2–11.8). There were 24 different types of examinations. Table 1 shows the number of examinations and deletions for the 10 most frequent examinations.
Table 1.

The number of images and deletions for the 10 most frequent types of examinations.

Examination      Images deleted (%)           Images (n)   Deleted images (n)
Knee             20.6% (95% CI, 17.3–23.9)    591          122
Hip              18.5%                        287          53
Ankle            13.8%                        507          70
Wrist            12.4% (95% CI, 9.7–15.1)     555          69
Columna          11.2%                        483          54
Shoulder         9.4%                         445          42
Pelvis and hip   8.2%                         452          37
Thorax           6.9% (95% CI, 4.9–8.9)       622          43
Foot             6.2%                         324          20
Hand             3.6%                         416          15
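The overall deletion rate and its confidence interval can be reproduced from the raw counts. As a minimal sketch, assuming a normal-approximation (Wald) interval for a proportion (the article does not state which interval method was used in Excel 2010), the reported figures come out as follows:

```python
from math import sqrt

def deletion_rate_ci(deleted, total, z=1.96):
    """Deletion rate with a normal-approximation (Wald) 95% CI."""
    p = deleted / total
    se = sqrt(p * (1 - p) / total)  # standard error of a proportion
    return p, p - z * se, p + z * se

# Overall figures reported in the article: 596 of 5417 images deleted.
p, lo, hi = deletion_rate_ci(596, 5417)
print(f"{p:.1%} (95% CI, {lo:.1%}-{hi:.1%})")  # → 11.0% (95% CI, 10.2%-11.8%)
```

This matches the overall 11.0% (95% CI, 10.2–11.8) reported above; per-examination intervals may differ slightly in the last digit depending on the interval method and rounding used.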
The main reasons for deletion were positioning errors (51.3%) and centering errors (31.0%). The identified reasons for deletion are displayed in Table 2; Table 3 shows the distribution of identified reasons for deletion across the various examination types.
Table 2.

Distribution of identified reasons for deletion.

Category of reason for deletion   Percentage
Positioning error                 51.3%
Centering error                   31.0%
Other*                            8.6%
Incorrect collimation             6.4%
Artefacts                         2.2%
Wrong exposure                    0.5%

*It was not possible to decide why the image was deleted.

Table 3.

Distribution of identified reasons for deletion on various examination types.

Examination type   Positioning error   Centering error
Knee               77.9%               9.0%
Hip                3.8%                81.1%
Ankle              72.9%               12.9%
Wrist              91.3%               5.8%
Columna            27.8%               59.3%
Shoulder           59.5%               19.0%
Pelvis and hip     5.4%                62.2%
Thorax             27.9%               37.2%
Foot               35.0%               35.0%
Hand               60.0%               13.3%

Discussion

Our results show a deletion rate that is quite high compared with international studies on CR systems (6–8,11,17,19,21), but very much in line with existing Norwegian studies. Leffmann et al. found a deletion rate of 13.1% for wrist images with a CR system (27), and Andersen et al. found a reject rate of 17% for wrist images with DR (4), while we found a deletion rate of 12.4% (95% CI, 9.7–15.1). In line with both Leffmann’s and Andersen’s findings, our study shows that the main reason for deletion of wrist images was positioning errors. Leffmann’s study does not report whether deletions in PACS are included; if they are not, as the article indicates, their real rate may be significantly higher. The same goes for Andersen’s study, which does not include deletions in PACS; therefore, the real reject rate may be higher (4). Our overall results are also in line with the overall deletion rate of 12.5% found in 2009 at one of the laboratories included in our study (25), and the 12% found in the study by Andersen and colleagues (4). The findings show that the deletion rate is on a level with the retake rate of film systems, but that the reasons for deletion have shifted from incorrect exposure to positioning error. This may indicate poorer quality of work among radiographers. There are some discrepancies between the reasons for deleting images found in our study and in Andersen’s. For example, Andersen et al. found an overall positioning-error rate of 77%, while our results showed 82.3% (centering and other positioning errors combined) (4). This may of course be due to real differences between the sites, but it can also be due to differences in the interpretation of the categories and in the mode of registration. In Andersen et al.’s study the radiographers registered the reason for deletion themselves, while we registered a retrospective interpretation of the radiographers’ reasons for deleting the images.
This weakness in our study is relevant only for the interpretation of the reasons for deletions, not for the deletion rate, where our study is more complete than comparable studies (4). Hence, there is a trade-off between the validity of the results on the reject rate and on the reasons for rejects. The categories of reasons for deletion are quite coarse in our study; radiographers may have more subtle reasons for deleting, which cannot be identified by the study. However, the pilot study showed that a more detailed list of reasons was not feasible with the interpretative method chosen. Nevertheless, our categories correspond well with those of other studies. In addition to registering the type of examination, it would be valuable to have information on the projections of the deleted images. This study has not measured unnecessary imaging, but only how large a proportion of the images was deleted. However, a deleted image has no diagnostic value, as it is per se not used for diagnostic purposes; it is therefore unnecessary. The number of deleted digital images will therefore be an underestimate of image rejects, of retakes, and of unnecessary imaging, simply because many original unused images are not deleted. Nevertheless, the number of deleted images provides a useful estimate of the lowest possible rate of unnecessary imaging: if the number of deleted images is high, the number of unnecessary images is alarming. Fig. 1 illustrates the relationship between the numbers of rejects, retakes, and unnecessary images. There are of course many reasons why images are not deleted: abundant storage capacity; forgetting to delete them; a belief that they may be of some value in the future; the old image may in the end turn out to be better than the new one; time pressure; or because deleting too many images would give the impression of poor-quality work.
Fig. 1.

The relationship between unnecessary images, retakes, rejects, and deleted images.

In conclusion, we find a deletion rate of 11%. This indicates that the reject rate and the retake rate, as well as the rate of unnecessary images, are at least 11%. We found deletion rates comparable with the reject rates of previous film-based imaging systems, but the reasons for rejection are different. This falsifies the hypothesis that rejects and retakes would be abolished with the digitalization of radiography. For some examination types the deletion rate is over 20%, and the main reasons for deletion are positioning and centering errors (together 82.3%). Monitoring unnecessary images is highly relevant for verifying and improving quality in modern radiographic imaging. It is of great importance for management, training, education, and quality improvement.
References (22 in total; first 10 shown)

1.  Comparative reject analysis in conventional film-screen and digital storage phosphor radiography.

Authors:  S Peer; R Peer; S M Giacomuzzi; W Jaschke
Journal:  Radiat Prot Dosimetry       Date:  2001       Impact factor: 0.972

2.  The X Raying of America.

Authors:  J C Villforth
Journal:  FDA Consum       Date:  1979 Dec-1980 Jan

3.  Reject analysis in direct digital radiography.

Authors:  Eivind Richter Andersen; Jannike Jorde; Nadia Taoussi; Sadia Halima Yaqoob; Bente Konst; Therese Seierstad
Journal:  Acta Radiol       Date:  2012-01-27       Impact factor: 1.990

4.  An audit of rejected repeated x-ray films as a quality assurance element in a radiology department.

Authors:  K C Eze; N Omodia; B Okegbunam; T Adewonyi; C C Nzotta
Journal:  Niger J Clin Pract       Date:  2008-12       Impact factor: 0.968

5.  Image retake analysis in digital radiography using DICOM header information.

Authors:  C Prieto; E Vano; J I Ten; J M Fernandez; A I Iñiguez; N Arevalo; A Litcheva; E Crespo; A Floriano; D Martinez
Journal:  J Digit Imaging       Date:  2008-07-01       Impact factor: 4.056

6.  Digital radiography reject analysis: data collection methodology, results, and recommendations from an in-depth investigation at two hospitals.

Authors:  David H Foos; W James Sehnert; Bruce Reiner; Eliot L Siegel; Arthur Segal; David L Waldman
Journal:  J Digit Imaging       Date:  2008-04-30       Impact factor: 4.056

7.  Films reject analysis for conventional radiography in Iranian main hospitals.

Authors:  R Roohi Shalemaei
Journal:  Radiat Prot Dosimetry       Date:  2011-07-14       Impact factor: 0.972

8.  Repeat exposures: our little secret.

Authors:  W E McKinney
Journal:  Radiol Technol       Date:  1994 May-Jun

9.  Digital versus screen-film mammography: a retrospective comparison in a population-based screening program.

Authors:  Boel Heddson; Katarina Rönnow; Magnus Olsson; David Miller
Journal:  Eur J Radiol       Date:  2007-03-26       Impact factor: 3.528

10.  Strategies for dose reduction in ordinary radiographic examinations using CR and DR (review).

Authors:  C E Willis
Journal:  Pediatr Radiol       Date:  2004-10
