Literature DB >> 34235059

Impact of viewing conditions on the performance assessment of different computer monitors used for dental diagnostics.

Thomas Hastie1, Sascha Venske-Parker1, Johan K M Aps2.   

Abstract

PURPOSE: This study aimed to assess the computer monitors used for analysis and interpretation of digital radiographs within the clinics of the Oral Health Centre of Western Australia.
MATERIALS AND METHODS: In total, 135 computer monitors (3 brands, 6 models) were assessed by analysing the same radiographic image of a combined 13-step aluminium step wedge and the Artinis CDDent 1.0® (Artinis Medical Systems B.V.®, Elst, the Netherlands) test object. The number of steps and cylindrical objects observed on each monitor was recorded along with the monitor's make, model, position relative to the researcher's eye level, and proximity to the nearest window. The number of window panels blocked by blinds, the outside weather conditions, and the number of ceiling lights over the surgical suite/cubicle were also recorded. MedCalc® version 19.2.1 (MedCalc Software Ltd®, Ostend, Belgium, https://www.medcalc.org; 2020) was used for statistical analyses (Kruskal-Wallis test and stepwise regression analysis). The level of significance was set at P<0.05.
RESULTS: Stepwise regression analysis showed that only the monitor brand and proximity of the monitor to a window had a significant impact on the monitor's performance (P<0.05). The Kruskal-Wallis test showed significant differences (P<0.05) in monitor performance for all variables investigated, except for the weather and the clinic in which the monitors were placed.
CONCLUSION: The vast performance variation present between computer monitors implies the need for a review of monitor selection, calibration, and viewing conditions.
Copyright © 2021 by Korean Academy of Oral and Maxillofacial Radiology.


Keywords:  Dental Digital Radiography; Diagnostic Imaging

Year:  2021        PMID: 34235059      PMCID: PMC8219454          DOI: 10.5624/isd.20200182

Source DB:  PubMed          Journal:  Imaging Sci Dent        ISSN: 2233-7822


Introduction

Radiographs are mainstream in day-to-day dental practice, serving as a fundamental tool for hard tissue analysis. For over a decade, there has been a growing shift from film to digital radiography, with it being no longer a “matter of if, rather a matter of when most dental practices will be using digital imaging.”1 Digital radiology requires the optimisation of several technical stages, from image acquisition to display, to maximise the diagnostic capacity of a captured image. Unlike film, the functions of acquisition and display in digital imaging are clearly separable. Degradation, at any point, results in the loss of electronic information, which can adversely affect image quality and diagnostic outcomes.1-3 The final stage in the digital radiographic process is evaluating the acquired data on a computer monitor. It is imperative that the monitor does not compromise the quality of the image and that it depicts the true clinical status of the area captured, which can only be achieved by the portrayal of subtle variations in low-contrast details. Failure to do so may compromise patient care through radiographic misinterpretation or misdiagnosis, resulting in a potential lack of necessary intervention or the prescription of inappropriate treatment.4-7 Unfortunately, despite its imperative role in the digital radiography chain, it is not uncommon for the quality of the monitor to be the ‘weakest link’ in the dental practice. Although the advent of digital radiology has seamlessly aided the transition of computers into dental practice, with roles already existing in appointment scheduling, billing, and patient charting, the addition of image retrieval, storage, manipulation, and display places yet another set of requirements on the monitors and their viewing conditions.1,6 Optimal viewing conditions are critical for recognising minute changes between normal and abnormal anatomy. 
It is well established that surrounding ambient light levels, in conjunction with the monitor luminance, affect the reliability of radiographic interpretation.6,8-13 The findings not only suggest that lower illuminance levels significantly improve diagnostic accuracy, but also that lower levels should be recommended for daily practice.6 At this point, it is important to highlight and appreciate the difference between medical and dental radiographic imaging. Rarely in medicine does the clinician responsible for the patient's course of care also capture, interpret, and report the findings of a radiograph; that job is left to the radiographer and radiologist. The opposite is common in dentistry. Whilst in medicine the radiologist optimises the environment with equipment ideal for maximising the diagnostic quality of a captured image, in dentistry this is rarely the case. In dentistry, radiographic analysis is commonly performed chair-side, in front of the patient and under less-than-ideal viewing conditions. The surgical suite is designed for clinical work requiring high levels of illuminance, and altering light levels whilst viewing radiographs is usually impossible or inconvenient.6,14 Monitor properties vary widely, from “off-the-shelf” monitors to high-resolution medical monitors. With the current velocity of technology turnover and advancement, considerable debate has raged over the effectiveness of low-cost, “off-the-shelf” computer monitors as alternatives to high-performance, high-cost, “medical-grade” monochrome and grey-scale monitors. 
To capture the market, dentally configured medical-grade monitors are available that have been engineered for the resolution and grey scale of dental radiographs.15-17 However, the literature conflicts on whether they benefit radiographic diagnostics over off-the-shelf monitors.2,4-6,9,15,16,18-21 An advantage of medical-grade monitors over their off-the-shelf counterparts is their capacity to adjust brightness levels to recommended standards and guidelines depending on ambient light (auto-calibration). This ensures that the inherent electronic information contained within the digital image is uniformly presented irrespective of the age and/or type of monitor, thereby maximising the diagnostic information presented. Unfortunately, not only are these standards not mandated worldwide, but they were also developed for medical, not dental, radiology. Minimal research has inquired into the benefits of adopting these medical guidelines in dental clinics. To the best of the authors' knowledge, in Australia there are no specifications for display monitors used for radiographic interpretation in dentistry. The closest document the authors could find, and had access to, was the Standards of Practice for Clinical Radiology published by the Royal Australian and New Zealand College of Radiologists.4,7,15,22-24 Intra- and inter-monitor diagnostic performance varies greatly, motivating the development of standard guidelines. Monitor settings (resolution, contrast, luminance) and other monitor-related factors (age, condition, grey-scale, reflection, noise, geometric distortion, and veiling glare) all influence the quality of the diagnostic information that can be extracted from a radiograph. Although some of these parameters can be adjusted directly to improve monitor performance, others cannot. The literature highlights the importance of employing standards for diagnostic radiology along with ongoing calibration efforts. 
The question is, however, how often is this applied?4,7,15,22,25 Although monitor performance is not the only factor that may affect the quality of a digital radiograph, it is a pivotal cog in the digital radiology process.22 The purpose of the current study was to assess the performance of all off-the-shelf computer monitors used for digital radiographic interpretation and analysis within the 7 clinical spaces at the Oral Health Centre of Western Australia (OHCWA) and to assess factors contributing to monitor performance. The following null hypotheses were put forward: 1) there would be no significant difference in performance between the computer monitors in the OHCWA clinics; 2) ambient light would have no significant impact on the performance of the computer monitors in the OHCWA clinics; 3) computer monitor height relative to the observer's eyeline would have no significant impact on the performance of the computer monitors in the OHCWA clinics. This study aimed to determine the quality and differences in image performance of the computer monitors used in the OHCWA clinics.

Materials and Methods

Approval to conduct research at the OHCWA and an ethics exemption were obtained from the Head of the University of Western Australia (UWA) Dental School and the UWA Human Research Ethics office, respectively (RA/4/20/5380). Two independent test objects, a 13-step aluminium step-wedge and a contrast-detail phantom (Artinis CDDent 1.0®, Artinis Medical Systems B.V.®, Elst, the Netherlands), were amalgamated into 1 test object. At the time of radiation exposure, the aluminium step-wedge was aligned on top of, and adjacent to, the long edge of the Artinis CDDent 1.0® (Fig. 1). The aluminium step-wedge provided thirteen 2 mm incremental steps (width 5 mm×depth 3 mm×height 26 mm). The Artinis CDDent 1.0® is an aluminium base with a total area of 40×30 mm, an effective area of 25×16 mm, and uniform 3-mm thickness, and has 120 cylindrical objects of differing depths (0.4–0.7 mm) in 10 exponential steps. The diameter of the objects also varied in 10 exponential steps from 0.1 to 1.0 mm. The combination of these 2 phantoms will from now on be referred to as “the test object.”
Fig. 1

A. A 13-step aluminium step-wedge. B. CDDent analyser by Artinis®. Test objects on top of the photostimulable phosphor storage plate seen from above (C) and seen from the side (D). E. Placement of a beam-limiting device over the test object and photostimulable phosphor storage plate for exposure. F. Final image of the test object, which was then projected onto the computer monitor for assessment.

The test object was placed on top of and in the centre of a size 4 previously unexposed ScanX Intraoral Phosphor Plate® (PSP) (Air Techniques®, Melville, NY, USA) (Fig. 1). A Planmeca ProX™ (Planmeca®, Helsinki, Finland) intraoral X-ray unit (70 kV max, 2.5 mm EqAl) with an accompanying tube house assembly and beam-limiting device (100 mm length with 60 mm diameter) generated the radiation exposure (0.2 s, 7 mA, 60 kV). The diameter of the beam-limiting device was such that it could be lowered over the top of the test object to <10 mm from the PSP. The X-ray unit was positioned to achieve an angle as close to 90° with the PSP as possible (Fig. 1). Planmeca Romexis® version 4.5.0.R (Planmeca®, Helsinki, Finland) imaging software was used to store and display the image of the captured radiograph. The radiograph was cropped within the software to obtain an image of only the test object, removing the irrelevant exposed areas of the PSP (Fig. 1). No alterations other than cropping were made to the image. Four monitor companies, with a total of 7 models, were incorporated into the present study (Table 1). Six models had their performance analysed, whereas 1 was used only in observer calibration exercises. With the exception of brand 7, a medical-grade Barco Eonis 24® (MDRC-2324) (Barco NV®, Kortrijk, Belgium), all monitors assessed were off-the-shelf. The Windows 10 Enterprise® (Microsoft®, Redmond, WA, USA) operating system was used with all monitors included in the present study. All monitors were tested “as found” in their native environment and at their current display resolution. No adjustments were made to monitor settings (contrast, brightness, magnification) or to their surroundings, other than a minor window blind adjustment to the nearest recordable condition.
Table 1

Monitor specifications (per manufacturer)* and period of use prior to assessment

*Monitors were assessed as found irrespective of manufacturers' specifications. Brand 1: LG 24M45HQ® (LG Electronics Inc®, Seoul, Korea); brand 2: LG 24M47VQ-P® (LG Electronics Inc®, Yeouido-dong, Seoul); brand 3: HP E243® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 4: HP Compaq LA1951g® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 5: HP Hstnd-2321-a (L1910)® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 6: Gigabyte P57W® (Gigabyte Technology®, Xiandian District, New Taipei City, Taiwan); brand 7: Barco Eonis 24® (MDRC-2324) (Barco NV®, Kortrijk, Belgium)35-41

Two researchers individually assessed the test object on 9 different monitors (7 LG 24M47VQ-P® (LG Electronics Inc®, Seoul, Korea) and 2 Barco Eonis 24® (MDRC-2324) (Barco NV®, Kortrijk, Belgium)). The number of cylindrical objects counted and the number of steps of the aluminium step-wedge observed were immediately entered into an Excel® version 16.40 (Microsoft®, Redmond, WA, USA) spreadsheet. This exercise was repeated 5 times, but not consecutively, and the results were then statistically compared with those obtained by an experienced dental and maxillofacial radiologist to calculate the inter- and intra-rater agreement. The inter-rater agreement was 98.8% and 96.3% for the cylindrical object count and step count, respectively. Combined agreement with the benchmark was 94.6% for the cylindrical object count and 96.4% for the step count. These results indicated reliable data collection. In total, 6 clinical environmental conditions were recorded on an Excel® version 16.40 spreadsheet for each computer monitor at the time of assessment. 
Each condition comprised several possible categories, each of which was assigned a numerical value (Table 2).
Table 2

Data collection and surrounding conditions key (the score represents the categorical value)

The conditions recorded included monitor make and model, the number of window panels blocked by blind(s), the monitor's proximity to a window, the number of lights over the surgical suite or cubicle, the monitor's position relative to the researcher's eye level in his or her natural head position, and the outside weather conditions. Windows and lights were only considered if they were incorporated within the boundaries (walls, doors and ceiling) of the suite/cubicle, or were within close proximity to the monitor such that a significant impact on performance was likely (Fig. 2). The external walls of the OHCWA comprised 94-122 cm glass panels. When glass panels are part of a surgical suite/cubicle wall, 4 panels are present in a 2-by-2 configuration (Fig. 2). This allowed the window blinds to be adjusted and recorded to the nearest number of panels they covered.
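This categorical coding could be sketched as a simple lookup table. The sketch below is illustrative only: the category names and scores are assumptions, since the actual key is defined in the paper's Table 2, which is not reproduced here.

```python
# Hypothetical coding key; the real categories and scores are defined in
# Table 2 of the study and are NOT reproduced here.
CODING = {
    "window proximity": {"no window": 0, "window nearby": 1},
    "monitor position": {"below eye level": 0, "at eye level": 1,
                         "above eye level": 2},
    "panels blocked": {str(n): n for n in range(5)},  # 0-4 panels covered
}

def encode(observation):
    """Map a dict of recorded conditions to their categorical scores."""
    return {cond: CODING[cond][value] for cond, value in observation.items()}

row = encode({"window proximity": "no window",
              "monitor position": "at eye level",
              "panels blocked": "3"})
print(row)  # -> {'window proximity': 0, 'monitor position': 1, 'panels blocked': 3}
```

Coding every condition numerically in this way makes the spreadsheet rows directly usable as categorical predictors in the regression and Kruskal-Wallis analyses described below.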
Fig. 2

Illustration of the clinics' window blind coverage as found upon inspection. Clockwise, these photos show no blocking of glass window panels, 1 glass window panel being blocked, 2 vertical glass window panels being blocked, 2 horizontal glass window panels being blocked, 3 blocked glass window panels, and all 4 glass window panels being blocked.

Monitor assessment via image analysis was completed either sitting or standing, depending on the location of the monitor and how it would be viewed during regular clinical practice. The investigators positioned themselves directly in line with the centre-most point of the monitor, often sitting on a Series 90 Dental Operator® chair (Arteil®, O'Connor, Western Australia, Australia) raised to its maximum height of 135 cm. If a Series 90 Dental Operator® chair was not available, the investigators adjusted whatever operator chair was present in the surgical suite/cubicle to a height of 135 cm. If clinicians would normally stand in front of the monitor, the assessment was done standing. The eye-to-screen distance was between 40 and 80 cm for all assessed monitors. The image was viewed using the “zoom-to-fit” option. Results were entered immediately into separate Excel® spreadsheets, and both investigators' spreadsheets were later combined into one for statistical comparison. Statistical analysis of the data was performed using MedCalc® version 19.2.1 (MedCalc Software Ltd, Ostend, Belgium; https://www.medcalc.org; 2020) statistical software. Descriptive statistics were generated to analyse the collected data further. Stepwise regression analysis was performed to determine which environmental variables had a significant influence on the measurements (the number of cylindrical objects counted and the number of steps of the aluminium step-wedge observed). The Kruskal-Wallis test was used to investigate the individual variables' impact on the cylindrical object counts, as the data were not normally distributed. The level of significance was set at P<0.05.
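As a concrete illustration of the nonparametric comparison used here, the Kruskal-Wallis H statistic is computed from the ranks of the pooled observations. The sketch below is a minimal pure-Python implementation (not the MedCalc® routine used in the study), and the cylinder-count groups it is run on are invented for illustration:

```python
from itertools import groupby

def midranks(values):
    """Return 1-based ranks for values, assigning midranks to ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over the run of equal values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mid = (i + j + 2) / 2.0  # average of rank positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = mid
        i = j + 1
    return ranks

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic with tie correction.

    `groups` is a list of lists of measurements (here, cylindrical-object
    counts per monitor under one condition).  H is referred to a
    chi-squared distribution with len(groups) - 1 degrees of freedom.
    """
    pooled = [v for g in groups for v in g]
    n = len(pooled)
    ranks = midranks(pooled)
    h, idx = 0.0, 0
    for g in groups:
        rank_sum = sum(ranks[idx:idx + len(g)])
        h += rank_sum ** 2 / len(g)
        idx += len(g)
    h = 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
    # correct for ties in the pooled sample
    ties = sum(t ** 3 - t for t in
               (len(list(run)) for _, run in groupby(sorted(pooled))))
    correction = 1.0 - ties / float(n ** 3 - n)
    return h / correction if correction else float("nan")

# Invented counts for monitors under 1, 2, and 4 ceiling lights
h = kruskal_wallis_h([[54, 53, 55.5], [52, 51.5, 50], [47, 48.5, 49]])
print(round(h, 2))  # -> 7.2
```

In practice a statistical package also reports the P value from the chi-squared approximation; the statistic itself, as shown, depends only on rank sums, which is why it suits the non-normally distributed counts described above.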

Results

For every monitor analysed, all 13 steps of the aluminium step-wedge were observed, regardless of the surrounding environmental conditions. Across the 135 monitors tested, the mean number of cylindrical objects counted on the Artinis CDDent 1.0® test object was 51.1±6.3 (ranging from 27.5 to 59.5). Tables 3, 4, 5, 6, 7, 8 and 9 show the descriptive statistics for the variables' influence on the number of cylindrical objects that could be distinguished. Stepwise regression analysis, with all variables included, showed that both the brand of the monitor and the presence of a window near the monitor had a statistically significant impact on the number of cylindrical objects that could be distinguished on the monitor (P<0.05).
Table 3

Descriptive statistics for cylindrical object counts with regard to the number of window panels blocked

SD: standard deviation

Table 4

Descriptive statistics for cylindrical object counts with regard to window proximity near a monitor

SD: standard deviation

Table 5

Descriptive statistics for cylindrical object counts with regard to lights present in the clinical cubicle where the monitor was located

SD: standard deviation

Table 6

Descriptive statistics for cylindrical object counts with regard to weather conditions affecting the light present in the clinical cubicle where the monitor was located

SD: standard deviation

Table 7

Descriptive statistics according to the monitor position with respect to the observer's eye level

SD: standard deviation

Table 8

Descriptive statistics according to the monitor brand and model

SD: standard deviation. Brand 1: LG 24M45HQ® (LG Electronics Inc®, Seoul, Korea); brand 2: LG 24M47VQ-P® (LG Electronics Inc®, Yeouido-dong, Seoul); brand 3: HP E243® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 4: HP Compaq LA1951g® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 5: HP Hstnd-2321-a (L1910)® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 6: Gigabyte P57W® (Gigabyte Technology®, Xiandian District, New Taipei City, Taiwan); brand 7: Barco Eonis 24® (MDRC-2324) (Barco NV®, Kortrijk, Belgium)

Table 9

Descriptive statistics of each clinical space in the dental school

SD: standard deviation

The influence of each of the variables on the number of cylindrical objects that could be distinguished was investigated by performing the Kruskal-Wallis test. Neither the clinic in which the monitors were placed nor the weather had a significant impact on the number of cylindrical objects distinguishable on the computer monitors. Table 10 shows the relevant descriptive statistics of the Kruskal-Wallis test results.
Table 10

Useful descriptive statistics to accompany the Kruskal-Wallis test results

Brand 1: LG 24M45HQ® (LG Electronics Inc®, Seoul, Korea); brand 2: LG 24M47VQ-P® (LG Electronics Inc®, Yeouido-dong, Seoul); brand 3: HP E243® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 4: HP Compaq LA1951g® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 5: HP Hstnd-2321-a (L1910)® (Hewlett-Packard Company®, Palo Alto, CA, USA); brand 6: Gigabyte P57W® (Gigabyte Technology®, Xiandian District, New Taipei City, Taiwan); brand 7: Barco Eonis 24® (MDRC-2324) (Barco NV®, Kortrijk, Belgium)

With regard to window panels being blocked by blinds or not, it was found that significantly more cylindrical objects were detectable when 3 window panels were blocked, compared to when only 2 panels were blocked; when there was no window versus when a window was present with 2 or 4 window panels blocked; and when there was no window versus when a window was present with no panels blocked. It was also found that when there was no window close to the monitor (N=78), statistically significantly more (P<0.05) cylindrical objects could be detected (52.5±5.9) than when there was a window in close proximity (N=57) (49.1±6.5). Regarding the number of lights over the surgical suite/cubicle, it was found that significantly fewer cylindrical objects were counted when there were 4 lights (48.7±7.0) compared to 3 lights (52.4±6.0, P<0.05), 2 lights (51.7±6.1, P<0.05), and 1 light (53.6±4.6, P<0.05). The mean number of cylindrical objects counted at each monitor viewing position was as follows: 52.6±4.5 at eye level (N=61), 45.1±8.9 above eye level (N=4) and 50.0±7.2 below eye level (N=70). Significantly fewer cylindrical objects could be detected when the monitor was above eye level than when the monitor was at eye level (P<0.05). The mean number of cylindrical objects counted for each make and model is presented in Table 10. Statistically significantly fewer cylindrical objects could be detected on brand 3 and 6 monitors than on brand 1, 2, 4 and 5 monitors.
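The per-condition figures above (n, mean, sample SD of the cylindrical-object counts) are straightforward descriptive statistics. A minimal sketch of that tabulation using only the Python standard library, with invented counts standing in for the study data:

```python
from collections import defaultdict
from statistics import mean, stdev

# Invented (condition, cylindrical-object count) records; the real study
# recorded counts for 135 monitors across 6 environmental conditions.
records = [
    ("no window nearby", 53.5), ("no window nearby", 51.0),
    ("no window nearby", 55.5), ("window nearby", 48.0),
    ("window nearby", 50.5), ("window nearby", 46.5),
]

by_condition = defaultdict(list)
for condition, count in records:
    by_condition[condition].append(count)

for condition, counts in sorted(by_condition.items()):
    print(f"{condition}: n={len(counts)}, "
          f"mean={mean(counts):.1f}, SD={stdev(counts):.1f}")
```

Note that `stdev` is the sample standard deviation (n-1 denominator), which is the convention behind "±SD" figures such as 52.5±5.9 in this section.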

Discussion

In modern dental radiography, there is a close relationship between computer monitor performance and digital image quality.15 It is critical that quality assurance tests be carried out regularly on all digital equipment, including computer monitors, to assess whether performance is within acceptable limits. This ensures that high-quality images are consistently presented, enabling an adequate diagnosis and appropriate clinical intervention. To date, no published studies have investigated the radiographic diagnostic performance of computer monitors at dental schools, leaving a gap in the literature. The OHCWA is both a dental school and a public dental health clinic, with more monitors used for radiographic diagnosis of dental conditions than any other institution in Western Australia. It was therefore a prime location for this study. As digital radiography has become more prevalent in the medical field, it is vital to optimise monitor viewing conditions for improved diagnostic accuracy. All 3 null hypotheses postulated in this study were rejected: ambient light, monitor brand, and viewing height all had an impact on computer monitor performance, measured as cylindrical object observations from the Artinis CDDent 1.0® phantom. 
Illuminance, or ambient light, is a photometric term describing the luminous flux incident on a surface, such as the face of a display device, and higher illuminance is associated with increased reflection from the computer monitor.7,15 Excess reflection interferes with an observer's perception of contrast and can have ramifications for the diagnostic capability of digital radiographs.4 Potential implications of this in daily dental practice include failed detection of carious lesions confined to enamel1 or misinterpretation of anatomical structures and restorative materials with similar radiopacity to natural tooth structure.8 The results of this study support that, wherever possible, ambient lighting should be reduced in a dental setting to keep monitor reflection to a minimum and to optimise monitor performance.6,15 When assessing the effect of room illuminance on the caries diagnostic accuracy of digital dental radiographs, Pakkala et al.6 found that observers obtained higher sensitivities in settings with lower ambient light. However, this was accompanied by lower specificity, leading to the conclusion that the overall accuracy of radiographic caries detection was not significantly different under various illuminance levels. In 2018, Cruz et al.26 concluded that room lighting had little influence on the radiographic appearance of endodontically treated teeth; however, they also stated that the worst observer performance was seen when assessing the homogeneity of root canal filling material on a laptop in bright light conditions. 
To optimise monitor viewing conditions, one must also ensure that the display is positioned ergonomically, at the user's viewing level, to avoid strain of the neck and back.7 Among workstation design considerations, monitor placement is one of the most commonly identified risk factors for neck and shoulder pain.27,28 To minimise this, the American Association of Physicists in Medicine (AAPM) recommends that the centre of the display should be slightly below eye level.7 Other guidelines29-31 regarding visual display terminals (VDTs) support this, recommending a visual envelope of 0° to 60° below eye level as the optimum display viewing zone. Indeed, our study results indicate that a monitor mounted above eye level showed significantly lower performance than a monitor mounted at eye level (P=0.037). There was no significant difference between above and below eye level, nor between at and below eye level. Of course, a monitor's performance is not only dependent on its viewing conditions, but is also intrinsic to the monitor itself. Three different monitor makes and 6 different models were tested in the present study, with the Gigabyte P57W® (Gigabyte Technology®, Xiandian District, New Taipei City, Taiwan) (brand 6) laptop performing worst compared to all other models, followed by the HP E243® (Hewlett-Packard Company®, Palo Alto, CA, USA) (brand 3). Digital radiographic image quality was not found to be significantly different between any of the other makes and models assessed (Table 10). It is important to consider that the sample sizes for the monitor makes and models were not equal (Table 10), so the statistical interpretation should be scrutinised and seen in perspective. 
Monitor-related factors that can influence the quality of image display include monitor make, model, age, condition, luminance response, geometric distortion, and noise.15,25 Ambient light reflection is lower in displays with thinner faceplates (e.g., LCDs) and/or a matte finish, compared to those with thicker faceplates (e.g., CRTs) and/or a glossy finish.7,32 With all these factors in mind, it is no surprise that variation was observed across the monitors tested. Regarding the age of the monitors, one can derive from Table 1 that the 2 worst-performing brands, brands 3 and 6, had been in use at the authors' institution for less than 3 months, compared to several months or years for the other brands and models. Therefore, it can be concluded that the age of the monitors did not drive the results. These findings highlight the importance of taking all environmental and monitor-related factors into consideration prior to replacing digital equipment, as a decision to replace monitors based solely on age appears inappropriate. They also support the necessity of regular quality assurance tests to assess monitor performance. Since luminance decreases as a function of the burning time of fluorescent backlights, unnecessary luminance degradation can be minimised by switching off monitors when not in use.33 The monitors at the authors' institution are seldom switched off. The use of dual-screen workstations has become increasingly popular, allowing greater viewing space and altered screen layouts. However, purchasing twice as many monitors comes at a greater financial cost, so operators may opt to compromise on quality by purchasing monitors at a lower price, which can have diagnostic implications. Although this study found digital radiographic image quality to be significantly different between various monitors, this was not the case in other studies6,21,26 carried out in a dental setting. 
Hellen-Halme et al.21 and Pakkala et al.6 found no significant difference between the accuracy of different display types for radiographic caries diagnosis. Cruz et al.26 focussed specifically on the radiographic appearance of endodontic treatment and suggested that even a smartphone would be an acceptable image display device under standard dental clinic lighting conditions. It is important to note, however, that these studies tested diagnostic accuracy using clinical radiographs, rather than using a radiographic test object to compare digital image quality as in the present study. A limitation in the methodology of this study is that the results were obtained via subjective image analysis rather than an objective software-driven assessment of image quality. However, the analysis was carried out by 2 observers with excellent inter- and intra-examiner reliability, so the current study results can be considered reliable. The test object used for monitor assessment comprised 2 separate test objects: a custom-made 13-step aluminium step-wedge and the Artinis CDDent 1.0®. Although step-wedges have been widely utilised in the medical and dental fields as a means of monitor performance assessment,13,34 the aluminium step-wedge used in this study was found to be of minimal value. All 13 steps could be observed on all monitors tested, regardless of the environmental conditions. From this, one can conclude that the 13-step aluminium step-wedge was not an appropriate tool for testing monitor performance. In contrast, the step-wedges used as test objects by Schriewer et al.13 and Grassl and Schulze34 each contained boreholes of varying depths, which were more effective for the assessment of radiographic images. The Artinis CDDent 1.0® has a similar borehole design, with cylindrical objects of varying depths, and was found to be a useful and valuable test object in this study. The authors realise that the current study has some weaknesses. 
For instance, some monitors were ‘warmed up’ whereas others were not; ambient light was measured qualitatively rather than through quantitative illuminance (lux) measurements; and in the dual-monitor set-ups, only the monitor that displayed images from the X-ray system used at the authors' institution, Planmeca Romexis® version 4.5.0.R, was assessed. The rationale for the latter was that both monitors in a pair were identical in age, make, and model and were operated under the same environmental conditions, and it was unlikely that an operator would deliberately move Planmeca Romexis® version 4.5.0.R to the paired screen on which it was not automatically displayed. After monitor performance was assessed, no adjustments were made to the monitors or their environmental conditions to check whether better performance was possible. Ideally, an attempt should be made to recalibrate any underperforming monitors to the AAPM TG18 and Digital Imaging and Communications in Medicine (DICOM) Part 14 Grayscale Standard Display Function (GSDF) standards.7,23 This would imply setting a standard for performance against which one can compare and subsequently adjust monitor settings. Reducing ambient light levels is an effective way to easily improve monitor performance by minimising the loss of image quality associated with reflections on the display face.7 To reduce surface reflection, it is recommended that display devices be oriented so as to avoid direct light sources in their most common viewing positions.7 Other strategies include the addition of light absorbers within the faceplate of the monitor or the application of an antireflective coating on its surface.7 Additionally, monitors should be kept clean using appropriate cleaning agents to remove dirt or marks (e.g., fingerprints) that may impede the viewing and interpretation of digital images.25 Computer monitors are a critical piece of diagnostic equipment and should not be treated or valued less than any other piece of equipment in the dental office. 
Proper education and knowledge about computer monitors is essential. The results of the present study highlight that a multitude of factors affect the quality of a computer monitor's display for radiographic interpretation and analysis. Whilst digital technology continues to advance, it is imperative that clinicians understand how to maximise their computer monitors' diagnostic performance by taking into account the viewing conditions and environment around each monitor. Manipulation of the environmental settings when assessing radiographic images and calibration of the monitors were not addressed in this study and will be investigated in future UWA research.
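The impact of ambient light on monitor performance discussed above can be quantified: diffusely reflected room light adds the same luminance to both the brightest and darkest pixels, compressing the effective contrast ratio. A small illustrative calculation (the luminance and reflection values below are assumed as typical for an office LCD, not measurements from this study):

```python
def effective_contrast(l_max: float, l_min: float,
                       illuminance_lux: float, r_d: float) -> float:
    """Contrast ratio after adding diffusely reflected ambient light.

    r_d is the display's diffuse reflection coefficient in (cd/m^2)/lux.
    """
    l_amb = illuminance_lux * r_d  # ambient luminance reflected off the screen
    return (l_max + l_amb) / (l_min + l_amb)

# Assumed values: 400 cd/m^2 white, 0.5 cd/m^2 black,
# r_d = 0.015 (cd/m^2)/lux for a matte LCD faceplate.
dark_room = effective_contrast(400, 0.5, 0, 0.015)      # 800:1 intrinsic
bright_room = effective_contrast(400, 0.5, 200, 0.015)  # ~115:1 at 200 lux
print(dark_room, bright_room)
```

Under these assumptions, 200 lux of room illumination collapses the contrast ratio from 800:1 to roughly 115:1, which is why dimming the room or reorienting the monitor away from light sources improves low-contrast detection so markedly.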
References (10 of 27 shown)

1. Joanna M Lowe, Patrick C Brennan, Michael G Evanoff, Mark F McEntee. Variations in performance of LCDs are still evident after DICOM gray-scale standard display calibration. AJR Am J Roentgenol, 2010.

2. Ulrich Grassl, Ralf Kurt Willy Schulze. In vitro perception of low-contrast features in digital, film, and digitized dental radiographs: a receiver operating characteristic analysis. Oral Surg Oral Med Oral Pathol Oral Radiol Endod, 2006.

3. K Hellén-Halme, A Lith. Carious lesions: diagnostic accuracy using pre-calibrated monitor in various ambient light levels: an in vitro study. Dentomaxillofac Radiol, 2013.

4. Kristina Hellén-Halme, Bengt Hellén-Halme, Ann Wenzel. The effect of aging on luminance of standard liquid crystal display (LCD) monitors. Oral Surg Oral Med Oral Pathol Oral Radiol Endod, 2011.

5. A Butt, N W Savage. Digital display monitor performance in general dental practice. Aust Dent J, 2015.

6. Harald Ohla, Dorothea Dagassan-Berndt, Michael Payer, Andreas Filippi, Ralf Kurt Willy Schulze, Sebastian Kühl. Role of ambient light in the detection of contrast elements in digital dental radiography. Oral Surg Oral Med Oral Pathol Oral Radiol, 2018.

7. Till Schriewer, Ralf Schulze, Andreas Filippi, Irene Mischak, Michael Payer, Dorothea Dagassan-Berndt, Sebastian Kühl. The influence of ambient lighting on the detection of small contrast elements in digital dental radiographs. Clin Oral Investig, 2012.

8. Chris Greenall, Nicholas Drage, Matthew Ager. Quality assurance tests for digital radiography in general dental practice. Dent Update, 2014.

9. T Pakkala, L Kuusela, M Ekholm, A Wenzel, F Haiter-Neto, M Kortesniemi. Effect of varying displays and room illuminance on caries diagnostic accuracy in digital dental radiographs. Caries Res, 2012.

10. Good practice for radiological reporting. Guidelines from the European Society of Radiology (ESR). Insights Imaging, 2011.