| Literature DB >> 3774474 |
Abstract
One of the issues faced by engineers when designing a system that records an external event and represents it as a digitized image on a VDU screen is which type of grey scale to use. An experiment is described which compares, in a simulated digitized image, the effect of a linear and a logarithmic grey scale on the detectability of a straight-line signal embedded in visual noise. It was found that both bright and dark signals were detected more easily with the linear scale. A signal detection theory analysis was carried out to compare human performance with that of an 'ideal' observer who performed the detection task with a filter spatially matched to the signal. This ideal-observer model accounted well for the results, provided a linear transformation of luminance was assumed. The analysis showed that the superiority of the linear over the logarithmic grey scale was simply due to the higher signal-to-noise ratio of the signals in the former.
Entities:
Mesh:
Year: 1986 PMID: 3774474 DOI: 10.1068/p150017
Source DB: PubMed Journal: Perception ISSN: 0301-0066 Impact factor: 1.490
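The matched-filter ideal observer described in the abstract can be illustrated with a minimal simulation. The sketch below is an assumption-laden reconstruction, not the authors' actual stimuli or analysis: all parameter values (image size, background luminance, noise level, signal increment) are invented for illustration. A straight-line signal is embedded in Gaussian luminance noise, the image is passed through either a linear or a logarithmic grey-scale mapping, and detectability d' is estimated from the response of a filter spatially matched to the signal.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 32             # image is N x N pixels (hypothetical size)
LINE_COL = N // 2  # signal: a bright vertical line in this column
BG = 100.0         # mean background luminance (arbitrary units)
NOISE_SD = 10.0    # sd of additive Gaussian luminance noise
SIGNAL = 15.0      # luminance increment along the line

# Matched filter: same spatial profile as the signal itself.
template = np.zeros((N, N))
template[:, LINE_COL] = 1.0

def trial(signal_present, grey_scale):
    """Matched-filter response for one noisy image under a grey-scale mapping."""
    lum = BG + NOISE_SD * rng.standard_normal((N, N))
    if signal_present:
        lum[:, LINE_COL] += SIGNAL
    # Grey scale applied before detection: linear keeps luminance as-is,
    # logarithmic compresses it (as a log grey scale would).
    grey = lum if grey_scale == "linear" else np.log(lum)
    return float(np.sum(template * grey))

def d_prime(grey_scale, n_trials=2000):
    """Empirical detectability of the line for the ideal observer."""
    present = np.array([trial(True, grey_scale) for _ in range(n_trials)])
    absent = np.array([trial(False, grey_scale) for _ in range(n_trials)])
    pooled_sd = np.sqrt(0.5 * (present.var() + absent.var()))
    return (present.mean() - absent.mean()) / pooled_sd

d_lin = d_prime("linear")
d_log = d_prime("log")
print(f"d' linear: {d_lin:.2f}   d' log: {d_log:.2f}")
```

Under these assumed parameters the linear mapping yields a slightly higher d', consistent with the abstract's conclusion that the linear scale's advantage reflects a higher signal-to-noise ratio: the log transform compresses the signal increment (log(BG + s) - log(BG) < s/BG for s > 0) relative to the noise it carries along.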