| Literature DB >> 18263242 |
Abstract
The quality of an imaging system is degraded by propagation anomalies that distort wavefronts traveling through the medium. Adaptive phase-deaberration algorithms compensate for phase errors in the wavefront, but they suffer when the wavefront amplitude is also significantly distorted. A theory is derived showing that the rise of the image background level, i.e., the average sidelobe floor (ASF), in the image of a single point-like source is proportional to the amplitude distortion of the wavefront and inversely proportional to the effective number of array elements. From this theory, the tolerance to amplitude distortion, after the phasefront has been corrected by a deaberration algorithm, can be calculated from the sidelobe-floor design requirement for a given array. Computer simulations show good agreement with the theory.
Year: 1993 PMID: 18263242 DOI: 10.1109/58.248219
Source DB: PubMed | Journal: IEEE Trans Ultrason Ferroelectr Freq Control | ISSN: 0885-3010 | Impact factor: 2.725
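The stated scaling, ASF proportional to the amplitude distortion and inversely proportional to the effective number of array elements, can be illustrated with a minimal Monte Carlo sketch. This is not the paper's derivation: it assumes a uniform linear array with half-wavelength spacing, Gaussian amplitude errors of standard deviation `amp_sigma`, and perfect phase correction, and it isolates the error-induced background by beamforming only the amplitude perturbation.

```python
import numpy as np

rng = np.random.default_rng(0)

def average_sidelobe_floor(n_elements, amp_sigma, n_trials=200):
    """Monte Carlo estimate of the amplitude-error-induced background
    level (relative to the ideal mainlobe peak of n_elements**2) for a
    uniform linear array imaging a single point-like source."""
    n = np.arange(n_elements)
    u = np.linspace(-1.0, 1.0, 2048)  # u = sin(theta), half-wavelength spacing
    steering = np.exp(1j * np.pi * np.outer(u, n))
    floors = []
    for _ in range(n_trials):
        # Residual amplitude distortion after ideal phase correction.
        err = amp_sigma * rng.standard_normal(n_elements)
        # Beamform only the perturbation to isolate the raised floor.
        pert_power = np.abs(steering @ err) ** 2
        floors.append(pert_power.mean() / n_elements**2)
    return float(np.mean(floors))

# The sketch's prediction: floor ~ amp_sigma**2 / n_elements.
asf = average_sidelobe_floor(64, 0.1)
print(f"estimated floor {asf:.2e}, sigma^2/N = {0.1**2 / 64:.2e}")
```

Doubling the amplitude-error standard deviation raises the estimated floor fourfold, while doubling the element count halves it, consistent with the proportionality the abstract describes.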