Clare Teng, Harshita Sharma, Lior Drukker, Aris T Papageorghiou, J Alison Noble.
Abstract
We present a method for classifying tasks in fetal ultrasound scans using the eye-tracking data of sonographers. The visual attention of a sonographer, captured by eye-tracking data over time, is defined by a scanpath. In routine fetal ultrasound, the captured standard imaging planes are visually inconsistent due to fetal position, fetal movements, and sonographer scanning experience. To address this challenge, we propose a scale- and position-invariant task classification method using normalised visual scanpaths. We describe a normalisation method that uses bounding boxes to provide the gaze with a reference to the position and scale of the imaging plane, and we use the normalised scanpath sequences to train machine learning models for discriminating between ultrasound tasks. We compare the proposed method to existing work that uses raw eye-tracking data. The best-performing model achieves an F1-score of 84% and outperforms existing models.
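The abstract describes normalising gaze points against a bounding box of the imaging plane so that scanpaths become invariant to where the plane sits on screen and how large it is. A minimal sketch of that idea is below; the function name, bounding-box format, and example coordinates are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalise_scanpath(gaze_xy, bbox):
    """Map raw gaze points into the coordinate frame of an imaging-plane
    bounding box, giving position- and scale-invariant scanpath points.

    gaze_xy : (N, 2) sequence of raw (x, y) gaze points in screen pixels.
    bbox    : (x_min, y_min, width, height) of the imaging plane.
    """
    x_min, y_min, w, h = bbox
    gaze = np.asarray(gaze_xy, dtype=float)
    # Translate so the box origin becomes (0, 0), then divide by the box
    # size so normalised coordinates lie in [0, 1] within the plane.
    return (gaze - np.array([x_min, y_min])) / np.array([w, h])

# The same fixation pattern, viewed in two differently placed and sized
# imaging planes, normalises to an identical scanpath.
path_a = normalise_scanpath([[110, 220], [160, 270]], (100, 200, 100, 100))
path_b = normalise_scanpath([[220, 440], [320, 540]], (200, 400, 200, 200))
# Both are [[0.1, 0.2], [0.6, 0.7]]
```

Sequences of such normalised points can then be fed to a time-series classifier, as the abstract's task-discrimination setup suggests.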
Keywords: Eye-tracking; fetal ultrasound; time-series classification; visual scanpath
Year: 2021 PMID: 35368447 PMCID: PMC7612565 DOI: 10.1007/978-3-030-87583-1_13
Source DB: PubMed Journal: Simpl Med Ultrasound (2021)