Abstract
The field of sonification has progressed greatly over the past twenty years and currently constitutes an established area of research. This article aims at exploiting and organizing the knowledge accumulated in previous experimental studies to build a foundation for future sonification works. A systematic review of these studies may reveal trends in sonification design, and therefore support the development of design guidelines. To this end, we have reviewed and analyzed 179 scientific publications related to sonification of physical quantities. Using a bottom-up approach, we set up a list of conceptual dimensions belonging to both physical and auditory domains. Mappings used in the reviewed works were identified, forming a database of 495 entries. Frequency of use was analyzed among these conceptual dimensions as well as higher-level categories. Results confirm two hypotheses formulated in a preliminary study: pitch is by far the most used auditory dimension in sonification applications, and spatial auditory dimensions are almost exclusively used to sonify kinematic quantities. To detect successful as well as unsuccessful sonification strategies, assessment of mapping efficiency conducted in the reviewed works was considered. Results show that a proper evaluation of sonification mappings is performed only in a marginal proportion of publications. Additional aspects of the publication database were investigated: historical distribution of sonification works is presented, projects are classified according to their primary function, and the sonic material used in the auditory display is discussed. Finally, a mapping-based approach for characterizing sonification is proposed.
Year: 2013 PMID: 24358192 PMCID: PMC3866150 DOI: 10.1371/journal.pone.0082491
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Figure 1. The classification was developed through a brainstorming session using affinity diagrams.
Each low-level dimension was written on a post-it note. The notes were then moved to form clusters based on their degree of similarity, constituting the intermediate-level dimensions used to reference mappings in this systematic review for both physical and auditory domains.
Intermediate-level conceptual dimensions in the physical domain.
| Label | Physical dimension | Category |
| P01 | Location | Kinematics |
| P02 | Velocity | |
| P03 | Acceleration | |
| P04 | Jerkiness | |
| P05 | Distance | |
| P06 | Orientation | |
| P07 | Motion | |
| P08 | Energy | Kinetics |
| P09 | Intensity | |
| P10 | Force | |
| P11 | Temperature | |
| P12 | Activity | |
| P13 | Pressure | |
| P14 | Signal amplitude | |
| P15 | Material | Matter |
| P16 | Density | |
| P17 | Mass | |
| P18 | Transmission characteristics | |
| P19 | Reflection characteristics | |
| P20 | Roughness | |
| P21 | Color hue | |
| P22 | Color saturation | |
| P23 | Color luminosity | |
| P24 | Time elapsed | Time |
| P25 | Phase | |
| P26 | Event rate | |
| P27 | Signal frequency | |
| P28 | Signal spectral energy distribution | |
| P29 | Volume | Dimensions |
| P30 | Size | |
| P31 | Area | |
| P32 | Length | |
| P33 | Shape | |
List of physical dimensions sonified in the articles from the publication database, arranged according to their corresponding high-level category.
Intermediate-level conceptual dimensions in the auditory domain.
| Label | Auditory dimension | Category |
| A01 | Pitch | Pitch-related |
| A02 | Pitch range | |
| A03 | Timbre | Timbral |
| A04 | Instrumentation | |
| A05 | Polyphonic content | |
| A06 | Voice gender | |
| A07 | Allophone | |
| A08 | Spectral power | |
| A09 | Amplitude of harmonic | |
| A10 | Frequency of harmonic | |
| A11 | Roughness | |
| A12 | Brightness | |
| A13 | Center frequency of filter | |
| A14 | Saliency | |
| A15 | Loudness | Loudness-related |
| A16 | Dynamic loudness | |
| A17 | Spatialization | Spatial |
| ⋅ | Stereo panning | |
| ⋅ | Multichannel panning | |
| ⋅ | Vector base amplitude panning | |
| ⋅ | Head-related transfer function | |
| ⋅ | Ambisonics | |
| ⋅ | Interaural time difference | |
| ⋅ | Interaural amplitude difference | |
| ⋅ | Interaural frequency difference | |
| ⋅ | Non-specified spatialization method | |
| A18 | Doppler effect | |
| A19 | Tempo | Temporal |
| A20 | Duration | |
| ⋅ A201 | Rhythmic duration | |
| ⋅ A202 | Event duration | |
| ⋅ A203 | Ambient duration | |
| ⋅ A204 | Non-specified duration scale | |
| A21 | Sequential position | |
| A22 | Melody lead | |
| A23 | Articulation | |
| A24 | Decay time | |
| A25 | Melody | Pitch-related, Temporal |
| A26 | Harmony | Pitch-related, Timbral |
| A27 | Chord progression | Pitch-related, Timbral, Temporal |
| A28 | Spectral duration | Timbral, Temporal |
| A29 | Reverberation time | Spatial, Temporal |
| A30 | Performance activity level | Loudness-related, Temporal |
List of auditory dimensions used for sonification in the articles from the publication database, arranged according to their corresponding high-level category. Dimensions belonging to more than one high-level category are displayed at the bottom of the table. The multi-class dimension Spatialization is distinguished from the others using a star (*) in its label, which also incorporates an index differentiating the different subclasses. Similarly, the label of the multi-scale dimension Duration incorporates an index differentiating the different scales.
Description of the projects analyzed in this study.
| Project number | List of publications | Summary of the work from a sonification perspective | Sonic material | Mapping references |
| 1 | | Interactive sonification of the motion of aquarium fishes and ants | MIDI protocol, digital synthesizers, samplers, piece of music, Max/MSP | P01 |
| 2 | | Model for the sonification of large datasets with application to seismic data | MIDI protocol, Sound Blaster MIDI synthesizer, environmental sounds | P01 |
| 3 | | Sonification of acoustic and audio data | Various stimuli including filtered noise, pure and complex tones, and clicks | P14 |
| 4 | | Interactive sonification of colored images and video clips for the design of a mobility aid | Spatialized instrument sounds | P01 |
| 5 | | Sonification of video clips of counter-movement jumps for studying multimodal integration | Synthesized voice, amplitude and frequency modulated tone | P10 |
| 6 | | Interactive sonification as a help for positioning surgical instruments, offline sonification of human EEG as a tool for analysis of long recordings | Samples and environmental sounds modulated in pitch, volume and balance | P01 |
| 7 | | Art installation: an immersive virtual world using interactive sonification | Filtered noise bursts, wide-band signal, subtractive synthesis instruments | P01 |
| 8 | | Sonification of contour maps | Sampled piano tones | P01 |
| 9 | | Interactive sonification of rowing for elite and visually impaired athletes, extended to recreational sporting activities | Pure tone, triangular wave tone, MIDI protocol, sampled xylophone tones, piece of music, vocal formant synthesis, bandpass-filtered noise synthesizing the sound of wind, PureData | P03 |
| 10 | | Sonification of meteorological data (hail storms) | FM instruments, FM synthesis | P01 |
| 11 | | Sonification of geophysical maps | MIDI synthesizer | P06 |
| 12 | | Sonification of well logs | Granular synthesis, timbre grains for musical instruments, Geiger counter metaphor | P06 |
| 13 | | Interactive sonification of: activity in social spaces, motion of a calf, movements of a violin player, and free gestures | MIDI protocol, Max/MSP | P01 |
| 14 | | Sonification of textured MRI images | Synthesized speech-like sounds | P06 |
| 15 | | Psychoacoustical study of sonification mapping strategies | Pure tones, FM synthesis | P02 |
| 16 | | Sonification of geospatial data with uncertainty | MIDI protocol, sampled piano and trumpet tones | P01 |
| 17 | | Interactive navigation in a virtual space containing auditory targets | Songs | P05 |
| 18 | | Interactive sonification of running mechanics | Environmental sounds, pre-recorded speech | P07 |
| 19 | | Spectral mapping for real-time sonification of human EEG | Pure tones, coupled oscillators | P01 |
| 20 | | Real-time event-based sonification of human EEG | Blip oscillator with vibrato, harmonic tones modulated with a percussive envelope, synthesis from pink noise grains | P01 |
| 21 | | Kernel regression mapping for real-time sonification of human EEG | Subtractive synthesizer for simple speech-like sounds | P01 |
| 22 | | Brain-computer interface for paralyzed patients, with real-time multimodal feedback | MIDI protocol, sampled piano tones | P14 |
| 23 | | Real-time orchestral sonification of EEG, breath, heart beat, ECG with application to artistic performances | MIDI protocol, sampled tones of several instruments | P01 |
| 24 | | Sonification of maps | Environmental sounds | P01 |
| 25 | | Sonification of solar wind data | Synthesized wind sound, sampled vocal tones, pure tone, triangular wave tone, sawtooth wave tone, Max/MSP | P02 |
| 26 | | Interactive sonification of range measured with a laser pointer for mobility aid | MIDI protocol, QTMA software synthesizer, sampled piano tones | P02 |
| 27 | | Psychoacoustical study of semantic perception of non-musical rhythms and pitch changes in short non-speech sounds for earcon design and sonification of emotional and directional content | MIDI protocol, sampled vibraphone and recorder tones | P02 |
| 28 | | Real-time sonification of physiological data (pulse oximetry, respiration, blood pressure) for anaesthesia monitoring, hypothetical sonification support for landing multi-engine aircraft | “ | P01 |
| 29 | | Multi-level interactive sonification of single molecule properties based on force spectroscopy data | Oscillators, IIR filters, pitched tones | P10 |
| 30 | | Interactive sonification of trunk kinematics for balance improvement | Pure tone | P03 |
| 31 | | Sonification of knee-joint VAG signal | Pure tone | P14 |
| 32 | | Model-based sonification for audiovisual composition and performance: evolutionary model of a swarm populated with virtual agents evolving according to a genetic algorithm | Primitive DSP units from SuperCollider (generators and processors). Each agent possesses its own sonic signature. | P01 |
| 33 | | Sonification of OCT images of human tissue for discrimination of tumor and adipose | FM synthesis | P06 |
| 34 | | Interactive sonification of violin bowing | Percussive “ | P01 |
| 35 | | Model-based sonification of high-dimensional data sets: particle trajectories moving in a data potential | Additive synthesis | P12 |
| 36 | | Model-based sonification of high-dimensional data sets: data solid constituted by point masses anchored via springs, thus vibrating, dynamically evolving following a growing neural gas algorithm | Additive synthesis, damped linear oscillators | P01 |
| 37 | | Model-based sonification of high-dimensional data sets using principal curve as time-trajectory | Time-variant oscillators, Geiger counter metaphor: ticking sound synthesized by an exponentially decreasing sine wave | P01 |
| 38 | | Model-based sonification of high-dimensional data sets based on a crystallization process of points in a Euclidean vector space | Additive synthesis, time-variant sine oscillator | P08 |
| 39 | | Real-time sonification of EMG signals | Six sine oscillators with constant frequency carrier between 200 and 1600 Hz, set in harmonic relationship and modulated in amplitude | P01 |
| 40 | | Auditory display of complex graphical objects, e.g. color images, based on speech annotations and sonification of colors | Not specified: “ | P21 |
| 41 | | Interactive sonification of segmented two-dimensional images for the purpose of gaining spatial awareness of the image structure | VST instruments (abandoned), FM synthesis, AM synthesis, filters, square wave tone | P01 |
| 42 | | Interactive audiovisual biofeedback system for stroke rehabilitation | Musical sounds, MIDI protocol, sampler, Max/MSP | P01 |
| 43 | | Mobility aid for the blind using ultrasonic echolocation, first monaural (Sonic Torch) then binaural (Sonic Glasses), enabling the detection of objects and their perceptualization within a range of 0 to 6 m | Echo of an ultrasonic sinusoidal wave modulated in frequency by a sawtooth wave, heterodyned with the original signal and rescaled to audible frequency (resulting in practice in a pitched tone) interrupted by short silences (10% of the time). This method was first implemented with pulses instead of frequency modulation, and was found to be much less successful. | P05 |
| 44 | | Device converting words on a computer to phonemes via brightness level in order to translate contemporary news headlines to dadaist poems | Phonemes from SpeakJet synthesizer IC | P23 |
| 45 | | Interactive sonification of three physiological quantities for artistic performance (“ | Max/MSP, OSC protocol, filter-enhanced audification, unvoiced subtractive synthesis (filtered noise, flange effect), voiced subtractive synthesis (sawtooth band-limited signal) | P11 |
| 46 | | Interactive sonification of facial movements and expressions | PureData, cosine wave oscillator, sweeping filter, sampled sounds, additive synthesis | P07 |
| 47 | | Sonification utility for molecules (SUMO) illustrated by the case of amino acids and B-factors | SuperCollider, OSC protocol, resonant filters (Klank, Formlet) implementing earcons | P01 |
| 48 | | Interactive sonification of diverse information related to a mobile device (smartphone), including a model of virtual balls anchored via springs, bouncing inside a box | Sample banks of different impact sounds between various materials | P01 |
| 49 | | Interactive sonification of CFD simulations for computational steering | Max/MSP, filtered white and pink noise to simulate the sound of wind, combinations of pure tones | P01 |
| 50 | | iSonic: a tool for interactive sonification of georeferenced data on choropleth maps for visually impaired users | Java MIDI sounds: instruments (strings, piano) playing scales, spatial sound server for use of generic HRTF, prerecorded samples | P01 |
| 51 | | Navigation assistance system on a PDA for visually impaired | Not specified for the sonification part | P05 |
| 52 | | Six methods for interactive sonification of water pressure changes in the context of crawl stroke swimming | Five sine oscillators with nine semitones of pitch range, white noise modulated in amplitude fed into subtractive synthesis, formant filter synthesis, additive synthesis controlled by a low-frequency pulse | P01 |
| 53 | | Interactive sonification of radial direction | MIDI software synthesizer | P06 |
| 54 | | Interactive sonification of the synchronization of two subjects performing hand gestures with mobile devices, three different sonification methods | Multitrack music audio file, Moog ladder filter, MIDI music file, Permorfer (program for real-time modification of music performance) | P08 |
| 55 | | Sonification of star brightness data with focus on aesthetics for use in music | MIDI protocol, audified data modulated in amplitude by other data | P23 |
| 56 | | Interactive sonification of navigation through geocontextual maps for visually impaired people using mobile devices | PureData, processed pre-recorded human speech | P01 |
| 57 | | Sonification of video data showing the movements of a worm | Max/MSP, granular synthesis, short sine-tone wavelet grains | P01 |
| 58 | | The Cosmophone: art installation performing real-time sonification of the trajectory of cosmic particles (muons) | Two arrays of loudspeakers (below, above) for spatial sound, MIDI protocol, MIDI synthesizer, Max/MSP, samples of rain drops, piano tones and scattered words from a poem | P01 |
| 59 | | The Sonified Urban Masterplan (SUM): a tool for sonification of urban maps, evolved to a tool for computer-aided composition of graphic scores via path-based image sonification | PWGL (Lisp-based visual programming environment for music), MIDI protocol, OSC protocol, possibility to interface in Lisp, Max/MSP and PureData | P01 |
| 60 | | Interactive sonification of body movements (tilt of hips and torso) using the wearable interface hipDisk | Twelve simplistic tones generated by a BASIC Stamp 2 microcontroller forming a one-octave chromatic, pentatonic, major or minor scale | P06 |
Description of the projects analyzed in this study including their corresponding list of publications, sonic material used, and identified mappings between physical (P) and auditory (A) dimensions. Mappings are referenced using the labels defined in Tables 1 and 2. Mapping labels (G, B, F) described in Section 3.2.2 are displayed over the mapping arrows. Multiple occurrences of the same mapping identified within the same project are indicated in parentheses following the concerned mapping reference. Abbreviations used in this table are listed in Table 4.
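Projects 12 and 37 above mention a Geiger counter metaphor, with project 37 describing each tick as a sine wave with an exponentially decreasing envelope. As an illustration only (not any reviewed project's actual implementation; all names and parameter values are assumptions), such a metaphor can be sketched as an Event rate to Tempo mapping (P26 to A19 in the review's labels):

```python
import math

SAMPLE_RATE = 44100  # samples per second (assumed)

def geiger_tick(freq_hz=2000.0, decay_s=0.01, dur_s=0.05):
    """One tick: a sine wave shaped by an exponentially decreasing envelope."""
    n = int(dur_s * SAMPLE_RATE)
    return [math.exp(-t / (decay_s * SAMPLE_RATE)) *
            math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

def sonify_event_rate(rate_hz, total_s=1.0):
    """Lay ticks down at an interval set by the sonified event rate,
    so a denser data stream yields a faster click train."""
    out = [0.0] * int(total_s * SAMPLE_RATE)
    tick = geiger_tick()
    period = int(SAMPLE_RATE / rate_hz)  # samples between tick onsets
    for start in range(0, len(out), period):
        for i, sample in enumerate(tick):
            if start + i < len(out):
                out[start + i] += sample
    return out

signal = sonify_event_rate(10.0)  # one second of audio containing 10 ticks
```

The resulting sample buffer could be written to a sound file or streamed to an audio callback; the tick timbre and decay constant are arbitrary choices here.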
List of abbreviations used in Table 3.
| AM | Amplitude Modulation |
| CFD | Computational Fluid Dynamics |
| DSP | Digital Signal Processing |
| ECG | Electrocardiography |
| EEG | Electroencephalography |
| EMG | Electromyography |
| FM | Frequency Modulation |
| HRTF | Head-Related Transfer Function |
| IC | Integrated Circuit |
| IIR | Infinite Impulse Response |
| MIDI | Musical Instrument Digital Interface |
| MRI | Magnetic Resonance Imaging |
| OCT | Optical Coherence Tomography |
| OSC | Open Sound Control |
| PDA | Personal Digital Assistant |
| QTMA | QuickTime Music Architecture |
| VAG | Vibroarthrographic |
| VST | Virtual Studio Technology |
Most used mappings within the sixty projects analyzed.
| Number of occurrences | Mapping | Reference |
| 24 | Location → Spatialization | P01 → A17 |
| 18 | Location → Pitch | P01 → A01 |
| 12 | Distance → Loudness | P05 → A15 |
| 10 | Density → Pitch | P16 → A01 |
| 9 | Distance → Pitch | P05 → A01 |
| 8 | Density → Duration | P16 → A20 |
| 7 | Orientation → Pitch | P06 → A01 |
| 7 | Size → Pitch | P30 → A01 |
| 6 | Velocity → Pitch | P02 → A01 |
| 6 | Motion → Pitch | P07 → A01 |
| 6 | Motion → Spatialization | P07 → A17 |
| 6 | Energy → Loudness | P08 → A15 |
| 6 | Signal amplitude → Loudness | P14 → A15 |
| 6 | Signal spectral energy distribution → Pitch | P28 → A01 |
The fourteen most used mappings within the sixty projects analyzed. More than half of these mappings involve Pitch (A01). All but one of the remaining mappings (i.e. those not involving Pitch) correspond to natural perceptual associations.
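The dominance of Pitch reflects the classic parameter-mapping strategy: a scalar physical quantity is interpolated onto a frequency range. A minimal sketch of such a mapping follows; the function name, the two-octave range, and the clamping behavior are all illustrative assumptions, not taken from any reviewed work. Because pitch perception is roughly logarithmic in frequency, the interpolation is done in log-frequency rather than linearly:

```python
def map_to_pitch(value, v_min, v_max, f_min=220.0, f_max=880.0):
    """Parameter-mapping sonification of a scalar onto a frequency range
    (here two octaves, 220-880 Hz), interpolating in log-frequency."""
    x = (value - v_min) / (v_max - v_min)  # normalize to [0, 1]
    x = min(max(x, 0.0), 1.0)              # clamp out-of-range data
    return f_min * (f_max / f_min) ** x    # log-linear interpolation

# Endpoints hit the range bounds; the midpoint lands one octave up.
map_to_pitch(0.0, 0.0, 1.0)  # 220.0 Hz
map_to_pitch(0.5, 0.0, 1.0)  # 440.0 Hz (geometric midpoint)
map_to_pitch(1.0, 0.0, 1.0)  # 880.0 Hz
```

A linear-in-frequency variant would compress the perceived resolution at the top of the range, which is one reason log-frequency mappings are common.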
Use of auditory dimensions regardless of the sonified physical dimensions.
| Auditory dimension | Percentage of the total number of mappings | Number of auditory dimensions used significantly less often |
| Pitch | 23.8 | 29 (100%) |
| Loudness | 15.2 | 27 |
| Duration | 10.1 | 25 |
| Spatialization | 9.5 | 25 |
| Tempo | 5.9 | 21 |
| Brightness | 5.1 | 18 |
| Timbre | 3.6 | 14 |
| Instrumentation | 3.6 | 14 |
| Spectral power | 2.8 | 9 |
| Spectral duration | 2.4 | 5 |
| Pitch range | 2.0 | 3 |
| Center frequency of filter | 2.0 | 3 |
Most often used auditory dimensions regardless of the sonified physical dimensions. The second column gives the percentage of the total number of mapping occurrences involving each auditory dimension. The third column indicates the number of auditory dimensions used significantly less often. Pitch (A01) was found to be used significantly more often than all 29 other dimensions (A02 to A30) presented in Table 2.
High-level trends in the distribution of mapping occurrences.
| | Pitch-related | | Loudness-related | | Temporal | | Timbral | | Spatial | |
| | n | % | n | % | n | % | n | % | n | % |
| Kinematics | 64 | 26.8 | 30 | 12.6 | 53 | 22.2 | 45 | 18.8 | 47 | 19.7 |
| Kinetics | 19 | 22.4 | 24 | 28.2 | 19 | 22.4 | 19 | 22.4 | 4 | 4.7 |
| Matter | 22 | 29.3 | 11 | 14.7 | 19 | 25.3 | 23 | 30.7 | 0 | 0.0 |
| Time | 17 | 25.0 | 8 | 11.8 | 17 | 25.0 | 24 | 35.3 | 2 | 2.9 |
| Dimensions | 20 | 32.3 | 6 | 9.7 | 15 | 24.2 | 18 | 29.0 | 3 | 4.8 |
Distribution of mapping occurrences aggregated in high-level categories for both physical and auditory domains. The number of mapping occurrences identified (n) is reported together with the corresponding proportion (%) normalized against the high-level categories in the physical domain.
Figure 2. Proportions of mapping occurrences normalized against high-level categories in the physical domain.
It can be observed that Loudness-related auditory dimensions are used mainly to sonify physical quantities belonging to the high-level category Kinetics. Spatial auditory dimensions are used mainly to sonify physical quantities belonging to the high-level category Kinematics.
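The association of Loudness with kinetic quantities matches an ecological intuition: the more energy a process puts in, the more acoustic energy comes out. A hedged sketch of such an Energy to Loudness mapping (P08 to A15) is shown below; the 40 dB range and the function name are assumptions for illustration. Since loudness perception is roughly logarithmic, the interpolation runs over a decibel scale rather than linear amplitude:

```python
def energy_to_gain(energy, e_max, db_range=40.0):
    """Map a kinetic quantity (e.g. Energy, P08) to an amplitude gain
    (Loudness, A15), interpolating over a decibel range so that equal
    steps in the data sound like roughly equal steps in loudness."""
    x = min(max(energy / e_max, 0.0), 1.0)  # normalize and clamp
    db = -db_range * (1.0 - x)              # 0 dB at full energy
    return 10.0 ** (db / 20.0)              # decibels -> linear gain

energy_to_gain(1.0, 1.0)  # 1.0  (full scale)
energy_to_gain(0.0, 1.0)  # 0.01 (-40 dB floor, quiet but not silent)
```

Keeping a nonzero floor is a common design choice so the display remains audible even when the sonified quantity drops to zero.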
Intermediate-level trends in the distribution of mapping occurrences.
| Label | Physical dimension | n | N | Label | Auditory dimension | % |
| P01 | Location | 74 | 15 | A17 | Spatialization*(28) | 32.4 |
| | | | | A01 | Pitch*(28) | 24.3 |
| | | | | A02 | Pitch range | 6.8 |
| | | | | A04 | Instrumentation | 5.4 |
| | | | | A20 | Duration | 5.4 |
| | | | | A21 | Sequential position | 5.4 |
| P05 | Distance | 41 | 14 | A15 | Loudness*(26) | 29.3 |
| | | | | A01 | Pitch*(23) | 22.0 |
| | | | | A20 | Duration | 9.8 |
| P07 | Motion | 40 | 13 | A01 | Pitch | 15.0 |
| | | | | A17 | Spatialization | 15.0 |
| | | | | A18 | Doppler effect | 12.5 |
| | | | | A15 | Loudness | 10.0 |
| | | | | A20 | Duration | 10.0 |
| P16 | Density | 34 | 10 | A01 | Pitch*(23) | 29.4 |
| | | | | A20 | Duration | 23.5 |
| | | | | A15 | Loudness | 11.8 |
| P06 | Orientation | 26 | 11 | A01 | Pitch | 26.9 |
| | | | | A17 | Spatialization | 19.2 |
| P02 | Velocity | 25 | 8 | A01 | Pitch | 24.0 |
| | | | | A19 | Tempo | 20.0 |
| | | | | A12 | Brightness | 16.0 |
| | | | | A15 | Loudness | 16.0 |
| P30 | Size | 24 | 13 | A01 | Pitch | 29.2 |
| | | | | A20 | Duration | 16.7 |
| P28 | Signal spectral energy distribution | 23 | 8 | A01 | Pitch | 26.1 |
| | | | | A08 | Spectral power | 21.7 |
| | | | | A15 | Loudness | 21.7 |
| P13 | Pressure | 19 | 9 | A01 | Pitch | 26.3 |
| P08 | Energy | 15 | 8 | A15 | Loudness | 40.0 |
| P03 | Acceleration | 14 | 7 | A01 | Pitch | 35.7 |
| | | | | A15 | Loudness | 28.6 |
| P26 | Event rate | 14 | 6 | A19 | Tempo | 28.6 |
| P11 | Temperature | 12 | 7 | A01 | Pitch | 41.7 |
| P14 | Signal amplitude | 11 | 4 | A15 | Loudness | 54.5 |
| P27 | Signal frequency | 11 | 5 | A01 | Pitch | 36.4 |
| P23 | Color luminosity | 8 | 4 | A15 | Loudness | 50.0 |
| P17 | Mass | 6 | 3 | A01 | Pitch | 66.7 |
For each intermediate-level physical dimension listed in the first columns, the total number of mapping occurrences involving it (n) is displayed, followed by the number of intermediate-level auditory dimensions that have been used at least once to sonify this physical dimension (N), the list of auditory dimensions used significantly more than 0% of the time, and the corresponding proportion of use (%). Auditory dimensions are marked with a star (*) whenever they have been found to be used significantly more often than other auditory dimensions used at least once to sonify the same physical dimension. The star is followed by the total number of auditory dimensions used significantly less often (including those not used at all).
Use of auditory dimensions regardless of the sonified physical dimensions in the case of the multi-class dimension Spatialization.
| Label | Class of spatialization | Proportion |
| ⋅ | Stereo panning | 53.2* |
| ⋅ | Multichannel panning | 17.0 |
| ⋅ | Non-specified spatialization method | 14.9 |
| ⋅ | Interaural amplitude difference | 12.8 |
| ⋅ | Head-related transfer function | 10.6 |
| ⋅ | Interaural time difference | 10.6 |
| ⋅ | Vector base amplitude panning | 6.4 |
| ⋅ | Ambisonics | 6.4 |
| ⋅ | Interaural frequency difference | 2.1 |
Classes of spatialization ranked according to their proportion of use with respect to the total number of mapping occurrences involving Spatialization (A17). Significantly higher percentages are indicated with a star (*).
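Stereo panning, the most common spatialization class in the table above, is typically implemented as constant-power (equal-power) panning: the two channel gains follow a quarter-cycle sine/cosine law so that perceived loudness stays constant as the source moves across the stereo image. A minimal sketch follows; it illustrates the general technique and is not tied to any reviewed project:

```python
import math

def constant_power_pan(position):
    """position in [-1, 1]: -1 hard left, 0 center, +1 hard right.
    Returns (left_gain, right_gain). The gains satisfy L^2 + R^2 = 1,
    so total acoustic power is independent of position."""
    theta = (position + 1.0) * math.pi / 4.0  # map [-1, 1] to [0, pi/2]
    return math.cos(theta), math.sin(theta)

left, right = constant_power_pan(0.0)  # center: both gains ~0.707 (-3 dB)
```

A naive linear crossfade (L = (1 - p) / 2, R = (1 + p) / 2) instead produces a power dip at the center, which is why the sine/cosine law is the usual default.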
High-level trends in the case of the multi-scale dimension Duration.
| | A28 | A201 | A202 | A203 | A204 |
| Kinematics | 19.0 | 66.7* | 9.5 | 0.0 | 4.8 |
| Kinetics | 12.5 | 62.5 | 12.5 | 0.0 | 12.5 |
| Matter | 26.7 | 33.3 | 26.7 | 6.7 | 6.7 |
| Time | 11.1 | 55.6 | 11.1 | 0.0 | 22.2 |
| Dimensions | 22.2 | 44.4 | 22.2 | 0.0 | 11.1 |
| Total | 19.4 | 53.2* | 16.1 | 1.6 | 9.7 |
Proportions of mapping occurrences for each scale are shown aggregated in high-level categories in the physical domain, as well as regardless of the physical dimension (Total). The scales considered are: Spectral duration (A28), Rhythmic duration (A201), Event duration (A202), Ambient duration (A203) and Non-specified duration scale (A204). Significantly higher percentages within a row are indicated with a star (*).
Intermediate-level trends in the case of the keyword-based categories Horizontal and Vertical.
| Keyword-based category | n | N | Auditory dimension used significantly more than 0% of the time | % |
| Horizontal | 22 | 14 | Spatialization*(27) | 36.4 |
| Vertical | 30 | 9 | Pitch*(28) | 46.7 |
| | | | Loudness | 13.3 |
For each keyword-based category, the total number of mapping occurrences (n) is displayed, followed by the number of intermediate-level auditory dimensions that have been used at least once to sonify this category (N), the list of auditory dimensions used significantly more than 0% of the time, and the corresponding proportion of use (%). Auditory dimensions are marked with a star (*) whenever they have been found to be used significantly more often than other auditory dimensions used at least once to sonify the same keyword-based category. The star is followed by the total number of auditory dimensions used significantly less often (including those not used at all).
Project-related trends: use of auditory dimensions regardless of the sonified physical dimensions.
| Auditory dimension | Percentage of projects using the dimension at least once | Number of auditory dimensions used by significantly fewer projects |
| Pitch | 86.7 | 28 |
| Loudness | 73.3 | 27 |
| Spatialization | 51.7 | 26 |
| Duration | 40.0 | 24 |
| Brightness | 23.3 | 13 |
| Timbre | 20.0 | 10 |
| Tempo | 16.7 | 6 |
| Spectral power | 15.0 | 6 |
Percentage of projects using specific auditory dimensions at least once. The eight auditory dimensions used by the largest number of projects are displayed. The third column indicates the number of other auditory dimensions used significantly less often.
Figure 3. Historical distribution of sonification works according to the year of publication.
The red curve corresponds to the publications considered for the present systematic review. The black curve corresponds to the works included in the publication database, including those considered for the present systematic review.
Figure 4. Distribution of the projects considered in the present systematic review classified according to their primary function.
Figure 5. Sonic material used in the projects considered in the present systematic review.
Results are presented in groups corresponding to level of synthesis, general category of sound, standard protocols, and software.