
Emotional ratings and skin conductance response to visual, auditory and haptic stimuli.

Elia Gatti1, Elena Calzolari2, Emanuela Maggioni1, Marianna Obrist1.   

Abstract

Human emotional reactions to stimuli delivered through different sensory modalities are a topic of interest for many disciplines, from Human-Computer Interaction to the cognitive sciences. Several databases of emotion-eliciting stimuli are available, each tested on a large number of participants. Notably, the stimuli within any one database are always of the same type; to date, no data have been collected and compared across distinct types of emotion-eliciting stimuli from the same participants. This makes it difficult to combine different databases within the same experiment, limiting the complexity of studies investigating emotional reactions. Moreover, while the stimuli and the participants' ratings of them are available, participants' physiological reactions to the emotional stimuli are often recorded but not shared. Here, we test stimuli delivered through visual, auditory, and haptic modalities in a within-participant experimental design. We provide the results of our study as a MATLAB structure that includes basic demographics for each participant, the participants' self-assessments of their emotional state, and their physiological reactions (i.e., skin conductance).


Year:  2018        PMID: 29944144      PMCID: PMC6018518          DOI: 10.1038/sdata.2018.120

Source DB:  PubMed          Journal:  Sci Data        ISSN: 2052-4463            Impact factor:   6.444


Background & Summary

The study of human emotions is a fascinating and cross-disciplinary field of research. In the past 20 years, interest in human emotions has extended from the realm of psychology to other disciplines such as neuroscience[1], product and experience design[2], and computer science[3,4]. Although different theories of emotion have been proposed over the years[5,6], there is a common understanding that emotional states are characterized by physiological and cognitive responses to clearly identifiable stimuli[7]. As such, whether we investigate emotions to understand the human mind or to teach an automated system how to be more "humane", emotional investigation is, in the great majority of cases, based on: (i) delivering emotional stimulation and (ii) measuring cognitive and physiological reactions. By emotional stimulation, we mean an external stimulus that is able to elicit an emotional reaction. In the literature, the most common way to elicit emotions is by delivering visual stimuli[8]. The International Affective Picture System (IAPS)[9] and the Geneva Affective Pictures Database (GAPD)[10] are examples of collections of visual stimuli (pictures) often used to elicit emotions. The Affective Norms for English Words (ANEW)[11] and the Affective Norms for English Texts (ANET)[12] are also visually presented (although not pictorial) collections of emotion-eliciting stimuli. Auditory stimuli are often used as an alternative to visual stimuli (e.g., the International Affective Digitized Sounds, IADS)[13]. While the IADS includes mainly short audio clips, long segments of musical pieces have also been used to elicit emotional reactions[14,15].
It is important to note that, while IADS stimuli always have well-defined semantics (indeed, it is not hard to imagine the source of an IADS stimulus when listening to it), classical music pieces differ in that their emotional value lies within the features of the composition itself, such as its tempo and tonality[16,17]. Audio-visual stimulation has also been used to elicit emotions: a number of studies have used short movies as emotional stimuli[18], and a few standardized databases of emotional films and clips are now available[19,20]. Alongside the more common audio-visual stimuli, less conventional and less standardized stimuli have been used in the literature, such as olfactory[21,22] and haptic[23-25] stimuli. Whatever stimulus is used to elicit an emotion, it is expected to affect participants' cognitive and physiological state. To measure the impact of an emotional stimulus, a number of self-assessment questionnaires have been used in the literature; notable examples are the Self-Assessment Manikin (SAM)[26] and the Geneva Emotion Wheel (GEW)[27]. Together with self-assessment measures, participants' physiological and bodily reactions to emotional stimuli are often recorded[28]. In neuroscience, such exploration is often performed through brain imaging[29]. Within computer science, where gaining access to brain activations might be cumbersome, attention is often directed towards autonomic nervous system responses. This approach, although rather generic, allows bodily variations driven by emotional stimulation to be recorded with relatively little hardware (measuring tools)[30,31]. Such responses include parameters related to the human vascular system (blood volume pulse, heart rate), participants' reaction times to startle reflexes, variations in skin and body temperature, and variations in skin conductance (SC) to a light electric current.
The latter measure directly correlates with autonomic sudomotor nerve activation and is therefore an indirect measure of palm sweating, which in turn relates to an increase in a subject's arousal. SC is arguably the most widely used physiological parameter for investigating participants' emotional activation[32,33]. Here, we describe a database (Data Citation 1) reporting SC activations and SAM self-assessments from 100 participants in response to a variety of emotional stimuli. The emotional stimuli were delivered through three different sensory modalities: 20 audio stimuli, 20 visual stimuli, and 10 haptic stimuli. Of the 20 audio stimuli, 10 were selected from the IADS, and the other 10 were instrumental musical pieces that had never been tested before. Similarly, 10 visual stimuli came from the IAPS, while the other 10 were abstract works of art. The 10 haptic stimuli were obtained from previous work on using mid-air haptic stimulation to elicit emotions[25]. Notably, the choice to include abstract visual stimuli was motivated by the fact that, in both the haptic and auditory modalities, it is possible to elicit emotions with stimuli that have no obvious semantics[34]; we therefore tested whether this holds true for the selected stimuli in the visual modality. This is the first database that allows a direct comparison of SAM ratings and SC responses for given stimuli across three different sensory modalities. Far from being a comprehensive database of emotional stimuli, it nevertheless offers the possibility to compare ratings and physiological activation within subjects for stimuli in three different sensory modalities. In addition, it validates 30 stimuli that carry no immediate meaning for the participant, assessing emotional "abstract" stimulation on a large number of participants for the first time.

Methods

Participants

One hundred healthy volunteers participated in the experiment (mean age=26.88 years; SD=9.11; range: 18 to 71 years; 61 females). All but 9 participants reported being right-handed. Pre-screening admitted only participants with normal (or corrected-to-normal) vision, no history of neurological, psychological, or psychiatric disorders, and no tactile or auditory impairments. Participants were recruited from the University of Sussex, were naïve as to the purpose of the experiment, were paid for their participation, and gave written informed consent. The study was conducted in accordance with the principles of the Declaration of Helsinki and was approved by the University of Sussex ethics committee (Sciences & Technology Cross-Schools Research Ethics Committee, Ref: ER/EG296/1).

Acquisition setup and procedure

Participants were invited to sit comfortably in front of a computer screen. We placed SC recording electrodes (GSR Electrodes, Shimmer®) on the index and ring fingers of each participant's left hand (as in ref. 35), as shown in Fig. 1.
Figure 1

Experimental design and protocol.

(a) Schematic representation (left) and picture (right) of the box containing the mid-air haptic device. (b) Schematic representation (left) and picture (right) of the experimental setup. (c) Schematic representation of the experimental procedure. First, the participant is asked to relax for about 5 min, followed by a 60-second break before the start of the experiment; a three-second countdown precedes each stimulus; the stimulus is displayed; the SAM questions are presented one after another in randomized order; after the questions are answered, the countdown for the next stimulus starts. This procedure is repeated throughout the experiment.

Participants were given a variable amount of time (approximately five minutes) to relax, get ready for the experiment, and familiarize themselves with the experimental setup. During this time, participants were first shown by the experimenter how to correctly place their right hand over the haptic device, and then asked to repeatedly put their right hand on it, memorizing the position so that they could replicate it throughout the experiment. Participants were also invited to find a comfortable position for the left hand, which they were instructed not to move for the duration of the experiment. At the beginning of the experiment, participants were asked again to relax for 60 s, to let the SC signal reach baseline. SC recording was then triggered and kept running until the end of the experiment. The delivery of each emotional stimulus was marked as an event trigger in the data log to support interpretation of the SC signal. Emotional stimuli were presented in randomized order. In particular, stimulus presentation was completely randomized, rather than block randomized, to avoid any sensory-modality-related scale bias (i.e., participants adopting different scales depending on the sensory modality). Before each stimulus, a three-second countdown appeared in the centre of the screen, followed by the stimulus. When a haptic or auditory stimulus was presented, a sentence was simultaneously displayed on the screen (either "playing audio" or "playing haptics"), informing participants of the sensory modality of the upcoming stimulus. After each stimulus, participants rated it with their right hand using the original version of the SAM[26] (see Self-Assessment rating, below), scoring the stimulus on arousal, valence, and dominance[26]. The SAM dimensions were presented one after another in randomized order.
After answering the last of the SAM questions, a new countdown started, marking the beginning of a new trial. At the beginning of each countdown, participants positioned their right hand comfortably on a black Plexiglas box containing a mid-air haptic device (Ultrahaptics®), as they had learned during the familiarization. The box was open on the upper side, and a soft wrist support was attached to the edge closest to the participant, so that participants could comfortably place their right hand over the aperture, with the palm completely exposed to the mid-air haptic device at a 20 cm distance (Fig. 1). Notably, the design of the box allowed participants to easily position their hand above the device in a standardized manner. Moreover, the experimenter assisted the participant throughout the experiment and repeated trials in case of wrong positioning of the hand.
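The fully randomized presentation described above can be sketched as follows. This is a minimal illustration, not the authors' experiment code: stimulus IDs, modality labels, and function names are our own assumptions.

```python
import random

def build_trial_sequence(stimuli, seed=None):
    """Return a fully randomized trial list; each trial carries the
    stimulus plus a per-trial random order of the three SAM scales."""
    rng = random.Random(seed)
    order = stimuli[:]          # copy so the input list is untouched
    rng.shuffle(order)          # complete randomization across modalities
    trials = []
    for stim in order:
        sam_order = ["arousal", "valence", "dominance"]
        rng.shuffle(sam_order)  # SAM dimensions shown one after another, randomized
        trials.append({"stimulus": stim, "countdown_s": 3, "sam_order": sam_order})
    return trials

# Illustrative stimulus set mirroring the 20/20/10 composition of the study.
stimuli = [{"id": i, "modality": m} for i, m in
           enumerate(["audio"] * 20 + ["visual"] * 20 + ["haptic"] * 10, start=1)]
trials = build_trial_sequence(stimuli, seed=0)
```

Complete randomization (rather than shuffling within modality blocks) is what prevents participants from settling into a modality-specific rating scale.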

Stimuli

Auditory stimuli

The auditory stimuli comprise ten sounds from the IADS database and ten instrumental extracts from various compositions (see Table 1). The ten IADS sounds were selected according to their SAM scores on valence and arousal reported in previous work[13]: two sounds had high arousal and high valence ratings, two had low arousal and low valence, two had low arousal and high valence, two had high arousal and low valence, and two were defined as neutral (low arousal and mid-valence). The ten instrumental sounds were rated for the first time in this work and constitute an original contribution to the state of the art. Musical pieces were considered an "abstract" form of stimulation in our study: instrumental pieces do not convey immediate meaning to the listener, and the emotional content of a piece is instead related to features within the composition (e.g., tonality, tempo)[16,17]. The inclusion of classical music pieces in our database was motivated by the recent interest in the link between musical pieces and emotional reactions[14-17]. All auditory stimuli were presented through a pair of headphones (Beat Studio, Monster); the volume never surpassed the 90 dB limit (IADS #275, scream). All selected IADS stimuli lasted 6 s, apart from one lasting 5 s (see Table 1). The duration of the abstract-classical auditory stimuli varied according to the musical sentence (see Table 1).
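The quadrant-based selection described above (high/low arousal × high/low valence, plus a neutral category) can be expressed as a small classifier. The cut-off values below are illustrative assumptions for a 9-point IADS/IAPS normative scale, not the authors' actual criteria.

```python
def quadrant(arousal, valence, low=4.0, high=6.0):
    """Classify a stimulus by its normative SAM scores.

    `low`/`high` are hypothetical cut-offs on a 9-point scale:
    low arousal with mid valence counts as neutral; otherwise the
    stimulus falls into one of the four arousal/valence quadrants."""
    if arousal <= low and low < valence < high:
        return "neutral"  # low arousal, mid valence
    a = "high" if arousal >= high else "low"
    v = "high" if valence >= high else "low"
    return f"{a} arousal / {v} valence"
```

With such a function, two stimuli per category can be drawn from the normative tables to balance the stimulus set.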
Table 1

Multisensory stimuli used in the experiment.

Stimulus number | Stimulus type | Stimulus name | Stimulus author | Display duration (s)
1 | AUDIO CLASSIC | Piano Concerto No. 20 in D minor, K466 | W.A. Mozart | 13
2 | AUDIO CLASSIC | Olympic fanfare and theme | J. Williams | 19
3 | AUDIO CLASSIC | October | E. Whitacre | 38
4 | AUDIO CLASSIC | Shenandoah | F. Ticheli | 45
5 | AUDIO CLASSIC | Danse macabre | C. Saint-Saëns | 13
6 | AUDIO CLASSIC | Night on the bald mountain | M.P. Mussorgsky | 30
7 | AUDIO CLASSIC | Sweet death | J.S. Bach | 41
8 | AUDIO CLASSIC | Adagio for Strings | S. Barber | 30
9 | AUDIO CLASSIC | Undertale Extended – Undertale OST | T. Fox | 39
10 | AUDIO CLASSIC | Megalovania Extended – Undertale OST | T. Fox | 30
11 | AUDIO IADS | #110 baby laughing | NA | 6
12 | AUDIO IADS | #115 bees | NA | 6
13 | AUDIO IADS | #172 water | NA | 6
14 | AUDIO IADS | #230 women laughing | NA | 6
15 | AUDIO IADS | #255 vomiting | NA | 6
16 | AUDIO IADS | #260 babies crying | NA | 6
17 | AUDIO IADS | #275 scream | NA | 5
18 | AUDIO IADS | #320 typewriter | NA | 6
19 | AUDIO IADS | #351 clapping | NA | 6
20 | AUDIO IADS | #364 crowd voices party | NA | 6
21 | PICTURES ABSTRACT | La vague | H. Matisse | 15
22 | PICTURES ABSTRACT | Black Square | K.S. Malevich | 15
23 | PICTURES ABSTRACT | Rythme n1 | R. Delaunay | 15
24 | PICTURES ABSTRACT | Lines areas depth III | F. Kupka | 15
25 | PICTURES ABSTRACT | 1866-1944 Ohne Titel | V.V. Kandinsky | 15
26 | PICTURES ABSTRACT | Mondrian | P. Mondrian | 15
27 | PICTURES ABSTRACT | Based on Leaf Forms and Spaces | Dove | 15
28 | PICTURES ABSTRACT | Fire in the Evening | P. Klee | 15
29 | PICTURES ABSTRACT | Udnie | F. Picabia | 15
30 | PICTURES ABSTRACT | Yellow Red Blue | V.V. Kandinsky | 15
31 | PICTURES IAPS | #1274 beetles | NA | 15
32 | PICTURES IAPS | #1525 dog growling | NA | 15
33 | PICTURES IAPS | #2045 baby smiling 1 | NA | 15
34 | PICTURES IAPS | #2070 baby smiling 2 (female) | NA | 15
35 | PICTURES IAPS | #2900 kid crying | NA | 15
36 | PICTURES IAPS | #3550 man in blood | NA | 15
37 | PICTURES IAPS | #5829 sunset | NA | 15
38 | PICTURES IAPS | #7009 empty mug | NA | 15
39 | PICTURES IAPS | #7238 blue and yellow spheres | NA | 15
40 | PICTURES IAPS | #8470 athlete exults | NA | 15
41 | HAPTIC[25] | P1 | NA | 1.4
42 | HAPTIC[25] | P2 | NA | 2.03
43 | HAPTIC[23] | P3 | NA | 1.3
44 | HAPTIC[25] | P4 | NA | 1.2
45 | HAPTIC[25] | P5 | NA | 1.4
46 | HAPTIC[25] | P6 | NA | 1.2
47 | HAPTIC[25] | P7 | NA | 1.2
48 | HAPTIC[25] | P8 | NA | 1.3
49 | HAPTIC[25] | P9 | NA | 1.2
50 | HAPTIC[25] | P10 | NA | 1.4
Table 2

SC signal’s features, available in the .SC.features field.

Variable (label used in exported files) | Description

Event data
 Event.nr | Sequence number of event/marker
 Event.nid | Numerical ID of event
 event.name | Optional name or description of event
 Event.ud | Optional user data associated with event

Continuous Decomposition Analysis (CDA) (extraction of continuous phasic/tonic activity based on standard deconvolution)
 CDA.nSCR | Number of significant^a SCRs (skin conductance responses)
 CDA.Latency | Response latency of first significant^a SCR within response window^b [s]
 CDA.AmpSum | Sum of significant^a SCR amplitudes within response window^b (reconvolved from corresponding phasic driver peaks) [μS]
 CDA.SCR | Average phasic driver within response window^b; represents phasic activity within the response window most accurately, but does not fall back on classic SCR amplitudes [μS]
 CDA.ISCR | Area (i.e., time integral) of the phasic driver within response window^b; equals CDA.SCR multiplied by the size of the response window [μS·s]
 CDA.PhasicMax | Maximum value of phasic activity within response window^b [μS]
 CDA.Tonic | Mean tonic activity within response window^b (of decomposed tonic component)

Standard trough-to-peak (TTP) or min-max analysis
 TTP.nSCR | Number of significant^a SCRs within response window^b
 TTP.AmpSum | Sum of SCR amplitudes of significant^a SCRs within response window^b [μS]
 TTP.Latency | Response latency of first significant^a SCR within response window^b [s]

Global measures
 Global.Mean | Mean SC value within response window^b
 Global.MaxDeflection | Maximum positive deflection within response window^b

^a significant: an SCR whose amplitude surpasses the threshold of 0.01 μS[41].

^b within response window: the response window considered for each stimulus differed, being equal to the whole duration of the stimulus plus an additional 300 ms, to account for the relatively slow dynamics of the SC response.
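The trough-to-peak (TTP) measures listed above can be illustrated with a deliberately simplified sketch: scan the response window, track the running trough, and count each local peak whose trough-to-peak amplitude exceeds the 0.01 μS threshold. Function and parameter names are our own; Ledalab's actual TTP implementation is more elaborate.

```python
def ttp_features(sc, fs, onset_s, window_s, thresh=0.01):
    """Crude TTP sketch: detect deflections (trough to peak) within the
    response window and keep those whose amplitude exceeds `thresh` uS."""
    start = int(onset_s * fs)
    stop = int((onset_s + window_s) * fs)
    seg = sc[start:stop]
    amps, latencies = [], []
    trough_i = 0
    for i in range(1, len(seg) - 1):
        if seg[i] < seg[trough_i]:
            trough_i = i                     # track the running trough
        amp = seg[i] - seg[trough_i]
        local_peak = seg[i] >= seg[i - 1] and seg[i] > seg[i + 1]
        if local_peak and amp >= thresh:
            amps.append(amp)                 # significant SCR amplitude
            latencies.append(trough_i / fs)  # latency of its onset [s]
            trough_i = i                     # restart search after this peak
    return {"TTP.nSCR": len(amps),
            "TTP.AmpSum": sum(amps),
            "TTP.Latency": latencies[0] if latencies else None}
```

For example, a trace that rises by 0.1 μS and falls back yields one significant SCR with amplitude ≈0.1 μS.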

Visual stimuli

The twenty visual stimuli comprise ten pictures from the IAPS database and ten abstract pictures (see Table 1). The ten IAPS pictures were selected according to their SAM scores on valence and arousal reported in previous work[9]: two pictures had high arousal and high valence ratings, two had low arousal and low valence, two had low arousal and high valence, two had high arousal and low valence, and two were defined as neutral (low arousal and mid-valence). The ten abstract pictures were arbitrarily selected from the work of renowned artists, so their arousal, valence, and dominance scores were assessed for the first time in the current work. The choice of abstract art as emotional stimulus was inspired by the body of literature on emotions, art, and aesthetics[8,36]. By including works of abstract art in our experiment, we test emotional visual stimuli that, like the instrumental pieces in the auditory domain, do not relate to an obvious meaning. The data reported here might serve to further explore the connection between aesthetic and emotional reactions. All visual stimuli were presented in the centre of a monitor screen (26 inches), placed at about 40 cm from the participant, with the centre aligned to the participant's eye level. The visual stimuli were displayed for 15 s.

Haptic stimuli

Ten haptic stimuli were delivered by means of a mid-air haptic device developed by Ultrahaptics Ltd (http://ultrahaptics.com/). This mid-air haptic technology creates tactile sensations through focused ultrasound waves, resulting in sensations that can be described as dry rain or puffs of air[37]. The haptic device comprises a 16×16 array of ultrasonic transducers; each transducer can be activated individually, in sequence, or simultaneously with other transducers, thus creating unique patterns varying in location, intensity, frequency, and duration[38]. The ten haptic stimuli used in the present experiment are presented in the form of haptic patterns, selected from a list of patterns created and previously validated by Obrist and colleagues[25]. These patterns vary in location (16 different locations specified in a 4×4 grid on the user's palm), intensity (three levels: low, medium, high), frequency (five options, range: 16–256 Hz), and duration (200–600 ms). Please note that the patterns were designed for the right hand of the user and were therefore delivered to the right hand of each participant, regardless of hand dominance.
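The parameter space of a haptic pattern (4×4 palm grid, three intensity levels, 16–256 Hz, 200–600 ms) can be captured in a small data container. The class below is purely illustrative; field names and range checks are our own and are not part of the Ultrahaptics API.

```python
from dataclasses import dataclass

@dataclass
class HapticPattern:
    cell: tuple          # (row, col) on the 4x4 palm grid, 0-indexed
    intensity: str       # "low" | "medium" | "high"
    frequency_hz: int    # modulation frequency, 16-256 Hz
    duration_ms: int     # 200-600 ms

    def __post_init__(self):
        # Enforce the parameter ranges described in the text.
        assert 0 <= self.cell[0] < 4 and 0 <= self.cell[1] < 4
        assert self.intensity in ("low", "medium", "high")
        assert 16 <= self.frequency_hz <= 256
        assert 200 <= self.duration_ms <= 600
```

A pattern such as `HapticPattern((1, 2), "medium", 64, 400)` is valid, whereas an off-grid location is rejected at construction time.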

Self-Assessment rating

Participants' ratings of their own emotional reactions were recorded using the Self-Assessment Manikin (SAM) (see Fig. 2), a non-verbal pictorial assessment technique evaluating emotional reaction along three components: Valence (whether the elicited emotion is positive or negative), Arousal (how "activating" the elicited emotion is), and Dominance (whether the participant feels "in control" of the emotion), which are often identified as the main descriptors of all emotional activations[26]. The SAM was first proposed by Bradley and Lang[26] and reflects the idea that emotions as we know them (i.e., fear, joy, anger, calm, etc.) can be represented in a two-dimensional space with Valence and Arousal as its main orthogonal axes. Although discussing the different theories of emotion is beyond the scope of this paper, it is worth considering the advantage of the SAM approach over a questionnaire reflecting a categorical model of emotions (e.g., the Geneva Emotion Wheel). Compared to a categorical approach[27], the SAM allowed us to bypass the semantic implications and idiosyncrasies that participants could have had in naming the emotion they were feeling[39], focusing instead on the assessment of their own emotional state. Rating scales were displayed on the computer screen. Below each rating scale, a horizontal bar of the same length as the five SAM pictorial representations was presented, with a cursor at the centre of the bar. Participants had the five different manikins as a visual reference for each emotional dimension (i.e., arousal, valence, and dominance). Participants could adjust the cursor position along the bar by means of a mouse manoeuvred with their right hand. The rating scales range from 0 to 100, in 1-point steps, where 0 corresponds to the extreme left manikin and 100 to the extreme right manikin[26].
A continuous visual analogue scale was used to allow greater accuracy in the parametric data analysis and greater sensitivity to change in the assessment[40]. No time limit was given to participants to answer the SAM.
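The mapping from cursor position on the visual analogue bar to a 0–100 SAM score in 1-point steps can be sketched as a simple clamp-and-round. Pixel coordinates here are illustrative assumptions; the paper does not specify the on-screen geometry.

```python
def cursor_to_rating(x_px, bar_left_px, bar_width_px):
    """Map the cursor's horizontal position on the bar to a 0-100 SAM
    rating in 1-point steps (0 = leftmost manikin, 100 = rightmost)."""
    frac = (x_px - bar_left_px) / bar_width_px
    frac = min(max(frac, 0.0), 1.0)  # clamp to the ends of the bar
    return round(frac * 100)         # 1-point resolution
```

For a bar starting at x=100 px with width 800 px, a cursor at x=500 px maps to a rating of 50 (the centre, where the cursor starts each trial).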
Figure 2

The Self-Assessment Manikin (SAM).

SAM scale used in the experiment to capture participants’ emotional reactions on three dimensions: (from the top row to the bottom row) the manikin representations to express values of Valence (top), Arousal (mid), and Dominance (bottom).

Skin conductance recording and features extraction

Skin conductance (SC) was measured with a Shimmer3 GSR+ Unit wireless device (Shimmer Sensing, Dublin). Two 8 mm snap-style finger electrodes (Ag–AgCl; GSR electrodes, Shimmer Sensing) with a constant voltage (0.5 V) were attached to the intermediate phalanges of each participant's left index and ring fingers. The SC recording device was connected wirelessly to a PC, and data were digitized through the ShimmerCapture software. The gain parameter was set at 10 μSiemens (μS)/Volt, and the A/D resolution was 12 bit, allowing responses ranging from 2 to 100 μS to be recorded. Each recording was analysed using the MATLAB Ledalab toolbox. As a first step, data were downsampled and cleaned of artefacts (see Technical Validation). Feature extraction was performed via continuous decomposition analysis[41] (CDA). CDA divides the SC signal into a phasic and a tonic component, making it easier to extract features related to particular events (triggers). The event-related features obtained with CDA are shown in Table 2, which has been adapted from the original table presented at www.ledalab.de. Please note that while the trigger eliciting the event-related features was set at the beginning of the stimulus, the time window considered for computing event-related features encompassed the whole duration of the stimulus plus 4 s (for a discussion of the duration of this time window, please refer to Usage Notes and Limitations of the database, below). Standard trough-to-peak (TTP) features were also obtained from the raw signal, as well as global measures (mean and maximum deflection; see Table 2). In general, SC response features have been related on multiple occasions to emotional responses, particularly high-arousal responses[31-33]. Emotional states characterized by high arousal correlate with the activation of a fight-or-flight response. Such a response is regulated by the autonomic nervous system (in particular the sympathetic system), which in turn activates the sudomotor nerve, triggering the release of sweat from the sweat glands in the skin of the hand. An example of an SC signal is shown in Fig. 3.
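CDA proper deconvolves the SC signal with a model of the SCR impulse response. As a rough, non-equivalent illustration of the two preprocessing ideas mentioned above, the sketch below (our own code, not Ledalab's) shows downsampling by keeping every fourth sample, as described for the .SC.clean field, and a crude tonic/phasic split using a rolling median as a stand-in for the slow tonic level.

```python
import statistics

def downsample(sc, factor=4):
    """Keep every `factor`-th sample (the .SC.clean field is described
    as downsampled by a factor of 4)."""
    return sc[::factor]

def tonic_phasic(sc, win=25):
    """Crude stand-in for CDA's decomposition: a centred rolling median
    approximates the slow tonic level; the residual is 'phasic'.
    Real CDA instead deconvolves with an SCR impulse response."""
    half = win // 2
    tonic = [statistics.median(sc[max(0, i - half): i + half + 1])
             for i in range(len(sc))]
    phasic = [s - t for s, t in zip(sc, tonic)]
    return tonic, phasic
```

On a flat trace the phasic residual is zero everywhere, while a brief deflection shows up almost entirely in the phasic component.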
Figure 3

Example of a Skin Conductance (SC) recording as displayed from the analysis software Ledalab.

The blue area indicates the phasic component of the signal, while the grey area represents the tonic component. The red line indicates the trigger (moment of delivery of the stimulus).

Data Records

The SC recordings and self-assessment ratings collected during the experiment are organized in a single MATLAB data structure (Data Citation 1). This data structure also includes demographic information on each participant, as well as baseline SC recordings collected prior to the experiment. The quality of the SC recording is also available as a further structure field (see Technical Validation). Moreover, the "features" field of the structure contains the SC features extracted using Continuous Decomposition Analysis (CDA) with the MATLAB toolbox Ledalab.

multisensory_emotions.m

Hereafter, all the fields of the MATLAB data structure multisensory_emotions.m are explained. MATLAB structures allow data to be organized hierarchically; the hierarchical design of the structure is shown in Fig. 4.
Figure 4

Graphical representation of the MATLAB structure containing all data from each participant.

multisensory_emotions.stim: 50×6 MATLAB array. This field holds information about the stimuli delivered to participants during the experiment. Its columns represent, respectively: the stimulus identification number, the modality of delivery (audio, video, or haptic), the name of the stimulus (IAPS/IADS identification number, artwork title, etc.), the duration of the stimulus/presentation time, the author/composer (when known), and finally the database on which the stimulus was tested (in case of further expansion of the present database by other parties in future studies; see below: Usage Notes and Limitations of the database).

multisensory_emotions.su: the .su field holds the data of one particular participant and is the basic layer of the structure. The number of items at this level is 100, the same as the number of participants who took part in the experiment. The .su field is, in turn, divided into different sub-fields:

.su.demographic: 1×4 MATLAB vector containing demographic information on one particular participant, respectively: the participant's ID number, gender, hand dominance, and age.

.su.SAM: 50×5 MATLAB array holding the ratings of each stimulus, as well as the order in which the stimuli were presented to the participant. The array columns are, respectively: ratings on (1) valence, (2) arousal, and (3) dominance. Column 4 holds the stimulus identification number; the type of stimulus can be looked up by matching this number against the stimulus identification number in multisensory_emotions.stim. Please note that knowing which stimulus was delivered to the participant at any given moment is crucial for correctly interpreting the SC signals. Column 5 marks the presentation order of the stimuli.

.su.SC: the .SC field gives access to the skin conductance data collected during the experiment. The SC data are in turn divided into 7 sub-fields:

.SC.raw: n×2 MATLAB array containing the n samples collected throughout the whole experiment. The first column holds the skin conductance values (μS) collected with the Shimmer3 GSR+ Unit. The second column marks stimulus presentations as SC triggers: the moment of stimulus presentation is marked as 1, while all other samples are 0.

.SC.raw_all: n×2 MATLAB array containing the n samples collected throughout the whole experiment. The first column holds the skin conductance values (μS) collected with the Shimmer3 GSR+ Unit. The second column marks events: the moment of stimulus presentation is marked as 1, and the presentation of each SAM question is marked as 4 (arousal), 5 (valence), or 6 (dominance).

.SC.clean: n×2 MATLAB matrix containing the n samples collected throughout the whole experiment. The data in .SC.clean have been downsampled by a factor of 4 to allow faster reading and processing of the SC trace. Moreover, the data in this sub-field have been cleaned using the Ledalab artefact correction toolbox (see Technical Validation, below).

.SC.baseline: 1×n MATLAB vector including the SC recordings from the participant prior to the beginning of the experiment.

.SC.quality: single value from 0 to 2, where 0 represents unusable data, 1 a partially complete recording, and 2 a complete recording.

.SC.features: 50×12 MATLAB array. The columns represent the 12 features extracted using the Ledalab toolbox for MATLAB (see Technical Validation for more information on the features extracted). The rows represent the stimuli arranged by stimulus number (therefore not by presentation order). Missing values are reported as NA. Each of the 12 features is reported and explained in Table 2.

.SC.responsiveness: single value, either 0 or 1, meant to indicate whether the participant showed any phasic response throughout the experiment. The absence of phasic responses for all stimuli (.SC.responsiveness=0) marks the participant as a possible non-responder, i.e., an individual with low skin conductance, for whom the SC recordings do not show emotional responses even when they are present (as shown by the SAM ratings).

Together with the MATLAB structure, we also provide an R list (multisensory_emotions_R.rda, Data Citation 1). The R list architecture is equivalent to the one shown in Fig. 4. R is open-source software, downloadable at https://cran.r-project.org/, which enables anyone interested in our dataset to access and use it in future studies.
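Navigating the structure amounts to matching column 4 of .su.SAM against the stimulus list, as described above. The function below illustrates that lookup on plain Python lists; loading the actual files would instead use e.g. scipy.io.loadmat or R's load(), and the rows shown here are made up for illustration.

```python
def sam_for_stimulus(sam_rows, stim_id):
    """Return the (valence, arousal, dominance) ratings for one stimulus.

    `sam_rows` mirrors the .su.SAM layout: each row is
    (valence, arousal, dominance, stimulus_id, presentation_order),
    and the lookup key is the stimulus identification number (column 4)."""
    for row in sam_rows:
        if row[3] == stim_id:
            return {"valence": row[0], "arousal": row[1], "dominance": row[2]}
    return None  # stimulus ID not found
```

The same ID can then be used to index the corresponding row of .SC.features, since that array is ordered by stimulus number rather than presentation order.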

Technical Validation

A post-doctoral researcher with over 5 years of experience in behavioural research acquired the data from all 100 participants. Participants were invited to take a comfortable position at the beginning of the experiment and asked not to move their left hand during the whole experiment, to avoid movement artefacts in the SC signal.

Participants reliability and SAM ratings

While for the SC signal it is possible to assess data quality by checking the number of obtained samples and artefacts, it can be harder to determine whether self-assessment questionnaires have been answered with due attention. To investigate the reliability of our data, we compared the ratings obtained for the 10 IAPS and 10 IADS stimuli in our experiment with the values of valence, arousal, and dominance given in the IAPS and IADS manuals[9,13]. The results showed high, significant correlations between our data and the values expected from the two manuals (IADS: r=0.86, p<0.01; IAPS: r=0.79, p<0.01). We also checked whether the response distribution for each stimulus and each SAM dimension approximated normality. Table 3 (available online only) shows the average rating±standard deviation for each stimulus in the arousal, valence, and dominance dimensions. Table 4 (available online only) additionally reports the results of a normality test (Kolmogorov–Smirnov) along with the skewness and kurtosis of the distributions.
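The reliability check described above rests on a Pearson correlation between the ratings collected here and the normative manual values. A minimal implementation is shown below; the sample values in the test are made up for illustration and are not the study's data.

```python
def pearson_r(x, y):
    """Plain Pearson correlation coefficient between two equal-length
    sequences, as one might use to compare collected mean ratings
    against IAPS/IADS manual values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)
```

In practice one would feed it the per-stimulus mean ratings from Table 3 and the corresponding manual norms (scipy.stats.pearsonr additionally returns the p-value).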
Table 3

SAM results for each stimulus, averaged across participants.

Stimulus | Arousal (mean±SD) | Valence (mean±SD) | Dominance (mean±SD)
1 | 67.75±21.24 | 80.63±11.85 | 57.75±25.97
2 | 69.19±22.46 | 76.94±22.46 | 54.68±22.46
3 | 53.7±24.53 | 71.14±24.53 | 57.77±24.53
4 | 58.32±24.39 | 69.17±24.39 | 55.69±24.39
5 | 56.48±21.55 | 57.94±21.55 | 59.9±21.55
6 | 72.41±17.49 | 49.81±17.49 | 45.95±17.49
7 | 45.32±25.08 | 45.76±25.08 | 53.81±25.08
8 | 57.37±24.77 | 45.7±24.77 | 51.76±24.77
9 | 50.51±22.38 | 67.44±22.38 | 66.22±22.38
10 | 72.89±22.95 | 70.02±22.95 | 54.05±22.95
11 | 64.31±21.74 | 77.41±21.74 | 57.84±21.74
12 | 61.85±25.61 | 26.8±25.61 | 46.75±25.61
13 | 47.8±23.22 | 71±23.22 | 64.25±23.22
14 | 50.05±20.46 | 64.06±20.46 | 63.54±20.46
15 | 67.97±23.81 | 18.54±23.81 | 37.89±23.81
16 | 64.56±21.5 | 29.34±21.5 | 43.7±21.5
17 | 78.07±24.17 | 14.39±24.17 | 28.89±24.17
18 | 51.54±22.85 | 40.59±22.85 | 62.16±22.85
19 | 54.95±22.3 | 69.41±22.3 | 63.39±22.3
20 | 59.67±20.27 | 55.47±20.27 | 59.22±20.27
21 | 31.78±22.8 | 56.66±22.8 | 71.77±22.8
22 | 30.35±23.22 | 47.25±23.22 | 71.14±23.22
23 | 47.14±24.73 | 61.52±24.73 | 69.48±24.73
24 | 37.56±22.07 | 50.07±22.07 | 68.47±22.07
25 | 47.35±21.69 | 60.96±21.69 | 66.35±21.69
26 | 36.96±24.4 | 53.89±24.4 | 71.05±24.4
27 | 36.33±23.56 | 48.9±23.56 | 70.22±23.56
28 | 37.26±23.12 | 52.92±23.12 | 68.72±23.12
29 | 40.59±22.88 | 51.37±22.88 | 69.69±22.88
30 | 47.39±26.4 | 64.25±26.4 | 69.58±26.4
31 | 56.74±26.44 | 25.84±26.44 | 47.26±26.44
32 | 62.75±23.71 | 24.91±23.71 | 43.21±23.71
33 | 51.52±23.5 | 75.13±23.5 | 65±23.5
34 | 53.75±22.46 | 77.73±22.46 | 59.73±22.46
35 | 56.23±21.66 | 23.91±21.66 | 50.52±21.66
36 | 63.79±22.53 | 19.7±22.53 | 47.29±22.53
37 | 50.43±26.12 | 78.89±26.12 | 65.42±26.12
38 | 26.1±22.79 | 54.37±22.79 | 79.36±22.79
39 | 44.18±24.15 | 57.49±24.15 | 68.67±24.15
40 | 50.44±21.99 | 72.41±21.99 | 69.53±21.99
41 | 52.51±21.03 | 65.94±21.03 | 58.28±21.03
42 | 44.94±21.5 | 57.38±21.5 | 62.51±21.5
43 | 44.77±23.61 | 60.28±23.61 | 62.87±23.61
44 | 55.24±22.85 | 62.25±22.85 | 56.74±22.85
45 | 38.3±23.28 | 54.6±23.28 | 66.08±23.28
46 | 44.93±22.78 | 59.2±22.78 | 62.69±22.78
47 | 41.31±23.17 | 54.81±23.17 | 63.74±23.17
48 | 40.31±21.88 | 56.26±21.88 | 65.13±21.88
49 | 46.74±22.01 | 57.83±22.01 | 63.43±22.01
50 | 36.64±24.14 | 55.74±24.14 | 67.15±24.14
Table 4

SAM results normality assessment. A=Arousal, V=Valence, D=Dominance.

Stimulus | KS p (A) | KS p (V) | KS p (D) | Skewness (A) | Skewness (V) | Skewness (D) | Kurtosis (A) | Kurtosis (V) | Kurtosis (D)
1 | 8.14E-86 | 2.45E-89 | 1.44E-87 | −1.13 | −0.36 | 0.02 | 4.29 | 2.97 | 1.91
2 | 8.24E-86 | 2.45E-89 | 2.45E-89 | −1.14 | −0.76 | 0.18 | 4.09 | 3.39 | 1.8
3 | 8.24E-86 | 2.45E-89 | 2.45E-89 | −0.28 | −0.61 | −0.09 | 2.13 | 3.03 | 1.82
4 | 8.24E-86 | 2.45E-89 | 4.26E-89 | −0.5 | −0.68 | −0.02 | 2.65 | 2.65 | 1.91
5 | 1.40E-85 | 2.45E-89 | 1.44E-87 | −0.59 | 0.24 | −0.24 | 2.93 | 2.49 | 2.12
6 | 1.44E-87 | 2.45E-89 | 1.44E-87 | −1.09 | 0.07 | 0.5 | 5.68 | 2.13 | 2.15
7 | 2.55E-77 | 1.40E-85 | 1.44E-87 | −0.14 | 0.23 | 0.15 | 1.98 | 2.23 | 1.82
8 | 4.41E-84 | 4.26E-89 | 1.44E-87 | −0.51 | 0.34 | 0.36 | 2.57 | 2.06 | 1.92
9 | 2.45E-89 | 2.45E-89 | 2.48E-89 | −0.23 | −0.43 | −0.38 | 2.25 | 3.28 | 2.07
10 | 2.45E-89 | 1.44E-87 | 1.44E-87 | −1.3 | −0.92 | −0.01 | 4.36 | 3.32 | 1.89
11 | 8.14E-86 | 2.45E-89 | 2.45E-89 | −0.87 | −0.87 | −0.11 | 3.63 | 2.58 | 1.95
12 | 7.53E-84 | 1.93E-80 | 4.26E-89 | −0.85 | 0.47 | 0.43 | 3.04 | 3.04 | 2.05
13 | 2.29E-82 | 2.45E-89 | 2.45E-89 | −0.26 | −0.21 | −0.18 | 2.33 | 2.41 | 2.07
14 | 1.14E-80 | 2.45E-89 | 2.45E-89 | −0.67 | −0.15 | −0.17 | 2.77 | 2.18 | 2.03
15 | 2.29E-82 | 5.60E-66 | 5.55E-79 | −1.03 | 1.68 | 0.65 | 4.05 | 8.44 | 2.54
16 | 4.41E-84 | 1.93E-80 | 1.14E-80 | −1.01 | 1.21 | 0.38 | 4.21 | 5.48 | 2.45
17 | 4.41E-84 | 2.31E-63 | 2.31E-63 | −1.68 | 0.86 | 0.96 | 5.54 | 2.79 | 2.85
18 | 1.11E-75 | 1.46E-87 | 2.45E-89 | −0.71 | −0.27 | −0.07 | 2.87 | 4.17 | 1.97
19 | 1.44E-87 | 2.45E-89 | 2.45E-89 | −0.59 | −0.12 | −0.19 | 2.59 | 1.99 | 1.77
20 | 8.14E-86 | 1.44E-87 | 2.45E-89 | −0.76 | −0.27 | 0.18 | 3.39 | 2.6 | 1.83
21 | 3.15E-72 | 2.45E-89 | 2.45E-89 | 0.3 | 0.08 | −0.63 | 1.93 | 4.1 | 2.35
22 | 4.56E-69 | 1.44E-87 | 1.44E-87 | 0.53 | 0.24 | −0.69 | 2.16 | 5.28 | 2.46
23 | 2.52E-77 | 2.45E-89 | 1.44E-87 | −0.48 | 0.37 | −0.64 | 2.13 | 2.75 | 2.76
24 | 5.55E-79 | 1.44E-87 | 1.44E-87 | −0.02 | 0.16 | −0.46 | 2.13 | 3.64 | 2.47
25 | 7.53E-84 | 2.45E-89 | 1.44E-87 | −0.28 | 0.11 | −0.34 | 2.42 | 2.73 | 2.3
26 | 4.21E-77 | 1.44E-87 | 1.44E-87 | 0.16 | 0.19 | −0.59 | 2.08 | 4.05 | 2.68
27 | 1.22E-70 | 2.45E-89 | 2.48E-89 | 0.1 | 0.19 | −0.53 | 2.18 | 3.83 | 2.39
28 | 7.49E-71 | 2.45E-89 | 1.44E-87 | −0.07 | 0.38 | −0.53 | 1.83 | 3.49 | 2.35
29 | 9.21E-79 | 2.45E-89 | 4.26E-89 | −0.03 | −0.05 | −0.51 | 1.97 | 2.95 | 2.22
30 | 4.72E-74 | 2.45E-89 | 1.44E-87 | −0.23 | 0.26 | −0.54 | 2.21 | 2.1 | 2.57
31 | 2.29E-82 | 1.85E-64 | 3.90E-82 | −0.41 | 0.55 | 0.23 | 2.38 | 3.14 | 1.66
32 | 4.41E-84 | 1.85E-75 | 4.46E-84 | −0.79 | 0.43 | 0.42 | 3.03 | 3.05 | 2.06
33 | 1.16E-80 | 2.45E-89 | 2.45E-89 | −0.61 | −0.34 | −0.47 | 2.71 | 2.29 | 2.25
34 | 4.41E-84 | 2.45E-89 | 2.45E-89 | −0.5 | −0.75 | −0.23 | 2.69 | 2.98 | 1.84
35 | 2.29E-82 | 4.72E-74 | 8.24E-86 | −0.75 | 0.15 | 0.21 | 3.57 | 2.52 | 2.13
36 | 4.41E-84 | 1.85E-64 | 1.44E-87 | −0.96 | 0.56 | 0.38 | 3.71 | 2.5 | 1.96
37 | 4.41E-84 | 2.45E-89 | 4.26E-89 | −0.05 | −0.92 | −0.34 | 2.08 | 3.44 | 2.19
38 | 2.31E-63 | 2.45E-89 | 2.45E-89 | 0.55 | 1.03 | −1.03 | 2.04 | 4.68 | 3.17
39 | 1.13E-75 | 2.45E-89 | 8.14E-86 | −0.33 | 0.43 | −0.71 | 2.08 | 2.9 | 2.81
40 | 8.14E-86 | 2.45E-89 | 4.26E-89 | −0.35 | −0.06 | −0.66 | 2.72 | 1.99 | 2.67
41 | 8.24E-86 | 2.45E-89 | 2.45E-89 | −0.6 | 0.17 | −0.01 | 3.01 | 2.21 | 1.84
42 | 4.46E-84 | 2.45E-89 | 2.45E-89 | −0.33 | 0.38 | −0.04 | 2.43 | 3.69 | 1.75
43 | 1.58E-78 | 2.45E-89 | 2.45E-89 | −0.36 | 0.05 | −0.15 | 2.19 | 3.34 | 2.08
44 | 4.46E-84 | 2.45E-89 | 2.45E-89 | −0.7 | −0.21 | 0.26 | 2.95 | 2.72 | 1.86
45 | 9.21E-79 | 2.45E-89 | 2.45E-89 | 0.04 | 1.05 | −0.17 | 2.02 | 3.96 | 1.62
46 | 2.32E-82 | 2.45E-89 | 2.45E-89 | −0.05 | 0.52 | −0.2 | 2.58 | 2.91 | 1.99
47 | 2.49E-87 | 2.45E-89 | 2.45E-89 | 0.06 | 0.63 | −0.16 | 2.01 | 3.34 | 1.91
48 | 5.48E-79 | 2.45E-89 | 2.45E-89 | 0.03 | 0.31 | −0.11 | 2.12 | 3.11 | 1.84
49 | 3.90E-82 | 2.45E-89 | 2.45E-89 | −0.37 | −0.03 | −0.08 | 2.27 | 2.96 | 2.19
50 | 9.21E-79 | 1.44E-87 | 2.48E-89 | 0.39 | −0.26 | −0.5 | 2.51 | 4.57 | 2.46

Skin conductance signal processing

From each participant, we obtained two SC traces. The first trace was collected in the absence of stimulation for about 60 s (.SC.baseline). The second trace was collected for the whole duration of the experiment. Both traces were examined using the Ledalab MATLAB toolbox, which allows each loaded trace to be visually inspected and corrected for movement artefacts. Before artefact correction, the signal was downsampled (by a factor of 8) to allow faster processing of the data. Movement artefacts were identified after careful visual inspection and corrected by fitting a spline. An example of such a correction can be seen in Fig. 5.
Figure 5

Example of spline fitting to correct a signal artefact.

The spikes in the SC signal (black trace) are likely a result of participant movement. The spike is identified through careful visual inspection, and a spline (red line) is fitted to the data to exclude the artefact and recover the signal.
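Outside Ledalab, the same two preprocessing steps (downsampling by 8 and spline-based artefact correction) can be sketched in Python with SciPy; the trace, artefact window, and smoothing factor below are all illustrative:

```python
import numpy as np
from scipy.signal import decimate
from scipy.interpolate import UnivariateSpline

fs = 512                                  # recording rate reported below
t = np.arange(0, 30, 1 / fs)
sc = 15 + 0.5 * np.sin(0.2 * t)           # toy skin-conductance trace (µS)

sc_ds = decimate(sc, 8)                   # downsample by a factor of 8 (to 64 Hz)
sc_ds[640:660] += 5.0                     # inject a movement-artefact spike

# Window flagged by visual inspection; fit a smoothing spline to the
# surrounding samples and replace the artefact with the spline estimate.
artefact = np.zeros(sc_ds.size, dtype=bool)
artefact[635:665] = True
idx = np.arange(sc_ds.size)
spline = UnivariateSpline(idx[~artefact], sc_ds[~artefact], s=1.0)
sc_clean = sc_ds.copy()
sc_clean[artefact] = spline(idx[artefact])
```

`decimate` applies an anti-aliasing filter before keeping every eighth sample, which is preferable to naive slicing; the smoothing factor `s` controls how closely the spline follows the surrounding signal.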

The SC signal ranged between 10.01 μS and 28.34 μS (mean: 15.55, SD: 1.67, computed on SC traces rated one or higher, see .SC.quality). The collected SC values are in line with typical SC levels in humans, with baseline SC usually lower than the SC recorded during the experiment[29]. Downsampled and artefact-corrected data are available at .SC.clean. Raw data at the full sampling rate (512 Hz) are available at .SC.raw. The skin conductance signal is known to change over time[31,32]. In particular, the tonic component of the signal (Fig. 3) is expected to vary[41] over relatively long periods of time. By contrast, the phasic component of the signal, if correctly extracted from the raw data, is expected to be tied only to the eliciting events (that is, the emotional stimuli). However, it is important to consider that the effect of the triggering event on the SC signal might also decrease after several presentations of emotional stimuli, as the participant could habituate to the emotional stimulation itself. In other words, by being emotionally stimulated repeatedly, a participant could grow accustomed to the emotional impact of the eliciting stimuli, and thus not be as scared, moved, or aroused by a stimulus as they would have been had it been presented at the beginning of the experiment. We used Spearman rank-order correlation to assess whether the position a stimulus held in the sequence of trials (that is, whether it was presented first, second, third, etc.) correlated with lower (or higher) values of particular phasic features. Results are shown in Table 5. As the table shows, correlations were low but nonetheless present, and users will have to take this into account when using the database.
Table 5

Spearman rank-order correlations between SCR features and the position at which stimuli were presented to participants.

Feature | rho | p-value (95% confidence interval)
CDA_nSCR | −0.270 | <0.01
CDA_Latency | −0.107 | <0.01
CDA_AmpSum | −0.214 | <0.01
CDA_SCR | 0.007 | 0.59
CDA_ISCR | −0.263 | <0.01
CDA_PhasicMax | −0.114 | <0.01
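The habituation check summarized in Table 5 can be reproduced for any one feature along these lines, assuming a long-format pairing of presentation position and feature value pooled over participants (the data below are synthetic):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# 50 stimuli x 100 participants in long format: position in the trial
# sequence and a synthetic phasic feature with mild habituation built in.
position = np.tile(np.arange(1, 51), 100)
amp_sum = rng.exponential(0.3, position.size) - 0.004 * position

rho, p = spearmanr(position, amp_sum)
print(f"rho={rho:.3f}, p={p:.3g}")
```

With a weak drift buried in noise, the correlation is small but reliably negative, the same qualitative pattern reported for most features in Table 5.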

Usage Notes and Limitations of the database

The proposed database is available at (Data Citation 1) and can be used for several different applications. These data are of clear interest for the fields investigating human emotional reactions and automatic emotion recognition, from psychology to computer science. We strongly encourage the use of the stimuli contained in this database for experiments where emotional stimulation is needed, particularly when it is not delivered through “traditional” means (such as pictures). In particular, the emotional ratings and reactions to haptic stimuli constitute a first, albeit small, strongly validated database for affective haptics. The abstract art pieces and instrumental pieces are also unconventional, validated stimuli. At the same time, the SC traces and features could be used to train automatic systems to recognize human emotions, given the sensory modality through which the stimulus was delivered. Furthermore, the relationship between SAM ratings, SC features, and the sensory modality involved in the stimulation could open new paths for studying how emotions are communicated and interpreted by humans.

To facilitate access to the data within the database, a MATLAB function and its equivalent R function (SCR_statistics.m and SCRstatistics.R) have been prepared; both are available at (Data Citation 1). These functions compute the mean and standard deviation of the selected SCR features across all participants, for each stimulus. They also allow the user to select the participants over which to compute the descriptive statistics, based on the quality of their SC.

However, we would also point out some limitations of the current database, to help future users interpret the results of its analysis. First, this database is far from comprehensive of all the possible stimuli available in the three sensory modalities. The number of conditions tested strongly reduced the number of samples we could use, as we wanted to keep the experiment to a reasonable duration. Any generalization of the data contained in this database should therefore be limited to the stimuli tested here. Importantly, we encourage other scientists to expand the database themselves while maintaining the same experimental design, in which multisensory emotion-eliciting stimuli are delivered to users in a within-subjects experiment. To facilitate the inclusion of further stimuli and participants’ responses, we developed an R-based graphical user interface which allows other researchers to merge their new data into the R list. The interface is available at (Data Citation 1) and has been developed using the open-source R package shiny. It provides information about the required data format and allows a database to be included through a series of guided steps, even when the SC or SAM fields are missing. Second, SCRs were computed over different time frames: the response window in which SCRs were considered depends on the duration of the stimulus. This choice was motivated by the fact that for haptic and (abstract) auditory stimuli, any truncation would have resulted in an incomplete pattern or melody, most likely frustrating the participant. Finally, our last concern relates to the haptic stimuli used. Although the positioning of the hand was relatively constant across trials, facilitated by the foam support on top of the haptic device, hand size differed across participants. This could lead the haptic stimuli to fall on slightly different locations on the hand depending on the participant. Haptic stimulation may therefore not have been identical across participants, whereas audio and visual stimulation was.
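To illustrate what the SCR_statistics helpers compute, here is a rough Python sketch. The record layout and field names are assumptions for illustration; the published SCR_statistics.m and SCRstatistics.R are the authoritative implementations:

```python
import numpy as np

def scr_statistics(records, feature, min_quality=1):
    """Mean and SD of one SCR feature per stimulus, restricted to
    participants whose SC trace quality rating meets the threshold."""
    kept = [r for r in records if r["quality"] >= min_quality]
    out = {}
    for stim in sorted({r["stimulus"] for r in kept}):
        vals = np.array([r[feature] for r in kept if r["stimulus"] == stim])
        out[stim] = (vals.mean(), vals.std(ddof=1))
    return out

# Three toy records for stimulus 1; the middle one is dropped (quality 0),
# mirroring the quality-based participant selection described above.
records = [
    {"stimulus": 1, "quality": 2, "CDA_AmpSum": 0.41},
    {"stimulus": 1, "quality": 0, "CDA_AmpSum": 0.05},
    {"stimulus": 1, "quality": 1, "CDA_AmpSum": 0.37},
]
stats_by_stimulus = scr_statistics(records, "CDA_AmpSum")
```

Here the mean for stimulus 1 is computed over only the two retained participants; `ddof=1` gives the sample standard deviation.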

Additional information

How to cite this article: Gatti, E. et al. Emotional ratings and skin conductance response to visual, auditory and haptic stimuli. Sci. Data 5:180120 doi: 10.1038/sdata.2018.120 (2018).

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References (11 in total)

1.  The Emotional Movie Database (EMDB): a self-report and psychophysiological study.

Authors:  Sandra Carvalho; Jorge Leite; Santiago Galdo-Álvarez; Oscar F Gonçalves
Journal:  Appl Psychophysiol Biofeedback       Date:  2012-12

2.  The visual analog scale allows effective measurement of preoperative anxiety and detection of patients' anesthetic concerns.

Authors:  C H Kindler; C Harms; F Amsler; T Ihde-Scholl; D Scheidegger
Journal:  Anesth Analg       Date:  2000-03       Impact factor: 5.108

3.  Mapping the semantic space for the subjective experience of emotional responses to odors.

Authors:  Christelle Chrea; Didier Grandjean; Sylvain Delplanque; Isabelle Cayeux; Bénédicte Le Calvé; Laurence Aymard; Maria Inés Velazco; David Sander; Klaus R Scherer
Journal:  Chem Senses       Date:  2008-09-11       Impact factor: 3.160

4.  When feeling bad makes you look good: guilt, shame, and person perception.

Authors:  Deborah C Stearns; W Gerrod Parrott
Journal:  Cogn Emot       Date:  2012

5.  Measuring emotion: the Self-Assessment Manikin and the Semantic Differential.

Authors:  M M Bradley; P J Lang
Journal:  J Behav Ther Exp Psychiatry       Date:  1994-03

6.  Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation.

Authors:  O Alaoui-Ismaïli; O Robin; H Rada; A Dittmar; E Vernet-Maury
Journal:  Physiol Behav       Date:  1997-10

7.  The Geneva affective picture database (GAPED): a new 730-picture database focusing on valence and normative significance.

Authors:  Elise S Dan-Glauser; Klaus R Scherer
Journal:  Behav Res Methods       Date:  2011-06

8.  A continuous measure of phasic electrodermal activity.

Authors:  Mathias Benedek; Christian Kaernbach
Journal:  J Neurosci Methods       Date:  2010-05-06       Impact factor: 2.390

9.  Measures of emotion: A review.

Authors:  Iris B Mauss; Michael D Robinson
Journal:  Cogn Emot       Date:  2009-02-01

10.  Influence of Tempo and Rhythmic Unit in Musical Emotion Regulation.

Authors:  Alicia Fernández-Sotos; Antonio Fernández-Caballero; José M Latorre
Journal:  Front Comput Neurosci       Date:  2016-08-03       Impact factor: 2.380

  5 in total

1.  Altered electromyographic responses to emotional and pain information in awake bruxers: case-control study.

Authors:  Xabier Soto-Goñi; María García-Gonzalez; Ignacio Ardizone-García; Teresa Sánchez-Sánchez; Laura Jiménez-Ortega
Journal:  Clin Oral Investig       Date:  2022-02-28       Impact factor: 3.573

2.  A dataset of continuous affect annotations and physiological signals for emotion analysis.

Authors:  Karan Sharma; Claudio Castellini; Egon L van den Broek; Alin Albu-Schaeffer; Friedhelm Schwenker
Journal:  Sci Data       Date:  2019-10-09       Impact factor: 6.444

3.  Misophonia: Analysis of the neuroanatomic patterns at the basis of psychiatric symptoms and changes of the orthosympathetic/ parasympathetic balance.

Authors:  Elena Grossini; Alessandro Stecco; Carla Gramaglia; Daniel De Zanet; Roberto Cantello; Benedetta Gori; Davide Negroni; Danila Azzolina; Daniela Ferrante; Alessandro Feggi; Alessandro Carriero; Patrizia Zeppegno
Journal:  Front Neurosci       Date:  2022-08-11       Impact factor: 5.152

4.  MsWH: A Multi-Sensory Hardware Platform for Capturing and Analyzing Physiological Emotional Signals.

Authors:  David Asiain; Jesús Ponce de León; José Ramón Beltrán
Journal:  Sensors (Basel)       Date:  2022-08-02       Impact factor: 3.847

5.  Human Emotion Recognition: Review of Sensors and Methods. (Review)

Authors:  Andrius Dzedzickis; Artūras Kaklauskas; Vytautas Bucinskas
Journal:  Sensors (Basel)       Date:  2020-01-21       Impact factor: 3.576

