Melanie Sauerland, Linsey H. C. Raymaekers, Henry Otgaar, Amina Memon, Thijs T. Waltjen, Maud Nivo, Chiel Slegers, Nick J. Broers, Tom Smeets.
Abstract
In the eyewitness identification literature, stress and arousal at the time of encoding are considered to adversely influence identification performance. This assumption contrasts with findings from the neurobiology of learning and memory, which show that stress and stress hormones are critically involved in forming enduring memories. The discrepancy may be related to methodological differences between the two fields, such as the tendency toward immediate testing or very short (1-2 hour) retention intervals in eyewitness research, whereas neurobiology studies typically use retention intervals of at least 24 hours. Other differences concern the extent to which stress-responsive systems (i.e., the hypothalamic-pituitary-adrenal axis) are effectively stimulated under laboratory conditions. The aim of the current study was to conduct an experiment that accounts for the contemporary state of knowledge in both fields. In all, 123 participants witnessed a live staged theft while being exposed to a laboratory stressor that reliably elicits autonomic and glucocorticoid stress responses, or while performing a control task. Salivary cortisol levels were measured to verify the effectiveness of the stress induction. One week later, participants attempted to identify the thief from target-present and target-absent line-ups. According to regression and receiver operating characteristic (ROC) analyses, stress did not have robust detrimental effects on identification performance.
Keywords: stress; eyewitness identification performance; stress-induced cortisol responses
Year: 2016 PMID: 27417874 PMCID: PMC5129533 DOI: 10.1002/bsl.2249
Source DB: PubMed Journal: Behav Sci Law ISSN: 0735-3936
Figure 1. Salivary cortisol levels (nmol/L) over time for: (a) high responders (high-stress condition), low responders (high-stress condition) and low-stress participants; and (b) the high- and low-stress conditions. t pre-stress, measurement before administration of the Maastricht Acute Stress Test (MAST)/control task; t +0 min, measurement upon termination of the MAST/control task; t +10 min, measurement 10 min after termination of the MAST/control task; t +20 min, measurement 20 min after termination of the MAST/control task.
Figure 2. Receiver operating characteristic (ROC) plots for the high- and low-stress conditions. The dashed line represents chance performance.
Figure 3. Confidence-accuracy characteristic (CAC) analysis for the high- and low-stress conditions. Low confidence refers to 30-60% confidence (suspect selections with confidence < 30% did not occur), medium confidence refers to 70-80% confidence, and high confidence to 90-100%. Error bars represent standard errors.
Percentage of identification outcomes for the high‐ and low‐stress conditions
| Identification outcome (%) | High stress | Low stress | Total |
|---|---|---|---|
| *Target-present line-ups* | | | |
| Hits (correct identifications) | 53.3 | 53.3 | 53.3 |
| Foil choices | 6.7 | 0.0 | 3.3 |
| False rejections | 33.3 | 40.0 | 36.7 |
| Don't know responses | 6.7 | 6.7 | 6.7 |
| *Target-absent line-ups* | | | |
| Correct rejections | 59.4 | 80.6 | 69.8 |
| False alarms | 25.0 | 12.9 | 19.0 |
| Don't know responses | 15.6 | 6.5 | 11.1 |
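The hit and false-alarm percentages above are the raw ingredients of the ROC comparison reported in the abstract. As a minimal illustrative sketch (not the authors' analysis code), the snippet below pairs each condition's correct-identification rate with its false-alarm rate and also computes the diagnosticity ratio (hits divided by false alarms), a summary measure sometimes reported alongside ROC analysis in the eyewitness literature; the dictionary structure and variable names are my own.

```python
# Illustrative sketch: pair hit and false-alarm rates per condition
# using the percentages reported in the table above.
rates = {
    "high_stress": {"hit": 53.3, "false_alarm": 25.0},
    "low_stress":  {"hit": 53.3, "false_alarm": 12.9},
}

for condition, r in rates.items():
    # Diagnosticity ratio: correct IDs per false alarm. Identical hit
    # rates but fewer false alarms give low stress the higher ratio.
    diagnosticity = r["hit"] / r["false_alarm"]
    print(f"{condition}: hit={r['hit']}%, "
          f"FA={r['false_alarm']}%, "
          f"diagnosticity={diagnosticity:.2f}")
```

Note that a full ROC analysis, as in Figure 2, traces these (false-alarm, hit) pairs across confidence criteria rather than at a single operating point.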