Liling Zheng, Ping Huang, Xiao Zhong, Tianfeng Li, Lei Mo.
Abstract
Recent theories propose that language comprehension can influence perception at low levels of the perceptual system. Here, we used an adaptation paradigm to test whether processing language causes color adaptation in the visual system. After prolonged exposure to a color linguistic context, which depicted red, green, or non-specific color scenes, participants immediately performed a color detection task, indicating whether or not they saw a green square in the middle of a white screen. We found that participants were more likely to perceive the green square after listening to discourses denoting red than after discourses denoting green or conveying non-specific color information, revealing that language comprehension caused an adaptation aftereffect at the perceptual level. The semantic representation of color may therefore share a common neural substrate with color perception. These results are in line with the simulation view of embodied language comprehension theory, which predicts that processing language reactivates the sensorimotor systems engaged during real experience.
Year: 2017 PMID: 28358807 PMCID: PMC5373511 DOI: 10.1371/journal.pone.0173755
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Frequency of each test stimulus.
| Size | alpha = 0 | alpha = 1% | alpha = 2% | alpha = 3% |
|---|---|---|---|---|
| Small (2.7×2.7) | 4 | 3 | 3 | 1 |
| Large (5.0×5.0) | | 3 | 3 | 1 |
Mean accuracy of different color squares in three conditions (n = 32).
| Condition | Blank (alpha = 0) | Small (alpha = 1) | Small (alpha = 2) | Small (alpha = 3) | Large (alpha = 1) | Large (alpha = 2) | Large (alpha = 3) | Average |
|---|---|---|---|---|---|---|---|---|
| complementary | 94.5% | 71.9% | 86.5% | 90.6% | 17.7% | 62.5% | 81.3% | 60.6% |
| congruent | 85.0% | 62.1% | 82.3% | 90.6% | 20.8% | 57.9% | 87.5% | 58.6% |
| unrelated | 88.3% | 58.3% | 84.4% | 93.8% | 15.6% | 68.7% | 78.1% | 58.3% |
Mean of median response times (ms) for different color squares in three conditions (n = 32).
| Condition | Blank (alpha = 0) | Small (alpha = 1) | Small (alpha = 2) | Small (alpha = 3) | Large (alpha = 1) | Large (alpha = 2) | Large (alpha = 3) | Average |
|---|---|---|---|---|---|---|---|---|
| complementary | 1002.2 | 971.8 | 909.2 | 916.5 | 1058.2 | 1016.1 | 1125.2 | 995.3 |
| congruent | 1019.2 | 1022.8 | 903.3 | 917.5 | 1071.2 | 1023.8 | 1033.6 | 1001.7 |
| unrelated | 1031.0 | 978.5 | 969.4 | 867.1 | 1076.8 | 1004.3 | 1006.5 | 1005.1 |
Mean accuracy for each level of transparency in three conditions (n = 32).
| Condition | alpha = 0 | alpha = 1 | alpha = 2 | alpha = 3 |
|---|---|---|---|---|
| complementary | 94.5% | 44.8% | 74.5% | 85.9% |
| congruent | 85.0% | 40.6% | 50.7% | 54.9% |
| unrelated | 88.3% | 37.0% | 76.1% | 85.9% |
Estimates of P(hit) and P(false) in three conditions (n = 32).
| | P(hit), complementary | P(hit), congruent | P(hit), unrelated | P(false), complementary | P(false), congruent | P(false), unrelated |
|---|---|---|---|---|---|---|
| M | 0.63 | 0.60 | 0.61 | 0.05 | 0.15 | 0.12 |
| SD | 0.19 | 0.17 | 0.19 | 0.14 | 0.22 | 0.17 |
Estimates of A’ and B” in three conditions (n = 32).
| | A’, complementary | A’, congruent | A’, unrelated | B”, complementary | B”, congruent | B”, unrelated |
|---|---|---|---|---|---|---|
| M | 0.88 | 0.82 | 0.84 | 0.84 | 0.64 | 0.64 |
| SD | 0.10 | 0.13 | 0.11 | 0.39 | 0.47 | 0.48 |
Note. Higher A’ values indicate better discrimination ability; higher B” values indicate a relative bias to respond “No,” whereas lower B” values indicate a relative bias to respond “Yes.”
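The record does not reproduce the formulas behind A’ and B”. Assuming the conventional non-parametric signal-detection definitions (commonly attributed to Pollack and Norman, 1964, for A’ and Grier, 1971, for B”) in terms of the hit rate H = P(hit) and false-alarm rate F = P(false), they are, for H ≥ F:

$$
A' = \frac{1}{2} + \frac{(H - F)(1 + H - F)}{4H(1 - F)}, \qquad
B'' = \frac{H(1 - H) - F(1 - F)}{H(1 - H) + F(1 - F)}.
$$

As an illustrative check, inserting the group means from the P(hit)/P(false) table above into the A’ formula gives approximately 0.88 (complementary), 0.82 (congruent), and 0.84 (unrelated), in line with the tabled means; exact agreement is not expected, since the reported values are presumably averages of per-participant estimates.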
Fig 1. Results of A’ and B”.
“R” means red discourse, representing the complementary condition; “G” means green discourse, representing the congruent condition; “S” means neutral discourse, representing the unrelated condition.