| Literature DB >> 35185682 |
Judee K. Burgoon, Rebecca Xinran Wang, Xunyu Chen, Tina Saiying Ge, Bradley Dorn.
Abstract
Social relationships are constructed by and through the relational communication that people exchange. Relational messages are implicit nonverbal and verbal messages that signal how people regard one another and define their interpersonal relationships: equal or unequal, affectionate or hostile, inclusive or exclusive, similar or dissimilar, and so forth. Such signals can be measured automatically by the latest machine learning software tools and combined into meaningful factors that represent the socioemotional expressions that constitute relational messages between people. Relational messages operate continuously on a parallel track with verbal communication, implicitly telling interactants the current state of their relationship and how to interpret the verbal messages being exchanged. We report an investigation that explored how group members signal these implicit messages through multimodal behaviors measured by sensor data and linked to the socioemotional cognitions interpreted as relational messages. Using a modified Brunswikian lens model, we predicted perceived relational messages of dominance, affection, involvement, composure, similarity, and trust from automatically measured kinesic, vocalic, and linguistic indicators. The relational messages in turn predicted the veracity of group members. The Brunswikian lens model offers a way to connect objective behaviors exhibited by social actors to the emotions and cognitions perceived by other interactants, and to link those perceptions to social outcomes. This method can be used to ascertain which behaviors and/or perceptions are associated with judgments of an actor's veracity. Computerized measurements of behaviors and perceptions can replace manual measurements, significantly expediting analysis and drilling down to micro-level measurement in a previously unavailable manner.
Keywords: affection; dominance; involvement; nervousness; nonverbal communication; relational communication; similarity; trust
Year: 2022 PMID: 35185682 PMCID: PMC8847219 DOI: 10.3389/fpsyg.2021.781487
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
FIGURE 1. Brunswikian lens model of relational communication.
Significant linguistic, vocalic, and facial cues of dominance, affection, composure, involvement, similarity, and trust (p < 0.1).
| Constructs | Linguistic Cues | Vocalic Cues | Facial Cues |
| --- | --- | --- | --- |
| Dominance-Non-dominance | Number of words (+) | Turn-at-talk duration (+) | Mean cheek raiser (−) |
| Affection-Hostility | Number of sentences (+) | Turn-at-talk duration (+) | Mean cheek raiser (−) |
| Composure-Nervousness | Disfluency ratio (−) | Average loudness (+) | Mean upper lip raiser (−) |
| Involvement-Detachment | Number of words (+) | Turn-at-talk duration (+) | Mean cheek raiser (−) |
| Similarity-Dissimilarity | Number of sentences (+) | Standard deviation of harmonic-to-noise ratio (+) | Mean inner brow raiser (−) |
| Trust-Distrust | Number of sentences (+) | Turn-at-talk duration (+) | Mean cheek raiser (−) |
Positive (or negative) signs in parentheses indicate significant positive (or negative) unstandardized beta weights in regression analyses between the behavioral cue and the focal relational message construct.
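The two-stage logic of the lens model described above (cue → perceived relational message → veracity judgment) can be sketched as a pair of regressions. The sketch below is illustrative only: it uses made-up numbers for one cue (turn-at-talk duration) and one construct (perceived dominance), not data from the study, and a plain ordinary-least-squares fit rather than the authors' full multivariate analysis.

```python
# Hypothetical two-stage lens-model sketch. All numeric data are invented
# for illustration; the study's actual models used many cues and constructs.

def simple_ols(x, y):
    """Return (intercept, slope) of the ordinary least squares fit y = a + b*x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Cue: turn-at-talk duration in seconds per speaker (hypothetical values)
turn_duration = [10.0, 25.0, 40.0, 55.0, 70.0]
# Perception: rated dominance of each speaker (hypothetical, 1-7 scale)
perceived_dominance = [2.0, 3.1, 4.2, 5.0, 6.1]
# Outcome: rated veracity of each speaker (hypothetical, 1-7 scale)
veracity = [5.5, 5.0, 4.4, 3.9, 3.2]

# Stage 1: behavioral cue -> perceived relational message
a1, b1 = simple_ols(turn_duration, perceived_dominance)
# Stage 2: perceived relational message -> social outcome (veracity)
a2, b2 = simple_ols(perceived_dominance, veracity)

print(f"dominance = {a1:.2f} + {b1:.3f} * turn_duration")
print(f"veracity  = {a2:.2f} + {b2:.3f} * dominance")
```

In this toy data, longer turns at talk predict higher perceived dominance (positive slope, matching the "+" entry in the table), and higher perceived dominance in turn predicts lower veracity ratings; the signs, not the magnitudes, are the point of the sketch.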