Tim Whiting, Alvika Gautam, Jacob Tye, Michael Simmons, Jordan Henstrom, Mayada Oudah, Jacob W. Crandall.
Abstract
Many technical and psychological challenges make it difficult to design machines that effectively cooperate with people. To better understand these challenges, we conducted a series of studies investigating human-human, robot-robot, and human-robot cooperation in a strategically rich resource-sharing scenario, which required players to balance efficiency, fairness, and risk. In these studies, both human-human and robot-robot dyads typically learned efficient and risky cooperative solutions when they could communicate. In the absence of communication, robot dyads still often learned the same efficient solution, but human dyads achieved a less efficient (less risky) form of cooperation. This difference in how people and machines treat risk appeared to discourage human-robot cooperation, as human-robot dyads frequently failed to cooperate without communication. These results indicate that machine behavior should better align with human behavior, promoting efficiency while simultaneously considering human tendencies toward risk and fairness.
Keywords: Human-Computer Interaction; Psychology; Social Sciences
Year: 2020 PMID: 33458615 PMCID: PMC7797565 DOI: 10.1016/j.isci.2020.101963
Source DB: PubMed Journal: iScience ISSN: 2589-0042