
Active Information Selection: Visual Attention Through the Hands.

Chen Yu, Linda B Smith, Hongwei Shen, Alfredo F Pereira, Thomas Smith

Abstract

An important goal in studying both human intelligence and artificial intelligence is to understand how a natural or an artificial learning system deals with the uncertainty and ambiguity of the real world. For a natural intelligence system such as a human toddler, the relevant aspects in a learning environment are only those that make contact with the learner's sensory system. In real-world interactions, what the child perceives critically depends on his own actions as these actions bring information into and out of the learner's sensory field. The present analyses indicate how, in the case of a toddler playing with toys, these perception-action loops may simplify the learning environment by selecting relevant information and filtering irrelevant information. This paper reports new findings using a novel method that seeks to describe the visual learning environment from a young child's point of view and measures the visual information that a child perceives in real-time toy play with a parent. The main results are 1) what the child perceives primarily depends on his own actions but also his social partner's actions; 2) manual actions, in particular, play a critical role in creating visual experiences in which one object dominates; 3) this selecting and filtering of visual objects through the actions of the child provides more constrained and clean input that seems likely to facilitate cognitive learning processes. These findings have broad implications for how one studies and thinks about human and artificial learning systems.

Year:  2009        PMID: 21031153      PMCID: PMC2964141          DOI: 10.1109/TAMD.2009.2031513

Source DB:  PubMed          Journal:  IEEE Trans Auton Ment Dev        ISSN: 1943-0604


References: 25 in total

1.  Artificial intelligence. Autonomous mental development by robots and animals.

Authors:  J Weng; J McClelland; A Pentland; O Sporns; I Stockman; M Sur; E Thelen
Journal:  Science       Date:  2001-01-26       Impact factor: 47.728

2.  (Review) Deictic codes for the embodiment of cognition.

Authors:  D H Ballard; M M Hayhoe; P K Pook; R P Rao
Journal:  Behav Brain Sci       Date:  1997-12       Impact factor: 12.579

3.  A perception-action perspective on tool use development.

Authors:  J J Lockman
Journal:  Child Dev       Date:  2000 Jan-Feb

4.  Beyond words: the importance of gesture to researchers and learners.

Authors:  S Goldin-Meadow
Journal:  Child Dev       Date:  2000 Jan-Feb

5.  Development of object concepts in infancy: Evidence for early learning in an eye-tracking paradigm.

Authors:  Scott P Johnson; Dima Amso; Jonathan A Slemmer
Journal:  Proc Natl Acad Sci U S A       Date:  2003-08-25       Impact factor: 11.205

6.  What's in View for Toddlers? Using a Head Camera to Study Visual Experience.

Authors:  Hanako Yoshida; Linda B Smith
Journal:  Infancy       Date:  2008-05

7.  (Review) Language, gesture, and the developing brain.

Authors:  Elizabeth Bates; Frederic Dick
Journal:  Dev Psychobiol       Date:  2002-04       Impact factor: 3.038

8.  (Review) Are developmental disorders like cases of adult brain damage? Implications from connectionist modelling.

Authors:  Michael Thomas; Annette Karmiloff-Smith
Journal:  Behav Brain Sci       Date:  2002-12       Impact factor: 12.579

9.  Ape gestures and language evolution.

Authors:  Amy S Pollick; Frans B M de Waal
Journal:  Proc Natl Acad Sci U S A       Date:  2007-04-30       Impact factor: 11.205

10.  (Review) Grasping objects: the cortical mechanisms of visuomotor transformation.

Authors:  M Jeannerod; M A Arbib; G Rizzolatti; H Sakata
Journal:  Trends Neurosci       Date:  1995-07       Impact factor: 13.837

Cited by: 17 in total

1.  The Signal in the Noise: The Visual Ecology of Parents' Object Naming.

Authors:  Sumarga H Suanda; Meagan Barnhart; Linda B Smith; Chen Yu
Journal:  Infancy       Date:  2018-12-25

2.  Embodied attention and word learning by toddlers.

Authors:  Chen Yu; Linda B Smith
Journal:  Cognition       Date:  2012-08-09

3.  Hand-Eye Coordination Predicts Joint Attention.

Authors:  Chen Yu; Linda B Smith
Journal:  Child Dev       Date:  2017-02-10

4.  Self-generated variability in object images predicts vocabulary growth.

Authors:  Lauren K Slone; Linda B Smith; Chen Yu
Journal:  Dev Sci       Date:  2019-04-03

5.  Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention.

Authors:  Chen Yu; Linda B Smith
Journal:  Cogn Sci       Date:  2016-03-25

6.  Visual-motor coordination in natural reaching of young children and adults.

Authors:  John M Franchak; Chen Yu
Journal:  Cogsci       Date:  2015-07

7.  The Multisensory Nature of Verbal Discourse in Parent-Toddler Interactions.

Authors:  Sumarga H Suanda; Linda B Smith; Chen Yu
Journal:  Dev Neuropsychol       Date:  2017-01-27       Impact factor: 2.253

8.  It's all connected: Pathways in visual object recognition and early noun learning.

Authors:  Linda B Smith
Journal:  Am Psychol       Date:  2013-11

9.  The Social Origins of Sustained Attention in One-Year-Old Human Infants.

Authors:  Chen Yu; Linda B Smith
Journal:  Curr Biol       Date:  2016-04-28       Impact factor: 10.834

10.  A bottom-up view of toddler word learning.

Authors:  Alfredo F Pereira; Linda B Smith; Chen Yu
Journal:  Psychon Bull Rev       Date:  2014-02
