Data-driven emergence of convolutional structure in neural networks.

Alessandro Ingrosso, Sebastian Goldt.

Abstract

Exploiting data invariances is crucial for efficient learning in both artificial and biological neural circuits. Understanding how neural networks can discover appropriate representations capable of harnessing the underlying symmetries of their inputs is thus crucial in machine learning and neuroscience. Convolutional neural networks, for example, were designed to exploit translation symmetry, and their capabilities triggered the first wave of deep learning successes. However, learning convolutions directly from translation-invariant data with a fully connected network has so far proven elusive. Here we show how initially fully connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localized, space-tiling receptive fields. These receptive fields match the filters of a convolutional network trained on the same task. By carefully designing data models for the visual scene, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs, which has long been recognized as the hallmark of natural images. We provide an analytical and numerical characterization of the pattern formation mechanism responsible for this phenomenon in a simple model and find an unexpected link between receptive field formation and tensor decomposition of higher-order input correlations. These results provide a perspective on the development of low-level feature detectors in various sensory modalities and pave the way for studying the impact of higher-order statistics on learning in neural networks.
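The abstract attributes receptive-field localization to the non-Gaussian, higher-order local statistics of the inputs rather than to their covariance. As a minimal illustration of that distinction (this is not the paper's actual data model; the bump ensemble and all parameters below are invented for the sketch), the following NumPy snippet builds a translation-invariant, non-Gaussian input ensemble and a Gaussian surrogate with matched covariance, then checks that the two agree at second order but differ sharply in excess kurtosis:

```python
import numpy as np

rng = np.random.default_rng(0)
D, N = 64, 20000  # 1-D "retina" size and number of samples

# Translation-invariant, non-Gaussian inputs: a localized bump with a
# random sign, placed at a uniformly random (cyclic) position.
bump = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
X = np.zeros((N, D))
for i in range(N):
    pos = rng.integers(D)
    X[i, (pos + np.arange(-5, 6)) % D] = rng.choice([-1.0, 1.0]) * bump

# Gaussian surrogate with the same (empirical) covariance.
C = X.T @ X / N
G = rng.multivariate_normal(np.zeros(D), C, size=N)

# Second-order statistics agree ...
print(np.abs(np.cov(X.T, bias=True) - np.cov(G.T, bias=True)).max())

# ... but fourth-order statistics do not: the bump ensemble has large
# positive excess kurtosis, while the Gaussian surrogate's is near zero.
kurt = lambda Z: np.mean(Z**4, axis=0) / np.mean(Z**2, axis=0) ** 2 - 3.0
print(kurt(X).mean(), kurt(G).mean())
```

In the paper's setting it is exactly this kind of higher-order difference, invisible to the covariance, that separates inputs on which localized, convolution-like receptive fields emerge from covariance-matched Gaussian controls on which they do not.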

Keywords:  convolution; invariance; neural networks; receptive fields

Year: 2022    PMID: 36161906    PMCID: PMC9546588    DOI: 10.1073/pnas.2201854119

Source DB: PubMed    Journal: Proc Natl Acad Sci U S A    ISSN: 0027-8424    Impact factor: 12.779


