
Artificial Neural Networks for Neuroscientists: A Primer.

Guangyu Robert Yang, Xiao-Jing Wang.

Abstract

Artificial neural networks (ANNs) are essential tools in machine learning that have drawn increasing attention in neuroscience. Besides offering powerful techniques for data analysis, ANNs provide a new approach for neuroscientists to build models of complex behaviors, heterogeneous neural activity, and circuit connectivity, as well as to explore optimization in neural systems, in ways that traditional models are not designed for. In this pedagogical Primer, we introduce ANNs and demonstrate how they have been fruitfully deployed to study neuroscientific questions. We first discuss basic concepts and methods of ANNs. Then, with a focus on bringing this mathematical framework closer to neurobiology, we detail how to customize the analysis, structure, and learning of ANNs to better address a wide range of challenges in brain research. To help readers garner hands-on experience, this Primer is accompanied by tutorial-style code in PyTorch and Jupyter Notebook covering the major topics.
Copyright © 2020 Elsevier Inc. All rights reserved.
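The Primer's accompanying tutorials are written in PyTorch; as a rough illustration of the kind of model it discusses, a single forward pass of a leaky (continuous-time) rate RNN can be sketched in plain NumPy. All names, dimensions, and parameter choices below are illustrative assumptions, not the Primer's actual code.

```python
import numpy as np

# Illustrative dimensions and time constant (assumptions, not from the Primer)
n_input, n_rec, n_output, n_steps = 3, 50, 2, 100
alpha = 0.2  # Euler step dt/tau for the continuous-time rate equation

rng = np.random.default_rng(0)
W_in = rng.normal(0, 1 / np.sqrt(n_input), (n_rec, n_input))
W_rec = rng.normal(0, 1 / np.sqrt(n_rec), (n_rec, n_rec))
W_out = rng.normal(0, 1 / np.sqrt(n_rec), (n_output, n_rec))
b = np.zeros(n_rec)

def forward(u):
    """Run the rate RNN over an input sequence u of shape (n_steps, n_input)."""
    r = np.zeros(n_rec)          # firing rates of the recurrent units
    rates, outputs = [], []
    for u_t in u:
        # Leaky integration with a ReLU nonlinearity (one common choice)
        r = (1 - alpha) * r + alpha * np.maximum(
            0, W_rec @ r + W_in @ u_t + b)
        rates.append(r)
        outputs.append(W_out @ r)
    return np.array(rates), np.array(outputs)

u = rng.normal(size=(n_steps, n_input))
rates, z = forward(u)   # rates: (100, 50), z: (100, 2)
```

In a PyTorch implementation, the same update rule would live inside an nn.Module and the weight matrices would be trained by backpropagation through time against a task-defined loss, which is the workflow the tutorials walk through.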


Year: 2020    PMID: 32970997    DOI: 10.1016/j.neuron.2020.09.005

Source DB: PubMed    Journal: Neuron    ISSN: 0896-6273    Impact factor: 17.173


Citing articles (17 in total)

Review 1.  Understanding ethanol's acute effects on medial prefrontal cortex neural activity using state-space approaches.

Authors:  Mitchell D Morningstar; William H Barnett; Charles R Goodlett; Alexey Kuznetsov; Christopher C Lapish
Journal:  Neuropharmacology       Date:  2021-09-01       Impact factor: 5.273

Review 2.  Interpreting neural computations by examining intrinsic and embedding dimensionality of neural activity.

Authors:  Mehrdad Jazayeri; Srdjan Ostojic
Journal:  Curr Opin Neurobiol       Date:  2021-09-17       Impact factor: 7.070

3.  The role of population structure in computations through neural dynamics.

Authors:  Alexis Dubreuil; Adrian Valente; Manuel Beiran; Francesca Mastrogiuseppe; Srdjan Ostojic
Journal:  Nat Neurosci       Date:  2022-06-06       Impact factor: 28.771

4.  The Spatiotemporal Neural Dynamics of Intersensory Attention Capture of Salient Stimuli: A Large-Scale Auditory-Visual Modeling Study.

Authors:  Qin Liu; Antonio Ulloa; Barry Horwitz
Journal:  Front Comput Neurosci       Date:  2022-05-12       Impact factor: 3.387

Review 5.  Navigating the Statistical Minefield of Model Selection and Clustering in Neuroscience.

Authors:  Bálint Király; Balázs Hangya
Journal:  eNeuro       Date:  2022-07-14

Review 6.  50 years of mnemonic persistent activity: quo vadis?

Authors:  Xiao-Jing Wang
Journal:  Trends Neurosci       Date:  2021-10-12       Impact factor: 16.978

7.  PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.

Authors:  Daniel B Ehrlich; Jasmine T Stone; David Brandfonbrener; Alexander Atanasov; John D Murray
Journal:  eNeuro       Date:  2021-01-15

8.  Deep learning-based pupil model predicts time and spectral dependent light responses.

Authors:  Babak Zandi; Tran Quoc Khanh
Journal:  Sci Rep       Date:  2021-01-12       Impact factor: 4.379

9.  Lessons From Deep Neural Networks for Studying the Coding Principles of Biological Neural Networks.

Authors:  Hyojin Bae; Sang Jeong Kim; Chang-Eop Kim
Journal:  Front Syst Neurosci       Date:  2021-01-15

10.  A convolutional neural-network framework for modelling auditory sensory cells and synapses.

Authors:  Fotios Drakopoulos; Deepak Baby; Sarah Verhulst
Journal:  Commun Biol       Date:  2021-07-01
