| Literature DB >> 27909400 |
Abstract
Synaptic plasticity is widely considered to be the neurobiological basis of learning and memory by neuroscientists and researchers in adjacent fields, though diverging opinions are increasingly being recognized. From the perspective of what we might call "classical cognitive science" it has always been understood that the mind/brain is to be considered a computational-representational system. Proponents of the information-processing approach to cognitive science have long been critical of connectionist or network approaches to (neuro-)cognitive architecture, pointing to the shortcomings of the associative psychology that underlies Hebbian learning as well as to the fact that synapses are practically unfit to implement symbols. Recent work on memory has been adding fuel to the fire and current findings in neuroscience now provide first tentative neurobiological evidence for the cognitive scientists' doubts about the synapse as the (sole) locus of memory in the brain. This paper briefly considers the history and appeal of synaptic plasticity as a memory mechanism, followed by a summary of the cognitive scientists' objections regarding these assertions. Next, a variety of tentative neuroscientific evidence that appears to substantiate questioning the idea of the synapse as the locus of memory is presented. On this basis, a novel way of thinking about the role of synaptic plasticity in learning and memory is proposed.
Keywords: cognitive science; hebbian learning; learning; long-term potentiation; memory; memory mechanisms; synaptic plasticity; synaptic turnover
Year: 2016 PMID: 27909400 PMCID: PMC5112247 DOI: 10.3389/fnsys.2016.00088
Source DB: PubMed Journal: Front Syst Neurosci ISSN: 1662-5137
Nomenclature.
| Finite-state machine | An abstract machine that can be in only one state at a time and has a finite number of states in total. Its memory is defined by the number of states available. |
| Turing machine | A finite-state machine extended with a so-called tape. The tape is a read/write memory component where symbols can be stored and recovered. |
| Turing completeness | Refers to the ability of a given set of instructions to simulate a Turing machine. |
| von Neumann implementation | Denotes a common schematic circuit concept (and its many offshoots) that actually implements a universal Turing machine. |
This table provides brief expositions of terms and concepts from the theory of computation that may not be familiar to all readers. Note that these are working definitions for the purposes of this paper; they are not meant to be exhaustive.
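To make the table's distinction concrete, the sketch below implements a Turing machine as exactly the two pieces the definitions name: a finite-state transition table (the finite-state machine) plus a read/write tape. The machine, its states, and the bit-inversion task are illustrative examples, not anything drawn from the paper itself.

```python
def run_turing_machine(tape, transitions, state="start", halt="halt"):
    """Run until the halt state is reached.

    `transitions` maps (state, symbol) to (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    """
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells read as blank
    head = 0
    while state != halt:
        symbol = cells.get(head, "_")  # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        cells[head] = write            # write to the tape, then move the head
        head += move
    out = [cells[i] for i in sorted(cells)]
    return "".join(c for c in out if c != "_")

# The finite-state part: a one-state controller that inverts each bit
# and halts on the first blank cell.
INVERT = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt",  "_", 0),
}

print(run_turing_machine("0110", INVERT))  # -> 1001
```

Stripped of the tape, `INVERT` alone is just a finite-state machine; adding the tape is what makes symbols storable and recoverable, which is the extension the second table row describes.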