Literature DB >> 34433669

Low-temperature emergent neuromorphic networks with correlated oxide devices.

Uday S Goteti1, Ivan A Zaluzhnyy1, Shriram Ramanathan2, Robert C Dynes3, Alex Frano3.   

Abstract

Neuromorphic computing, which aims to mimic the collective and emergent behavior of the brain's neurons, synapses, axons, and dendrites, offers an intriguing, potentially disruptive solution to society's ever-growing computational needs. Although much progress has been made in designing circuit elements that mimic the behavior of neurons and synapses, challenges remain in designing networks of elements that feature a collective response behavior. We present simulations of networks of circuits and devices based on superconducting and Mott-insulating oxides that display a multiplicity of emergent states that depend on the spatial configuration of the network. Our proposed network designs are based on experimentally known ways of tuning the properties of these oxides using light ions. We show how neuronal and synaptic behavior can be achieved with arrays of superconducting Josephson junction loops, all within the same device. We also show how a multiplicity of synaptic states could be achieved by designing arrays of devices based on hydrogenated rare earth nickelates. Together, our results demonstrate a research platform that utilizes the collective macroscopic properties of quantum materials to mimic the emergent behavior found in biological systems.
Copyright © 2021 the Author(s). Published by PNAS.

Entities:  

Keywords:  emergent phenomena; hardware neural networks; neuromorphic computing; strongly correlated systems

Year:  2021        PMID: 34433669      PMCID: PMC8536335          DOI: 10.1073/pnas.2103934118

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


Emergent behavior, defined as a system’s collective behavior differing from that of its individual constituents, is widespread and consequential in nature. The brain, for example, is made up of proteins, tissue, and flowing chemicals and charges but functions collectively over a long spatial range as a uniquely powerful and energy-efficient machine. Remarkably, its constituent elements and global architecture feature staggering amounts of seeming randomness and disorder (Fig. 1), yet it has evolved to be capable of astounding computational functionalities: in many ways, still more impressive and energy-efficient than semiconductor-based computers. Analogously, quantum materials, such as strongly correlated systems (1), display collective macroscopic behavior such as superconductivity (2) and metal−insulator transitions (3). These macroscopic collective responses emerge from microscopic quantum mechanical interactions. As a result, brain-inspired computing paradigms—known broadly as neuromorphic (4, 5)—based on these quantum materials feature prominently in the goals of various research efforts that aim to explore and, hopefully, spawn the next technological revolution (6–10).
Fig. 1.

A comparison of the emergent behavior that arises in biological systems, including simple living organisms (Left) and artificial (Right) devices. Disorder and randomness play roles at all length scales. In the case of correlated oxides (Right), disorder in the lattice can yield different macroscopic properties that can be used to make devices by controlled light-ion modifications. Moreover, a randomly designed network of said devices can yield exponentially more complex, emergent responses.

Natural neural networks in animal brains comprise neurons that are interconnected by synapses. Neurons are capable of integrating charges and releasing them at critical thresholds referred to as action potentials. Synapses can amplify or decrease the signal strength by chemical or electrical pathways (11). Information is encoded temporally in such networks and represents a paradigm distinct from traditional digital electronics. Synapses store memory and can dynamically adjust their weight in response to the time intervals between neuronal stimuli (known as time-dependent plasticity). Neuromorphic hardware networks therefore aspire to capture the key features found in the nervous system, such as periodic trains of spiking signals and multistate memory that can be programmed incrementally as well as in a time-dependent manner. Materials that can host diverse electronic structures and/or present nonlinear electrical characteristics are often promising candidate systems to explore as building blocks for neuromorphic networks. Further, to emulate the complexity of natural networks, having multiple control knobs via ionic or electronic inputs to tune the order parameter in neuromorphic devices is desirable. Neuromorphic computing architectures based on correlated transition metal oxides could offer a flexible, low-consumption alternative to von Neumann architectures.
The key challenge is to generate flexible material response properties that can harbor multiple states, allowing an architecture in which computation and memory operate in parallel. Correlated oxides offer a platform to explore high-density and multistate memory because of the opportunity to exploit their multiple phases, including superconductivity, magnetism, and metal−insulator transitions. While low-temperature complementary metal-oxide-semiconductor (CMOS) electronics and cryocooling have long been active areas of research (12–14), cryoelectronic neuromorphic device technologies based on superconducting materials are rapidly gaining interest due to their unique advantages in power efficiency and in the flexibility to generate spiking neuron-like behavior (15–26). In this paper, we report results of simulations that demonstrate classes of low-temperature artificial neural networks that arise from designing controlled disorder in devices based on correlated oxides. We utilize two archetypal properties for this purpose, namely, superconductivity and metal–insulator transitions, with the common theme of creating controlled disorder through light-ion incorporation. First, we discuss how lattice disorder induced in superconducting YBa2Cu3O7−δ (YBCO) by helium ion implantation can be used to fabricate arrays of superconducting Josephson junction loops that yield an exponential multiplicity of coherent states. This emergent multiplicity can further evolve by randomizing the array’s spatial geometry. The disordered superconducting loops are fast and allow multiple states that have a transient nature with low energy consumption. Second, we discuss how arrays of hydrogen-ion-incorporated rare earth nickelate devices, each of which is known to individually render synaptic behavior, can also yield a multiplicity of states that rely on the spatial configuration and design of the array.
These responses are slower than the superconducting loops but more stable in the various states, and are more memory centric. While the different oxide material systems possess different microscopic behaviors, we illustrate that the flexibility of these two case study systems allows high-density architectures with configurational randomness that yield many possible states that could mimic the animal brain’s notable random-to-collective behavior. Moreover, our simulations outline a broad research effort to harness the properties of strongly correlated electron systems to produce arrays of disordered individual brain-inspired elements that collectively evolve to render emergent functionalities. These quantum materials feature a large range of systems with “binary” states that can be employed, and we show how these can be configured to construct neurons and synapses on the same device. While the two examples we present are near the ends of the spectra of speed, energy, and stability, the range of properties in oxides allows a wide range of speed, power consumption, and volatility, that is, long-term stability or transient volatility. With the wealth of oxides currently being studied throughout a large community, we expect many more systems following the architectural framework that this paper discusses. This paper is organized as follows. First, we summarize the experimentally known properties that enable our neuromorphic simulations—superconductivity and metal−insulator transitions—particularly focusing on the effect that light ions have on them. Then, we discuss how superconducting Josephson junction loops could render neuronal behavior in copper oxide devices. Next, we discuss how synaptic behavior can be achieved in two platforms. Finally, we look outward by presenting examples of connectivity between the devices we propose and other material platforms.

Quantum Material Platforms

The emergent neuromorphic models we outline below are founded on two archetypal collective electronic phases that occur in transition metal oxides, namely, high-temperature superconductivity and Mott metal−insulator transitions. Specifically, we focus on copper oxides and rare earth nickelates, respectively, although, in the case of metal−insulator materials, there are several other oxides that can be used. Importantly, there is ample experimental evidence showing that both of these phases can be tuned by using light ions such as H, Li, and/or He. Thus, before describing our neuromorphic models, we first summarize the effect of light ions. The superconductivity and normal-state transport properties of the high-Tc superconductor YBCO have been shown to be highly sensitive to ion irradiation in various studies (27–30). In particular, the material undergoes a continuous transition from metallic behavior in the normal state to insulating behavior as the irradiation dose of ions is increased (28). These effects are attributed to disorder induced by the ion bombardment rather than to doping, as evident in previous reports (27, 28, 31). Accordingly, planar superconducting tunnel junctions have been constructed using modern focused He-ion beams with a beam diameter, and therefore a tunnel barrier width, of 500 pm (32). With the appropriate choice of ion dosages, superconductor−insulator−superconductor and superconductor−normal−superconductor junctions can be constructed. The ability to produce Josephson tunnel junctions of arbitrary shape and size with tunable junction properties such as critical current density opens a possibility to explore novel neuromorphic devices. Superconducting Josephson tunnel junctions, formed using damage induced by a focused ion beam on YBCO thin films (32), can be viewed as relaxation oscillators.
When the current through the tunnel barrier exceeds the critical current of the junction, the phase difference of the superconducting order parameter oscillates with a frequency proportional to the voltage developed across the junction. These oscillations in YBCO tunnel junctions made using focused ion beams (32) can be sufficiently damped to produce nonhysteretic current−voltage characteristics, and therefore can produce spiking voltages, with each spike corresponding to a single-flux-quantum vortex with a total change in the phase difference of 2π (33). The relaxation oscillations and spiking characteristics of Josephson junctions have been of interest in different artificial neural networks (16, 21, 34, 35). Furthermore, alternative devices to generate spiking neuron functionality using superconducting nanowire-based devices have also been proposed (18, 24). However, the disordered YBCO junctions can be particularly suitable for superconducting neurons and disordered array synaptic network elements as described below. Specifically, a single array can be configured to achieve both neuron and synapse operation. Additionally, several such neurons can be connected through the proposed disordered array synaptic networks to form recurrent neural networks with all the constituent elements evolving together. Furthermore, such networks can be interconnected through similar disordered arrays to form a hierarchical recurrent network similar to a biological brain (36). In the case of the rare earth nickelates (RNiO3), one can influence the metal−insulator transition (MIT) by electron doping of the material, independent of temperature (37–39). The addition of one extra electron to the nickel ion induces a large Mott–Hubbard splitting due to the strong Coulomb interaction between electrons on the same orbital of Ni.
As a result, a large 3-eV electronic band gap opens between the highest occupied orbitals of O and the lowest unoccupied orbitals of Ni, which leads to a dramatic increase of resistance (9). Experimentally, this effect was demonstrated by doping nickelate films with hydrogen (8, 9, 37, 40, 41) and lithium ions (9, 42). In these works, doping with hydrogen was performed by depositing Pd or Pt on top of the nickelate film and a subsequent one-time annealing in a hydrogen atmosphere. During this process, the hydrogen molecules catalytically dissociate into individual atoms, which diffuse into the nickelate film. Homogeneously doping the material with hydrogen can yield a nonvolatile increase in electrical resistance (9). Recently, X-ray absorption and diffraction experiments have shown that the main mechanism of the colossal resistance change during hydrogenation lies in bringing an extra electron to the nickel ions, while the changes in the crystal structure due to doping are very small (43). Since the electronic phase transition and concurrent resistance tuning are independent of temperature, it is possible to couple nickelate synapses with superconducting junctions to create neural networks.
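The single-flux-quantum spiking of an overdamped junction described earlier in this section can be illustrated with a minimal numerical sketch of the resistively shunted junction (RSJ) model in dimensionless units (current in units of the critical current Ic, time in units of Φ0/2πIcR). The pulse shape and values below are illustrative assumptions, not parameters from this paper:

```python
import math

def simulate_rsj(i_bias, t_end=200.0, dt=0.01):
    """Integrate the overdamped RSJ equation dphi/dtau = i(tau) - sin(phi).

    i_bias: function mapping dimensionless time tau to the bias current
    in units of the junction critical current Ic. Returns the phase and
    the normalized junction voltage v = dphi/dtau at each time step.
    """
    phi = 0.0
    phases, volts = [], []
    for n in range(int(t_end / dt)):
        v = i_bias(n * dt) - math.sin(phi)  # normalized junction voltage
        phi += v * dt                       # forward-Euler step
        phases.append(phi)
        volts.append(v)
    return phases, volts

# A short current pulse exceeding the critical current (i > 1) drives a
# single 2*pi phase slip: one single-flux-quantum voltage spike.
pulse = lambda tau: 3.0 if 10.0 <= tau < 12.0 else 0.0
phases, volts = simulate_rsj(pulse)
flux_quanta = round(phases[-1] / (2 * math.pi))
```

Under a constant bias well above Ic, the same equation produces the periodic phase slips, at the Josephson frequency f = V/Φ0, that underlie the spike trains used in the neuron simulations below.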

Simulations of Loop Neuron Structures with Ion-Damaged Tunnel Junctions

The spiking behavior in neurons and synapses is realized through generation and propagation of single-flux quanta in overdamped Josephson junctions and superconducting loops incorporating these junctions, respectively. The flux shuttle (44) structure describes the vortex dynamics and the corresponding Josephson phase oscillations in junctions and superconducting loops. Such vortex motion in superconducting loops is being employed in digital computing systems such as single-flux quantum logic circuits (33). Therefore, several of the well-established design and hardware ideas used in those systems may also be useful in developing neurons and synapses presented here. An integrate-and-fire neuron behavior can be realized by appropriately designing these loops and junctions. An example is shown in Fig. 2 that produces a spiking voltage output when the current in the integrating superconducting loop exceeds the critical current of the output tunnel junction. The operation of the structure in Fig. 2 is discussed in detail in ref. 36, where the neuron is shown to exhibit firing dynamics of a leaky integrate-and-fire neuron.
Fig. 2.

Schematic of a spiking integrate-and-fire neuron with a large superconducting integrating loop, with Josephson junctions generated using focused He-ion tunnel barriers. Ion-damaged barriers are shown in yellow, and the YBCO film in purple.

The parameters of the neuron, such as the maximum current in the integration loop before the neuron fires, the time constants of the decay of the loop current, and the resting potential, can be controlled using the size of the loop and the parameters of the junctions in it, as demonstrated by the simulation results in Fig. 3. An example of the current−voltage characteristics of one of the Josephson junctions used in the neuron is shown in Fig. 3. The junctions are overdamped, thereby producing single-flux quantum voltage spikes when excited with a current pulse with magnitude above their critical currents, as shown in Fig. 3. Therefore, as the current in the integration loop increases beyond the junction critical current, a spiking output is produced. Different neurons in a neural network can be designed accordingly by simply adjusting the loop size and the ion damage of the tunnel barrier that defines the junction critical current. A spike train of constant frequency is applied at the input of the neuron as shown in Fig. 3, and it enters the integration loop as each spike switches the input junction. The current in the loop reaches a threshold defined by the critical currents of the output junction stack as the input spikes enter the loop. The threshold can be dynamically varied within a range defined by the DC current input shown in Fig. 2. The spiking output of the neuron is shown in Fig. 3 as the threshold is gradually varied. As the DC current input is increased, the output spike frequency increases, exhibiting the current-versus-frequency behavior of an ideal integrate-and-fire neuron. Results for three different geometries and junction parameters are shown in Fig. 3.
Additionally, the maximum threshold of the neuron is also defined by the physical design of the structure as shown in Fig. 3.
Fig. 3.

Simulation results of the spiking integrate-and-fire neuron shown in Fig. 2. Critical currents of all the junctions in the loop are chosen to be 100 μA, and the total inductance of the loop is 100 pH. (A) Spike train of constant frequency of 250 MHz applied to the spiking input terminal of Fig. 2. (B) Output spikes across the spiking output terminal of Fig. 2 fired by the neuron as the current in the integrating loop reaches a threshold. The output spiking frequency is 25 MHz, corresponding to a threshold of 10 input spikes. (C) Input current versus output frequency of the superconducting neurons of different sizes and numbers of junctions in the stack. The behavior represents that of a leaky integrate-and-fire neuron. (D) Zoomed-in view of input current versus output frequency to show that different threshold behaviors can be achieved by appropriately designing the size of the loop and the junction stack. For C and D: a, 10 junctions in the loop with a loop inductance of 100 pH; b, 10 junctions in the loop with a loop inductance of 200 pH; and c, 8 junctions with a loop inductance of 100 pH. (E) Typical current–voltage (I-V) characteristics of the Josephson junction used in the integrate-and-fire neuron. The junction is excited with a current pulse above its critical current to generate a single-flux quantum spike. (F) Example of a single spike generated across a Josephson junction by exciting the junction with a short current pulse of 300 μA.

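The threshold behavior quoted in Fig. 3 A and B (a 250-MHz input train with a threshold of 10 flux quanta yielding 25-MHz output) can be reproduced by a schematic integrate-and-fire counter. This toy model illustrates the loop's operation but is not the circuit-level simulation used for the figures; the optional leak term is an assumption standing in for the decay of the loop current:

```python
def lif_loop(input_times_ns, threshold=10, leak_per_ns=0.0):
    """Toy integrate-and-fire model of the superconducting loop neuron.

    Each input spike adds one flux quantum to the integrating loop; when
    the stored count reaches `threshold` (set by the critical currents of
    the output junction stack), the neuron fires and the loop resets.
    `leak_per_ns` models the decay of the loop current between spikes.
    Returns the firing times in nanoseconds.
    """
    stored, last_t, fire_times = 0.0, 0.0, []
    for t in input_times_ns:
        stored = max(0.0, stored - leak_per_ns * (t - last_t))  # leak
        stored += 1.0                                           # integrate
        last_t = t
        if stored >= threshold:
            fire_times.append(t)                                # fire
            stored = 0.0                                        # reset
    return fire_times

# 250 MHz input spike train (4-ns period) with threshold 10 gives one
# output spike per 10 inputs, i.e., 25 MHz, as in Fig. 3 A and B.
inputs = [4.0 * k for k in range(1, 101)]   # 100 spikes over 400 ns
outputs = lif_loop(inputs, threshold=10)
```

With leak_per_ns > 0, widely spaced inputs fail to accumulate to threshold, giving the leaky integrate-and-fire behavior described in the text.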

Simulations of Disordered Array Synaptic Networks

The relaxation oscillator structures that define the neuron behavior of the previous section can be connected to other neurons via the following synaptic systems.

Cuprate-Based Synapse Arrays.

The cuprate-based neurons of the previous section can also be useful for producing synaptic behavior in disordered arrays as shown in Fig. 4, highlighting the singular scenario where one device can act as a neuron or synapse. While the neuron structure has a varying threshold and firing behavior that depends on the input magnitude (i.e., the number of input spikes or the total DC current input), a disordered array structure that is decoupled from the input spike timing or rate can exhibit output behavior dependent on the input or feedback spike timing (spike timing−dependent plasticity), the spiking rate (rate-dependent plasticity), and the total magnitude (i.e., the number of vortices or spikes). Furthermore, the large nonvolatile memory of the array comes from the various configurations of supercurrents within the different loops. This memory state updates with every change in input or output as vortices enter and leave the array, thereby allowing learning behavior. If each of the loops in the array is restricted to hold at most one vortex, then the total number of possible memory configurations is at most 2^N, where N is the number of loops. Note that this number can be significantly higher if the loops can accommodate more than one vortex. However, any spatial symmetry in the pattern of the array reduces this number. Thus, a disordered array has the largest possible number of configurations. The array has input and output terminals (shown in blue in Fig. 4) for spiking signals and feedback terminals (shown in black in Fig. 4). Therefore, it behaves as a collective synaptic network between all the input and output neurons connected to the array. This approach presents an alternative architecture to artificial neural networks that is described in detail in a previous paper (36).
In summary, this disordered array approach is defined by the small-world or random fully recurrent neural network as opposed to the largely feed-forward and regular neural network architectural approach popular in artificial neural networks. Consequently, the disordered array approach can be a highly scalable and power-efficient approach (45).
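The 2^N scaling of memory configurations, and its reduction by spatial symmetry, can be checked with a brute-force count. Here each loop holds 0 or 1 vortex, and configurations related by a symmetry of the array are identified; the three-loop example and its mirror symmetry are illustrative, not a model of any specific array in the paper:

```python
from itertools import product

def count_states(n_loops, symmetries=()):
    """Count distinct flux configurations of an n-loop array, each loop
    holding 0 or 1 vortex. `symmetries` lists index permutations under
    which the array looks identical; configurations related by such a
    permutation are counted once.
    """
    seen = set()
    for config in product((0, 1), repeat=n_loops):
        # canonical representative: lexicographic min over the symmetry images
        images = [config] + [tuple(config[i] for i in p) for p in symmetries]
        seen.add(min(images))
    return len(seen)

# A fully disordered 3-loop array (no symmetry) has 2**3 = 8 states,
# while a mirror symmetry swapping loops 0 and 2 merges some of them.
disordered = count_states(3)                          # 8 states
mirrored = count_states(3, symmetries=[(2, 1, 0)])    # 6 states
```

This is the sense in which a disordered (symmetry-free) array maximizes the number of distinct memory configurations for a given number of loops.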
Fig. 4.

Collective synapse with four inputs and four outputs employing a disordered array of loops of different sizes connected through Josephson junctions. Flux quanta can be stored in the loops in the form of circulating supercurrents, with junctions allowing transport of flux between loops. Input and output terminals carry the spiking signals, and feedback terminals are coupled to the outputs.

The large disordered array shown in Fig. 4 comprises several relaxation oscillators defined by superconducting loops, which are fabricated with focused ion beam tunnel junctions with different natural frequencies. Therefore, such large arrays of coupled oscillators describe a complex system that may be more effectively studied by examining its emergent behavior at a larger scale. To describe this in a fairly simple structure, we consider a three-loop configuration and demonstrate some of its properties that support synaptic behavior. The three-loop structure shown in Fig. 5 is disordered, with dissimilar tunnel junctions and an asymmetric geometry. Therefore, the vortex dynamics for vortices entering the array through the terminal labeled “Spiking input” and leaving through the other two loops is nonlinear and highly dependent on input spiking and current biasing (feedback current) conditions. However, the mechanisms defining such dynamics can be understood by considering the current paths between any two terminals.
Fig. 5.

Disordered array with three loops representing a simplified synaptic network used in simulations and analysis. The junctions are all chosen to have different parameters, and the geometry is chosen to distribute inductance asymmetrically in the network. Asymmetry ensures that the number of available memory states is maximized for a given number of loops.

The current entering the first loop divides between the two paths until it exceeds the critical current of the smaller junction. After the junction switches, the current diverts to a different path. This process is dynamic, with current paths changing with every switching event in the array. The synaptic weight corresponds to the total current between any two terminals that can be defined as input or output. Analogous to the neuron behavior, the total current and rate of change of current between any two nodes in the neural network can produce corresponding voltage spikes across the junctions at those nodes. Therefore, the synaptic weight is defined as

w = N_out/N_in,

where N_out and N_in are the total numbers of output and input spikes over a given interval. The synaptic network of Fig. 5 was resolved into lumped circuit elements, and the resulting simulations of the electrical circuit are shown in Fig. 6. An input spike train of constant frequency is sent to one of the loops, and the output spike trains across the outer junctions in the other two loops are shown in Fig. 6. The output spiking behavior changes with the feedback currents and their rates of change. The synaptic weight can be calculated by measuring the total number of output spikes produced for a given number of input spikes, as described by the equation above. The synaptic weight, and therefore the plasticity of a disordered array memory, depends on several parameters, such as the input spike frequency and timing, along with the output spike frequency and timing through the feedback loop. This behavior is described by the simulation results shown in Fig. 6, where the synaptic weight is plotted as a function of one of the control parameters, that is, the rate of change of current feedback 1. Different curves, labeled 1 to 7, show the resulting weights as the other parameters are varied. Curve 1 shows the case where only current feedback 1 is increased linearly, resulting in the weight saturating close to one. A different input frequency yields a different weight, as in curve 2. When the second feedback current is also linearly varied, a different weight can be achieved, as in curve 3. During the operation of an actual neural network, all the input and feedback current parameters can be expected to vary nonlinearly, therefore yielding different synaptic weights, as shown in curves 4 to 7 (details of the parameters are provided in the Fig. 6 legend). Furthermore, all the curves also exhibit stable states in the form of saturated weights that are not affected by further changes in input/feedback parameters. The details of the actual circuit parameters and the currents applied are not critical to the operation of the synaptic memory: the resulting synaptic weights are similar when the relative strengths and frequencies of the input and output signals are identical, irrespective of the actual magnitudes and frequencies of the individual signals. To demonstrate this behavior, the feedback current is varied as a function of input current for 10 different input currents, and the synaptic weights are shown in Fig. 6. Although the actual values of input and feedback currents in the simulations are varied significantly, ranging from a few hundred microamperes to several milliamperes, the resulting synaptic weights are identical when the relative strengths of the signal are the same.
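Measured over a time window, the synaptic weight is simply the ratio of output to input spike counts. A minimal helper makes this concrete; the spike trains below are illustrative placeholders, not the simulated waveforms of Fig. 6:

```python
def synaptic_weight(input_spikes, output_spikes, t_start, t_end):
    """Synaptic weight of the disordered array over a time window,
    measured as the ratio of output spikes to input spikes, so a
    weight of 1 means every input flux quantum reaches the output.
    Spike times are in nanoseconds.
    """
    n_in = sum(t_start <= t < t_end for t in input_spikes)
    n_out = sum(t_start <= t < t_end for t in output_spikes)
    return n_out / n_in if n_in else 0.0

# A 250-MHz input train; the array passes only some vortices to this
# output terminal, here one in four.
inp = [4.0 * k for k in range(100)]          # input spike every 4 ns
out = [16.0 * k for k in range(25)]          # output spike every 16 ns
w = synaptic_weight(inp, out, 0.0, 400.0)    # 0.25
```

Because the weight is a ratio, it depends only on the relative spike rates, which mirrors the observation above that the weights are set by relative signal strengths rather than absolute magnitudes.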
Fig. 6.

Simulation results of the three-loop synapse with one input and two outputs as shown in Fig. 5. Critical currents of the junctions are as follows: J1 (140 μA), J2 (110 μA), J3 (120 μA), J4 (100 μA), and J5 (160 μA). Loop inductances are as follows: loop 1 (15 pH), loop 2 (45 pH), and loop 3 (37 pH). (A) An input spike train of constant frequency, with each spike representing a single flux quantum, applied at the input. A linearly increasing bias current is applied at current feedback 1, while the current at current feedback 2 is zero. (B) Resulting output spike train at output 1. The output spike frequency changes nonlinearly as the feedback current changes. (C) Resulting output spike train at output 2. The output spike train is different from that of output 1. (D) Synaptic weight, calculated as the ratio of output to input spikes, versus the rate of change of feedback current at current feedback 1; 1, only the rate of current feedback 1 is varied for a period of 10 ns; 2, only the rate of current feedback 1 is varied with a different input frequency for a period of 10 ns; 3, both the current feedback currents are varied linearly for a period of 10 ns; 4, the input frequency is varied along with both the current feedback rates for a period of 10 ns; 5, the input and feedback signals are linearly varied for a period of 10 ns; 6, all the signals are varied nonlinearly; and 7, all the signals are varied nonlinearly, but with different input signals from that of 6. (E) Synaptic weight as a function of the percentage of feedback current to input current for 10 different feedback and input current values. The input current is varied within the range −10 mA to 10 mA, with 10 different curves representing different current steps of 2 mA.


Nickelate-Based Synapse Arrays.

The distribution of doped hydrogen in the nickelate film is governed by diffusion, so the dopant is mostly concentrated in the vicinity of the Pt or Pd electrode through which it was annealed into the film (40). However, it has been shown that the distribution of hydrogen can be changed by applying a voltage bias between different contacts, which can be the spiking voltage pulses created by the YBCO neurons discussed previously. The mechanism behind the redistribution of H dopants in the films includes two main processes: drift of the charged ions in the electric field and thermodiffusion of ions in a temperature gradient created by Joule heating (43, 46). As long as the currents flowing through the device are small, however, the latter effect can be neglected. As a first approximation, one can therefore assume that the dominant mechanism by which ions initially introduced into the nickelate film are later moved across it is the electric field created when a voltage is applied to the contacts. The major effect of ion redistribution is a nonvolatile change of the device resistance (8, 9, 47). The resistance of such a device accumulates information on the polarity, magnitude, and duration of the voltage pulses applied to it in the past, which makes it a promising candidate for an artificial synapse (43, 48, 49). The synaptic weight of such an artificial synapse is inversely proportional to its resistance. A schematic of such a lateral memory device is shown in Fig. 7. The voltage applied to one of the electrodes causes a drift of the implanted hydrogen ions between the Pt/Pd and Au electrodes in a two-terminal device. Depending on the hydrogen distribution between the electrodes, the resistance of the device can be changed significantly (43). A series of short electrical pulses applied to such a device can change its resistance, as shown in Fig. 7.
By varying the duration and magnitude of the pulses, one can change the rate at which the resistance is updated and, in this way, approach the learning curves typical of living neural cells (43).
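A minimal sketch of such a pulse-history-accumulating resistance is given below. The exponential-saturation update rule, the bounds, and all numbers are assumptions made for illustration; the paper's fitted model (43) is not reproduced here.

```python
# Minimal sketch of a nonvolatile synaptic resistance that accumulates
# the history of applied voltage pulses (polarity, magnitude, duration).
# The saturating update rule and all parameters are assumptions for
# illustration, not the paper's fitted model.

R_MIN, R_MAX = 1.0, 10.0   # bounds of the resistance (arbitrary units)

def apply_pulse(R, voltage, duration, rate=0.05):
    """One pulse nudges R toward R_MAX (positive V) or R_MIN (negative V);
    the step shrinks as R approaches the bound, giving a saturating
    'learning curve'."""
    target = R_MAX if voltage > 0 else R_MIN
    step = rate * abs(voltage) * duration
    return R + (target - R) * min(step, 1.0)

R = 5.0
history = [R]
for _ in range(30):                 # a train of identical 1 mV, 1 ns pulses
    R = apply_pulse(R, voltage=1.0, duration=1.0)
    history.append(R)

# Resistance rises and saturates toward R_MAX; the synaptic weight,
# taken as 1/R, correspondingly decreases.
print(history[0], round(history[-1], 3))
```

Longer or stronger pulses simply enlarge `step`, which reproduces the qualitative dependence of the update rate on pulse duration and magnitude described above.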
Fig. 7.

(A and B) Distribution of H+ in a nickelate-based memory device in the low-resistance (A) and high-resistance (B) states. The color key corresponds to the local concentration of H+ as well as the electrical resistance of the medium. Electric field lines between the two contacts are indicated with solid black lines. (C) Change of the normalized resistance of the device caused by a series of electrical pulses with a magnitude of 1 mV and a duration of 1 ns.

The resistance of such a device with any given distribution of hydrogen can be approximately estimated from the two-dimensional (2D) model we describe below. The distribution of the current density j can be found from Ohm's law j = σE, where E is the electric field and σ is the material's conductivity. The electrical potential φ (with E = −∇φ) can be found by solving the current continuity equation ∇·(σ∇φ) = 0. The conductivity is a function of the local concentration of hydrogen dopants and can be approximated by a decaying function of that concentration with three free parameters (43, 50). In this work, we used a 2D Gaussian distribution of hydrogen and fitted the parameters in such a way that, at the point of maximum dopant concentration, the conductivity is many times smaller than in the pristine nickelate film. The equation for the potential was solved iteratively on a 2D grid, with zero-current boundary conditions at the edges of the computational grid and the potential fixed at the Au contact, while a voltage of 1 mV was applied to the Pd/Pt contact. The current was evaluated as the total current flowing into the Au contact. In the geometry of the two-contact device shown in Fig. 7, the resistance in Fig. 7B is ∼10 times higher than in Fig. 7A. This difference is explained by the redistribution of the current flow, as shown in Fig. 7, and by the fact that even the pristine nickelate has low conductivity, so there is no short between the contacts. In Fig. 7C, the gradual change of the resistance is plotted as a function of the voltage pulses applied to the electrodes.
In this simulation, we set the magnitude of each pulse to 1 mV and its duration to 1 ns, to match the typical values for the YBCO-based neuron described above. At this point, one can see an analogy between a magnetic vortex entering a superconducting YBCO loop and changing its total magnetic flux, and a cloud of highly mobile hydrogen ions drifting between two contacts and changing the total resistance. A further step in the creation of a disordered array of such nickelate-based synapses would be to overlay contact pairs in such a way that a redistribution of the ions influences the total resistance between all pairs of contacts. An example of such a situation with two pairs of contacts is shown in Fig. 8. One can place the center of the hydrogen distribution at any point between the four contacts. Our computations, conducted under the same assumptions described previously, directly show the current flow between the contacts for various distributions of H+. The largest changes in the current flow between two contacts occur when the highly resistive H-rich area is located directly between the contacts. In Fig. 8, examples of the current flow are shown for three different distributions of H+, indicating how significantly one can change the flow by introducing the ions between a pair of contacts or directly under an electrode. In these simulations, we assumed that the voltage is applied to only one (lower) electrode, while the three other electrodes are grounded.
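The solution procedure described above can be sketched numerically. The code below solves ∇·(σ∇φ) = 0 by Jacobi iteration on a small grid, with a Gaussian H+ cloud suppressing the local conductivity; grid size, the form of the suppression, and every parameter are illustrative assumptions, not the paper's fitted values.

```python
import math

# Finite-difference sketch of div(sigma grad phi) = 0 for a lateral
# device: Dirichlet contacts on the left (1 mV, "Pd/Pt") and right
# (grounded, "Au") edges, zero-current top and bottom edges, and a
# Gaussian H+ cloud that locally suppresses the conductivity.
# All parameters are illustrative, not the paper's fitted model.

N = 15                      # grid points per side
V_APPLIED = 1e-3            # 1 mV on the left contact
SIGMA0 = 1.0                # pristine-film conductivity (arb. units)

def sigma_field(cx, cy, depth=0.99, width=2.0):
    """Conductivity map: a Gaussian H+ cloud centred at column cx,
    row cy reduces sigma by 'depth' at its peak (assumed form)."""
    s = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            g = math.exp(-((i - cy) ** 2 + (j - cx) ** 2) / (2 * width ** 2))
            s[i][j] = SIGMA0 * (1.0 - depth * g)
    return s

def solve_resistance(sig, iters=1200):
    """Jacobi iteration; face conductivities are arithmetic means of the
    neighbouring cells, and edge rows use mirror (zero-flux) neighbours."""
    phi = [[V_APPLIED * (1 - j / (N - 1)) for j in range(N)] for _ in range(N)]
    for _ in range(iters):
        new = [row[:] for row in phi]
        for i in range(N):
            for j in range(1, N - 1):        # columns 0 and N-1 stay fixed
                acc = wsum = 0.0
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii = min(max(i + di, 0), N - 1)   # mirror at edges
                    jj = j + dj
                    w = 0.5 * (sig[i][j] + sig[ii][jj])
                    acc += w * phi[ii][jj]
                    wsum += w
                new[i][j] = acc / wsum
        phi = new
    # Total current collected at the grounded contact (rightmost column).
    current = sum(0.5 * (sig[i][N - 2] + sig[i][N - 1])
                  * (phi[i][N - 2] - phi[i][N - 1]) for i in range(N))
    return V_APPLIED / current

# A cloud sitting directly between the contacts blocks the current flow
# and raises the resistance relative to a cloud near the device edge.
r_blocking = solve_resistance(sigma_field(cx=N // 2, cy=N // 2))
r_offpath = solve_resistance(sigma_field(cx=N // 2, cy=0))
print(r_blocking > r_offpath)  # True
```

This reproduces the qualitative behavior described above: the resistance change is largest when the H-rich region sits directly in the current path between the contacts.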
Fig. 8.

(A–C) Distribution of H+ in a scheme with two overlaid pairs of contacts. The current distributions are shown with black lines for three different positions of the H+ cloud between the contacts. In all cases, one electrode is held at a positive potential, and all other electrodes are grounded. (D–F) Resistances measured between the corresponding pairs of electrodes shown in A–C. The common logarithm of the ratio R/R0 is shown, where R0 is the resistance between the corresponding pair of contacts in the absence of doping.

The resistances between the corresponding pairs of electrodes are shown in Fig. 8 D–F as a function of the position of the center of mass of the H+ cloud. Each point on these plots corresponds to a certain position of the cloud in the scheme with two overlaid pairs of electrodes shown in Fig. 8 A–C. One can see that the resistance increases significantly when the ions are concentrated not just between two electrodes but especially close to one of them. We should note that, due to the high fourfold symmetry of the contacts, the maps of the resistance are also symmetric (one map has a horizontal axis of symmetry, while two others can be obtained from each other by reflection about a vertical axis). Similar maps can be obtained for all other pairs of contacts; in the case of symmetric contacts, they will look exactly like those in Fig. 8 but rotated correspondingly. Overlaying several pairs of contacts allows the resistance of all of them to be changed simultaneously by shifting the center of the H+ distribution. For example, for the configuration of electrodes shown in Fig. 8, decreasing the synaptic weight of the vertical pair of contacts by moving the H+ cloud will also cause a similar decrease of the synaptic weight for the horizontal pair. At the same time, the corresponding synaptic weight will be increased, since the resistance between contacts 1 and 2 will be decreased.
Obviously, the coupling between pairs of contacts can be made significantly more complex by creating nonsymmetric (disordered) arrangements of contacts. This hardware-level coupling between different devices can improve the performance of a neuromorphic circuit in two ways. First, it creates additional connections between neurons, which can be seen as lateral connections within a single layer of an artificial network or as additional feedback connections between neurons in different layers (45, 51). This situation, in which several neurons are connected to each other via a single nanodevice with several coupled pairs of contacts, is similar to the disordered array of superconducting loops described earlier. Second, by allowing ions to move not only between a pair of contacts but also in orthogonal directions (so that the center of the distribution can be placed anywhere within a 2D device), one can create new memory states. Each memory state corresponds to a certain distribution of the H+ ions and leads to a certain set of synaptic weights. Some of the configurations may be degenerate due to symmetry in the electrode positions (as in Fig. 8). However, this degeneracy can be removed by creating a disordered arrangement of contacts, similar to the contacts created for the superconducting loops discussed in the previous section. In general, there are many other strongly correlated materials that display resistive properties highly sensitive to the valency of the d-metal ion (10). For instance, vanadium oxides have recently emerged as key materials for designing neuron-like resistors (52). Manganites (53) and cobaltates (54) also display properties that could potentially be tuned via light-ion doping, which could then be turned into random networks. Furthermore, recent work on 5d iridates with strong spin−orbit coupling has demonstrated resistive switching (55), which could be used in spin-torque oscillators that also provide neuromorphic functionalities (56).
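The idea that one shared ion distribution simultaneously sets several pairwise synaptic weights can be captured in a toy model. In the sketch below, each pair's resistance is a baseline plus a Gaussian penalty depending on how close the cloud sits to the pair's midpoint; the electrode positions and the resistance proxy are hypothetical, chosen only to illustrate the coupling.

```python
import math

# Toy illustration of hardware-level coupling: a single H+ cloud shared
# by several contact pairs sets all of their synaptic weights at once.
# The resistance proxy (baseline plus a Gaussian penalty measured from
# the cloud to the midpoint of each pair) is an assumption made purely
# for illustration.

CONTACTS = {                       # hypothetical electrode positions
    "vertical":   ((0.5, 0.0), (0.5, 1.0)),
    "horizontal": ((0.0, 0.5), (1.0, 0.5)),
}

def pair_resistance(pair, cloud, r0=1.0, amp=9.0, width=0.2):
    """Baseline resistance r0, increased when the cloud sits near the
    midpoint of the line joining the two contacts."""
    (x1, y1), (x2, y2) = pair
    mx, my = (x1 + x2) / 2, (y1 + y2) / 2
    d2 = (cloud[0] - mx) ** 2 + (cloud[1] - my) ** 2
    return r0 + amp * math.exp(-d2 / (2 * width ** 2))

def weights(cloud):
    """Synaptic weight of each pair, taken as 1/R."""
    return {name: 1.0 / pair_resistance(pair, cloud)
            for name, pair in CONTACTS.items()}

# A cloud at the shared centre suppresses both weights simultaneously;
# moving it toward a corner restores them: one state variable, several
# coupled synapses.
print(weights((0.5, 0.5)))   # both weights low
print(weights((0.0, 0.0)))   # both weights near 1/r0
```

A disordered (nonsymmetric) arrangement of `CONTACTS` would break the degeneracy between cloud positions, as argued above.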

Outlook and Conclusions

A key aspect of any hardware neural network design is the connectivity between elements. The cuprate-based loop-junction devices we describe are submicron in scale and can easily be fabricated on thin films, allowing for vast arrays of them. Importantly, the unique aspect of utilizing similar fluxon dynamics, with simple geometrical variations of the superconducting loop structures, to achieve both neuron and synapse operation offers a new kind of flexibility. For instance, the storage of flux quanta in superconducting loops with Josephson junctions provides the plasticity required of synapses, while the oscillatory response of these loops to spiking inputs resembles the behavior needed in neurons. Furthermore, the two material platforms we present could be connected by growing an island of thin nickelate film over the cuprate film with all the loops already patterned. Since hydrogen is not known to affect the properties of cuprate films, the whole device can then be implanted with hydrogen, which affects only the nickelate overlayer. The two films can then be wire bonded through many possible entry paths across the networks, allowing for highly flexible and complex network design patterns. Combining the two systems opens up a particularly interesting possibility to realize a key aspect of biological brains: the ability to operate memories that span different time scales. While the cuprate synapses exhibit dynamic, volatile memory that "learns" and "forgets" with ease, the nickelate synapses exhibit long-term nonvolatile memory. Therefore, the voltage spikes produced in the cuprate system can interact with the nickelate synapse arrays to update their long-term memory configurations. In summary, our proposed arrays of devices can be implemented into networks with flexible connectivity platforms.
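The two-time-scale memory described above can be sketched abstractly: a volatile trace that decays quickly between spikes (the cuprate side) driving slow, nonvolatile accumulation (the nickelate side). The time constants, update sizes, and dynamics below are invented for illustration and do not model either material quantitatively.

```python
import math

# Conceptual sketch of a two-time-scale memory: a fast-decaying
# volatile trace (cuprate-like) is kicked by spikes and drives slow
# consolidation into a nonvolatile weight (nickelate-like). All
# constants are invented for illustration.

TAU_SHORT = 2.0        # ns, decay time of the volatile trace
ETA_LONG = 0.01        # per-step update rate of the nonvolatile weight

def evolve(spike_times, t_end, dt=0.1):
    """Step both memories forward; spikes boost the short-term trace,
    and the trace in turn drives slow long-term accumulation."""
    short, long_term = 0.0, 0.0
    spikes = set(round(t, 6) for t in spike_times)
    t = 0.0
    while t < t_end:
        if round(t, 6) in spikes:
            short += 1.0                      # spike kicks the volatile trace
        short *= math.exp(-dt / TAU_SHORT)    # fast "forgetting"
        long_term += ETA_LONG * short * dt    # slow nonvolatile accumulation
        t += dt
    return short, long_term

# A burst early on: the volatile trace has vanished by t = 50 ns, but
# the nonvolatile weight retains a record of it.
short, long_term = evolve(spike_times=[1.0, 2.0, 3.0], t_end=50.0)
print(short < 1e-3, long_term > 0.0)  # True True
```

In this picture, the cuprate spikes play the role of the pulses that update the nickelate resistance, so "learning" survives long after the fluxon activity has decayed.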
In conclusion, we present simulations of artificial neural network components combining superconductivity and metal−insulator transitions in complex oxides, two of the most spectacular examples of emergent physics in condensed matter. By using light ions to create electronic and structural disorder, one can design individual devices that mimic neurons and synapses in the brain. These devices can easily be combined into coupled networks using well-established lithography methods, whereupon the number of accessible response states increases, making them potential candidates for neuromorphic cryoelectronics. Moreover, a key finding is that a randomly spaced network of devices exhibits an exponentially large number of collective states and thus begins to mimic the emergent behavior found in living organisms. We emphasize that the use of light ions to modify the electronic response properties of oxides can be applied to several families of strongly correlated materials, thus paving the way for a multitude of studies in the search for a new computational paradigm.
References (21 in total)

1.  A correlated nickelate synaptic transistor.

Authors:  Jian Shi; Sieu D Ha; You Zhou; Frank Schoofs; Shriram Ramanathan
Journal:  Nat Commun       Date:  2013       Impact factor: 14.919

2.  Strongly correlated perovskite fuel cells.

Authors:  You Zhou; Xiaofei Guan; Hua Zhou; Koushik Ramadoss; Suhare Adam; Huajun Liu; Sungsik Lee; Jian Shi; Masaru Tsuchiya; Dillon D Fong; Shriram Ramanathan
Journal:  Nature       Date:  2016-05-16       Impact factor: 49.962

3.  Collective dynamics of 'small-world' networks.

Authors:  D J Watts; S H Strogatz
Journal:  Nature       Date:  1998-06-04       Impact factor: 49.962

4.  Nano Josephson superconducting tunnel junctions in YBa2Cu3O(7-δ) directly patterned with a focused helium ion beam.

Authors:  Shane A Cybart; E Y Cho; T J Wong; Björn H Wehlin; Meng K Ma; Chuong Huynh; R C Dynes
Journal:  Nat Nanotechnol       Date:  2015-04-27       Impact factor: 39.213

5.  From quantum matter to high-temperature superconductivity in copper oxides.

Authors:  B Keimer; S A Kivelson; M R Norman; S Uchida; J Zaanen
Journal:  Nature       Date:  2015-02-12       Impact factor: 49.962

6.  Synchronization dynamics on the picosecond time scale in coupled Josephson junction neurons.

Authors:  K Segall; M LeGro; S Kaplan; O Svitelskiy; S Khadka; P Crotty; D Schult
Journal:  Phys Rev E       Date:  2017-03-22       Impact factor: 2.529

7.  Habituation based synaptic plasticity and organismic learning in a quantum perovskite.

Authors:  Fan Zuo; Priyadarshini Panda; Michele Kotiuga; Jiarui Li; Mingu Kang; Claudio Mazzoli; Hua Zhou; Andi Barbour; Stuart Wilkins; Badri Narayanan; Mathew Cherukara; Zhen Zhang; Subramanian K R S Sankaranarayanan; Riccardo Comin; Karin M Rabe; Kaushik Roy; Shriram Ramanathan
Journal:  Nat Commun       Date:  2017-08-14       Impact factor: 14.919

8.  Perovskite neural trees.

Authors:  Hai-Tian Zhang; Tae Joon Park; Ivan A Zaluzhnyy; Qi Wang; Shakti Nagnath Wadekar; Sukriti Manna; Robert Andrawis; Peter O Sprau; Yifei Sun; Zhen Zhang; Chengzi Huang; Hua Zhou; Zhan Zhang; Badri Narayanan; Gopalakrishnan Srinivasan; Nelson Hua; Evgeny Nazaretski; Xiaojing Huang; Hanfei Yan; Mingyuan Ge; Yong S Chu; Mathew J Cherukara; Martin V Holt; Muthu Krishnamurthy; Oleg G Shpyrko; Subramanian K R S Sankaranarayanan; Alex Frano; Kaushik Roy; Shriram Ramanathan
Journal:  Nat Commun       Date:  2020-05-07       Impact factor: 14.919

9.  Design of a Power Efficient Artificial Neuron Using Superconducting Nanowires.

Authors:  Emily Toomey; Ken Segall; Karl K Berggren
Journal:  Front Neurosci       Date:  2019-09-04       Impact factor: 4.677

10.  Physical electro-thermal model of resistive switching in bi-layered resistance-change memory.

Authors:  Sungho Kim; Sae-Jin Kim; Kyung Min Kim; Seung Ryul Lee; Man Chang; Eunju Cho; Young-Bae Kim; Chang Jung Kim; U-In Chung; In-Kyeong Yoo
Journal:  Sci Rep       Date:  2013       Impact factor: 4.379

Cited by (2 in total)

1.  Superconducting disordered neural networks for neuromorphic processing with fluxons.

Authors:  Uday S Goteti; Han Cai; Jay C LeFebvre; Shane A Cybart; Robert C Dynes
Journal:  Sci Adv       Date:  2022-04-22       Impact factor: 14.957

2.  A distributed nanocluster based multi-agent evolutionary network.

Authors:  Liying Xu; Jiadi Zhu; Bing Chen; Zhen Yang; Keqin Liu; Bingjie Dang; Teng Zhang; Yuchao Yang; Ru Huang
Journal:  Nat Commun       Date:  2022-08-10       Impact factor: 17.694

