Optogenetics shines light on Transition Metal Dichalcogenide-based Artificial Neural Networks

Can we devise a general strategy to increase the bit precision of memristive devices? Here we demonstrate a universal approach for extracting high-precision weights from transition metal dichalcogenide artificial neural elements via a combination of light and electrical pulses.

The success of deep learning in applications such as image classification and natural language processing (NLP) has spurred renewed interest in artificial intelligence. As a first wave of neuromorphic hardware, memristive devices have emerged as promising alternatives to traditional von Neumann computing architectures for vector-matrix multiplication, with advantages in scalability, cost, and power consumption. However, most memristor demonstrations offer only limited-precision weights and can therefore support only shallow feed-forward networks. These are incapable of addressing applications such as speech recognition and NLP, which require learning of temporal signals, recurrent connections, and deeper architectures (> 10 hidden layers), as in deep recurrent neural networks (DRNNs). Hence, we need neuromorphic devices with high-precision weights to advance beyond simple pattern matching to complex cognitive tasks such as speech recognition.

Figure 1. (a) Photo-modulated spike-timing-dependent plasticity (STDP) learning rules and (b) integrate-and-fire behaviour in PENs. (c) Spatiotemporally selective perturbation of PENs.

Inspired by optogenetics, a photo-stimulated neuromodulation technique that uses optical pulses to control biological neurons, we demonstrate transition metal dichalcogenide (TMDC) photo-excitable neuristors (PENs) with high weight precision as a second wave of neuromorphic devices for advanced in-memory computing. Using a combination of optical and electrical inputs, we probe computationally relevant properties such as spike-timing-dependent plasticity (STDP) in our synaptic transistors and realize integrate-and-fire (I&F) neuron circuits with high modulability and spatiotemporal selectivity (Figure 1). Most importantly, by carefully controlling the gating parameters of our PENs (the gate voltage, the initial conductance state, the drain voltage, and the light-illumination intensity), we extract excellent conductance linearity from these devices with high bit precision (~10-bit), a high signal-to-noise ratio (as high as 77), and low write noise. The proposed PEN features an order-of-magnitude higher linear dynamic range (LDR) than other recent state-of-the-art reports, enabling us to simulate a DRNN for speech recognition with an order of magnitude more parameters than typical digit-recognition networks (Figure 2).
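To make the pulse-programming idea concrete, here is a minimal, purely illustrative model of a linear conductance window discretized into 2^10 states, potentiated one step per optical pulse and depressed one step per electrical pulse, with Gaussian write noise scaled from an SNR of ~77. All numeric values (conductance window, step size, noise magnitude) are assumptions for illustration, not the measured parameters of our PENs.

```python
import numpy as np

# Illustrative model, not the measured device: a linear conductance
# window split into 2**10 states, raised by optical pulses and
# lowered by electrical pulses, with small Gaussian write noise.
N_BITS = 10                       # assumed ~10-bit weight precision
N_STATES = 2 ** N_BITS
G_MIN, G_MAX = 1e-9, 1e-6         # hypothetical conductance window (S)
STEP = (G_MAX - G_MIN) / (N_STATES - 1)
WRITE_NOISE = STEP / 77           # noise scaled from an SNR of ~77

rng = np.random.default_rng(0)

def potentiate(g, n_pulses=1):
    """Each optical pulse raises conductance by one linear step."""
    g = g + n_pulses * STEP + rng.normal(0.0, WRITE_NOISE)
    return min(g, G_MAX)

def depress(g, n_pulses=1):
    """Each electrical pulse lowers conductance by one linear step."""
    g = g - n_pulses * STEP + rng.normal(0.0, WRITE_NOISE)
    return max(g, G_MIN)

g = G_MIN
for _ in range(512):              # 512 optical pulses -> mid-range state
    g = potentiate(g)
print(f"programmed state ~ {round((g - G_MIN) / STEP)} of {N_STATES - 1}")
```

Because the update is linear and the write noise is a small fraction of one step, repeated blind pulses land close to the intended state, which is what makes high-precision weight programming possible in this picture.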

Figure 2. (a) Optically addressable multi-level memory for DRNNs with optical potentiation and electrical depression. (b) Simulation of a DRNN for keyword recognition.
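A schematic sketch of how a trained DRNN layer could be transferred onto such multi-level devices: each software weight is stored as the difference of two quantized conductances (a standard differential-pair encoding, assumed here for illustration and not taken from our simulator), and the mapping error is bounded by half a device step.

```python
import numpy as np

# Schematic weight transfer (assumed encoding, not the paper's code):
# each weight is stored as a difference of two device conductances
# (G+ - G-), each quantized to one of 2**10 linear states.
N_STATES = 2 ** 10
rng = np.random.default_rng(1)

def to_devices(w, w_max):
    """Map weights in [-w_max, w_max] to a differential conductance pair."""
    gp = np.clip(w, 0.0, w_max) / w_max        # positive part -> G+
    gn = np.clip(-w, 0.0, w_max) / w_max       # negative part -> G-
    # quantize each normalized conductance to the nearest device state
    gp = np.round(gp * (N_STATES - 1)) / (N_STATES - 1)
    gn = np.round(gn * (N_STATES - 1)) / (N_STATES - 1)
    return (gp - gn) * w_max

w = rng.uniform(-1.0, 1.0, size=(64, 64))      # toy recurrent weight matrix
w_hw = to_devices(w, w_max=1.0)
print("max mapping error:", np.abs(w - w_hw).max())
```

With ~10-bit states the worst-case mapping error per weight is half a step (about 5e-4 of the weight range), small enough that a recurrent network's accuracy survives the transfer, which is the premise of the DRNN simulation in Figure 2b.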

In summary, the concept of using blind linear photo-updates to selectively manipulate artificial neural circuitry is, to our knowledge, the first of its kind and represents a significant advance. The proposed optoelectronic architecture can be generalized to a wide variety of semiconducting platforms (e.g., III-V semiconductors, TMDCs, halide perovskites, and organic semiconductors) and to wavelength-division multiplexing schemes, opening up new possibilities with improved scalability and CMOS compatibility. In short, we believe our work paves the way for new memristive devices with high bit precision that exploit both optical and electrical inputs for advanced cognitive tasks, making this report of interest to wide scientific audiences in materials science, computing, applied physics, and electrical engineering.


Please check out our recent work published in Nature Communications: “Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks”, DOI: 10.1038/s41467-020-16985-0.
