One of the ultimate goals of electrical and computer engineers is to devise machines that can learn and solve problems like human beings. This goal has already been achieved to a certain extent, as reflected by the growing influence of artificial intelligence (AI) on our daily lives. Recent advances in AI rely mainly on performance gains in computing hardware. However, the rapid growth of information volume in the big-data era calls for new information-processing paradigms, which in turn demand new circuit building blocks to overcome the decreasing cost-effectiveness of transistor scaling and the inefficiency of transistors in non-von Neumann computing architectures.
One such new circuit element is the resistive switch. These devices exhibit tunable resistance owing to underlying mechanisms such as redox reactions, amorphous-crystalline phase transitions, tunnel magnetoresistance, and ferroelectric polarization (including both the tunnel resistance of ferroelectric tunnel junctions and the resistance of ferroelectric domain walls).
Although the resistance changes originate from different underlying physics, resistive switches share common electrical properties, such as representation capability, switching speed and energy, reliability, and device density. This review also compares the pros and cons of each type of resistive switching material.
These electrical properties make resistive switches suitable for applications in both neuromorphic computing and machine learning. The switching dynamics of resistive switches can resemble synaptic and neural dynamics, enabling resistive-switch-based artificial synapses and neurons. Such synapses and neurons constitute spiking neural networks that employ hardware-coded local learning rules. In addition, resistive switches serve as hardware accelerators for machine learning: they store the weights of fully connected, convolutional, and recurrent artificial neural networks and compute the weighted sums for supervised, unsupervised, and reinforcement learning tasks.
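The synaptic analogy above can be sketched in a few lines: identical programming pulses nudge a device's conductance up (potentiation) or down (depression), much like a synaptic weight update. The step size and conductance window below are illustrative assumptions, not measurements from the review.

```python
# Minimal sketch of a resistive-switch "synapse". The conductance window
# (G_MIN, G_MAX) and per-pulse step are assumed values for illustration.
G_MIN, G_MAX = 1e-6, 1e-4   # conductance bounds (siemens)

def apply_pulse(g, potentiate, step=5e-6):
    """One programming pulse nudges the conductance up or down,
    clipped to the device's conductance window."""
    g = g + step if potentiate else g - step
    return min(max(g, G_MIN), G_MAX)

g = 5e-5                     # initial conductance (the "weight")
for _ in range(3):           # three potentiating pulses
    g = apply_pulse(g, potentiate=True)
# g has moved up by three steps toward G_MAX
```

Real devices show nonlinear and asymmetric updates; the linear step here only conveys the basic analogy.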
Resistive switches are also used for general-purpose memcomputing. In analog memcomputing, they accelerate vector-matrix multiplications, as they do for machine learning, which benefits data reduction, linear-system and eigenvector solvers, and combinatorial optimization. In digital memcomputing, binary resistive switches can implement cascaded logic operations while merging logic gates and registers.
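The analog vector-matrix multiplication mentioned above follows from two circuit laws: Ohm's law at each crosspoint and Kirchhoff's current law along each column, so the column currents are the weighted sums in a single step. The array size and conductance range below are illustrative assumptions, not values from the review.

```python
import numpy as np

# Sketch of in-memory vector-matrix multiplication on a resistive crossbar.
rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # crosspoint conductances (siemens)
V = np.array([0.2, 0.1, 0.3, 0.05])       # row input voltages (volts)

# Ohm's law per device plus Kirchhoff's current law per column:
#   I_j = sum_i V_i * G[i, j]
I = V @ G                                 # column output currents (amperes)

# The explicit double sum the crossbar evaluates in parallel:
I_check = np.array([sum(V[i] * G[i, j] for i in range(4)) for j in range(3)])
```

All multiply-accumulate operations happen concurrently in the analog domain, which is the source of the speed and energy advantage over fetching weights from separate memory.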
Resistive switches can also be utilized for security applications, where they are used to construct physical unclonable functions and random number generators by exploiting the intrinsic stochasticity of resistive switching processes.
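The random-number-generation idea can be modeled simply: under a weak programming pulse, a device switches only probabilistically, and each pulse outcome yields one random bit. The 0.5 switching probability and sample count below are illustrative assumptions, not device data from the review.

```python
import numpy as np

# Toy model of a true random number generator built on probabilistic
# resistive switching (p_switch = 0.5 is an assumed, idealized value).
rng = np.random.default_rng(42)

def stochastic_switch_bits(n_bits, p_switch=0.5):
    """Each weak pulse either switches the device (bit 1) or not (bit 0)."""
    return (rng.random(n_bits) < p_switch).astype(int)

bits = stochastic_switch_bits(10_000)
# An unbiased device produces roughly equal counts of 0s and 1s.
```

In practice the switching probability drifts with voltage and temperature, so hardware implementations add debiasing or feedback to keep the output stream uniform.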
This review ends with a summary of the challenges in achieving high-performance information processing, as well as our perspective on future research directions in materials engineering, device optimization, system integration, and algorithm design.