An Information Theory Inspired Study of Memristor Devices and their Potential Use in Neuromorphic Circuits


Title
An Information Theory Inspired Study of Memristor Devices and their Potential Use in Neuromorphic Circuits

CoPED ID
c9ee4ed9-0c33-40b2-a35c-4932f32a57ee

Status
Active

Funders

Value
No funds listed.

Start Date
Sept. 30, 2019

End Date
June 29, 2023

Description


This proposed project is aligned with the EPSRC Engineering theme and is best categorised under the Artificial Intelligence Technologies research area. It aims to explore, from a communications and information-theoretic perspective, the potential use of memristor devices as synaptic circuit elements in neuromorphic circuits, inspired by Friston's Free Energy Principle as an explanation of the function of the brain [1]. Memristors are a class of devices whose resistance can be modulated by an applied voltage; they were formalised as a new two-terminal circuit element by Leon Chua in 1971, completing the set of four ideal passive circuit elements [2]. I would like to investigate, through simulation and, if possible, practical demonstration, whether memristors can implement the processes described by the theory. Some forms of memristor are good candidates for mimicking binary activation units with nonlinear thresholding (such as neurons in the human brain) for use in novel neural architectures.
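To make the device behaviour concrete, the following is a minimal sketch of the widely used linear ion-drift memristor model (Strukov et al., 2008): the resistance depends on the width of a doped region, which drifts under applied voltage. All parameter values here are illustrative, not measurements from any particular device.

```python
import numpy as np

def simulate_memristor(v, dt=1e-6, R_on=100.0, R_off=16e3,
                       D=10e-9, mu_v=1e-14, w0=0.5):
    """Linear ion-drift memristor model (illustrative parameters).

    v  : sequence of applied voltages over time (V)
    w0 : initial normalised doped-region width in [0, 1]
    Returns the memristance M(t) at each time step.
    """
    w = w0 * D                      # doped-region width (m)
    M_hist = []
    for vk in v:
        M = R_on * (w / D) + R_off * (1 - w / D)   # two regions in series
        i = vk / M                                 # Ohm's law
        w += mu_v * (R_on / D) * i * dt            # linear ion drift
        w = min(max(w, 0.0), D)                    # keep width in bounds
        M_hist.append(M)
    return np.array(M_hist)

# A positive pulse train lowers the resistance; a negative one raises it.
pulse = np.concatenate([np.full(1000, 1.0), np.full(1000, -1.0)])
M = simulate_memristor(pulse)
```

The key property for synaptic use is visible in `M`: the resistance is a function of the charge that has flowed through the device, so it acts as an analogue memory of its own input history.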

I have previously explored the potential use of memristors as storage devices, modelling them as communication channels using a Generative Adversarial Network (GAN) and using an autoencoder architecture to compress and transmit data over the noisy memristor channel: a technique known as Deep Joint Source-Channel Coding. Such deep learning techniques can be extended over the course of the project to model the devices and their non-idealities, such as imperfect conductance values after programming and resistance drift over time.
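The channel view of a memristor can be sketched very simply: writing data programs a target conductance level, and reading it back returns a noisy value. The sketch below uses a hypothetical four-level cell with Gaussian programming noise and minimum-distance detection; the level set and noise standard deviation are assumptions for illustration, not properties of a real device.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical device parameters: four programmable conductance
# levels (arbitrary units) and Gaussian programming noise.
levels = np.array([0.1, 0.4, 0.7, 1.0])
sigma_prog = 0.05

def write_read(symbols):
    """Treat the memristor as a noisy channel: program a level, read it back."""
    g_target = levels[symbols]
    return g_target + rng.normal(0.0, sigma_prog, size=g_target.shape)

def decode(g_read):
    """Minimum-distance detection against the known level set."""
    return np.argmin(np.abs(g_read[:, None] - levels[None, :]), axis=1)

tx = rng.integers(0, 4, size=10_000)   # 2 bits stored per cell
rx = decode(write_read(tx))
ser = np.mean(tx != rx)                # symbol error rate
```

A learned encoder/decoder pair, as in Deep Joint Source-Channel Coding, would replace the fixed level set and detector above with networks trained end-to-end through a differentiable model of this channel.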

The idea is to create a neuromorphic architecture that uses control and optimisation rules to learn. These rules will come from the dynamics of a physical system rather than from programmed rules, and will be implemented through negative feedback of an error function in an electronic circuit. This method of optimisation requires no explicit gradient computation, in contrast to backpropagation, in which the gradient of a loss function with respect to the parameters is computed explicitly, as the overwhelming majority of current neural network architectures in machine learning and deep learning do.
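One way to picture feedback-driven adaptation without symbolic gradient computation is a single linear unit whose weights are driven directly by a locally measured error signal, as an analogue feedback circuit could realise. This is an illustrative sketch only (the target mapping, gain, and update rule are assumptions), not the project's proposed architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative only: the update uses just locally available signals
# (the output error and the local input), the kind of quantity a
# negative-feedback circuit can produce with analogue components.
w_true = np.array([0.8, -0.3])      # hypothetical target mapping
w = np.zeros(2)                     # adaptable "conductances"
eta = 0.05                          # feedback gain

for _ in range(2000):
    x = rng.normal(size=2)          # input signal
    y_ref = w_true @ x              # reference (teacher) signal
    err = y_ref - w @ x             # error measured at the output node
    w += eta * err * x              # negative feedback drives err -> 0
```

No loss function is differentiated anywhere; the feedback loop simply pushes the measured error toward zero, and the weights settle at the target mapping as a side effect of the circuit dynamics.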

Current memristive neural networks have attempted to translate the backpropagation algorithm into hardware, using Ohm's law for multiplication and Kirchhoff's current law for addition. They do, however, demonstrate another advantage: reduced power consumption and increased speed. Many machine learning algorithms run on GPU hardware, with the network weights held in an external memory. For memristive neurons, by contrast, processing and storage of the network weights are co-located, owing to a fundamental shift in architecture: a memristor stores its own weight as the resistance value it retains after being programmed by a voltage or a current. Processing and storage therefore both happen "in situ", in the same location. This is a move away from the traditional von Neumann architecture of computers (separate storage and processor) towards architectures that function more like the biological, spiking networks found in the human brain, and it reduces the power and time spent transferring data between a processing unit and a storage device.
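The Ohm's-law/Kirchhoff's-law computation can be written down directly. In an ideal crossbar (ignoring wire resistance and sneak paths), each crosspoint conductance multiplies its row voltage, and each column wire sums the resulting currents, so the matrix-vector product happens exactly where the weights are stored. The conductance and voltage values below are arbitrary illustrative numbers.

```python
import numpy as np

# Weight matrix stored as crosspoint conductances G (siemens):
# rows are input lines, columns are output lines.
G = np.array([[0.2, 0.5],
              [0.1, 0.3],
              [0.4, 0.6]])

# Input voltages applied to the rows (volts).
v = np.array([1.0, 0.5, -0.5])

# Ohm's law gives per-crosspoint currents G[r, c] * v[r];
# Kirchhoff's current law sums each column into one output current.
i = G.T @ v
```

Here `i` is computed in a single physical step by the array itself; a digital system would instead fetch every weight from memory to perform the same multiply-accumulate.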
[1] K. Friston, "The free-energy principle: a unified brain theory?," Nature Reviews Neuroscience, vol. 11, pp. 127-138, 2010.
[2] L. Chua, "Memristor - the missing circuit element," IEEE Transactions on Circuit Theory, vol. 18, no. 5, pp. 507-519, 1971.

Supervisor: Deniz Gunduz
Student: Waleed El-Geresy

Subjects by relevance
  1. Machine learning
  2. Artificial intelligence
  3. Neural networks (information technology)
  4. Optimisation
  5. Data communications networks
  6. Memristors
  7. Architecture
  8. Algorithms
  9. Brain
  10. Computers

Extracted key phrases
  1. Information theory
  2. Current neural network architecture
  3. Unified brain theory
  4. Potential use
  5. Ideal passive circuit element
  6. Memristor device
  7. Synaptic circuit element
  8. Storage device
  9. Terminal circuit element
  10. Current memristive neural network
  11. Novel neural architecture
  12. Neuromorphic circuit
  13. Neuromorphic architecture
  14. Noisy memristor channel
  15. Artificial Intelligence Technologies Research Area

Related Pages

UKRI project entry
