
Neural Gradient Evaluation through Nanodevice Dynamics

Periodic Reporting for period 1 - Grenadyn (Neural Gradient Evaluation through Nanodevice Dynamics)

Reporting period: 2022-10-01 to 2024-03-31

The electricity demands of artificial intelligence (AI) prevent training systems on embedded hardware for edge applications and raise environmental concerns in data centers. Neuromorphic computing takes inspiration from the brain to build low-energy hardware for AI. Energy savings by factors of 100 to 1000 can be obtained by incorporating emerging nanodevices as synapses and neurons within standard electronic systems. The issue is that such devices display extremely complex dynamics and are very noisy. The Grenadyn project proposes to exploit these rich dynamics as a principle for computing, with neural trajectories determining synaptic evolution. The project also looks into how biological synapses deal with inherent noise, in order to achieve high accuracy in pattern recognition and classification despite nanodevice imperfections.
The goal of Grenadyn is to develop nanodevice-based dynamical neural networks that self-learn through their physics. The project is divided into three work packages:
WP1: creating a self-learning neural network using the dynamics of nanodevices;
WP2: building a self-learning system resilient to nanodevice imperfections;
WP3: extending the functionalities of self-learning dynamical systems to different types of nanodevices.
In this first period, we have two major achievements:
1) We have demonstrated that a hardware system, the D-Wave Ising machine, composed of thousands of interconnected spin qubits, can be trained through the intrinsic dynamics of its superconducting components to recognize handwritten digits with state-of-the-art accuracy (see the illustrative sketch after this list):
Jérémie Laydevant, Danijela Marković & Julie Grollier, Training an Ising machine with equilibrium propagation, Nature Communications 15, 3671 (2024), https://arxiv.org/abs/2305.18321
2) We have devised an algorithm compatible with such dynamical systems that allows easy switching from supervised to unsupervised learning on the exact same hardware, with only minor changes to the output layer (caricatured in the second sketch below). This milestone (M1.5, due in month 24), achieved ahead of schedule, will allow the systems that we develop to make sense of both labelled and unlabelled data, an important feature for edge AI. The paper is under review:
Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, Julie Grollier, Unsupervised End-to-End Training with a Self-Defined Target, https://arxiv.org/abs/2403.12116
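As a rough, self-contained illustration of the first achievement, the toy sketch below applies Equilibrium Propagation to a small simulated Ising-style network rather than to the D-Wave hardware itself: a free relaxation, a relaxation nudged towards the target, and a weight update proportional to the difference of spin correlations between the two phases. The layer sizes, the deterministic tanh relaxation, and all hyperparameters are assumptions made for the example, not the settings of the paper.

```python
# Minimal Equilibrium Propagation sketch on a simulated Ising-style network.
# Illustrative toy only, not the D-Wave procedure of the paper: sizes, the
# deterministic spin updates and the nudging cost are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, n_out = 16, 32, 4           # toy layer sizes (assumed)
n = n_visible + n_hidden + n_out
W = 0.01 * rng.standard_normal((n, n))
W = (W + W.T) / 2                                # symmetric Ising-style couplings
np.fill_diagonal(W, 0.0)
beta, lr, steps = 0.5, 0.05, 30                  # nudging strength, learning rate
out = slice(n - n_out, n)                        # output spins

def relax(s, x, target=None):
    """Iterate spins toward an energy minimum; clamp inputs, optionally nudge outputs."""
    s = s.copy()
    s[:n_visible] = x                            # clamp visible spins to the input
    for _ in range(steps):
        field = W @ s                            # local field from the couplings
        if target is not None:
            # spring-like nudge pulling the output spins toward the target
            field[out] += beta * (target - s[out])
        s[n_visible:] = np.tanh(field[n_visible:])  # soft spin update (assumption)
        s[:n_visible] = x
    return s

def eqprop_step(x, target):
    """One contrastive update: free phase, nudged phase, Hebbian difference."""
    global W
    s_free = relax(np.zeros(n), x)               # free phase (no nudge)
    s_nudged = relax(s_free, x, target)          # nudged phase, started from the free state
    dW = lr / beta * (np.outer(s_nudged, s_nudged) - np.outer(s_free, s_free))
    np.fill_diagonal(dW, 0.0)
    W += (dW + dW.T) / 2
    return s_free[out]

# toy usage: one random pattern, one one-hot target
x = rng.choice([-1.0, 1.0], size=n_visible)
t = np.zeros(n_out); t[1] = 1.0
for _ in range(50):
    y = eqprop_step(x, t)
print("output spins after training:", np.round(y, 2))
```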
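The second achievement can be caricatured as changing only how the output target is defined while keeping the learning rule untouched. The snippet below is a hypothetical illustration of such a switch; the winner-take-all self-defined target used here is an assumption for the example and does not reproduce the specific rule of the paper.

```python
# Hypothetical sketch: switch between supervised and unsupervised learning by
# changing only how the output target is defined (details differ from the paper).
import numpy as np

def make_target(output, label=None, n_classes=10):
    """Return the target used to nudge the output layer."""
    target = np.zeros(n_classes)
    if label is not None:
        target[label] = 1.0                      # supervised: the label is the target
    else:
        # unsupervised: the network defines its own target from its current output,
        # here a simple winner-take-all readout (an assumption, not the paper's rule)
        target[np.argmax(output)] = 1.0
    return target

# The same training step (e.g. the eqprop_step sketched above) is then nudged toward
# make_target(y, label) for labelled data and make_target(y) for unlabelled data.
```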
We will assemble memristive as well as spintronic nanocomponents into neural networks that perform pattern recognition through Equilibrium Propagation. We will show that these dynamical networks learn by nudging their outputs towards the desired solution through a spring-like force, and by letting nano-synapses and nano-neurons reorganize themselves towards equilibrium. We will show that they can also learn directly from the data, without supervision.
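In Equilibrium Propagation terms, the spring-like nudge corresponds to adding a small cost term to the physical energy of the network and letting the dynamics relax under the combined landscape. The notation below follows the standard formulation of the method and is given for illustration only.

```latex
% Schematic Equilibrium Propagation formulation (illustrative notation).
% E(s,W): physical energy of the coupled nano-neurons and nano-synapses.
% C(s) = (1/2) || s_out - t ||^2 : cost attached to the output units.
% beta: small nudging strength; s^0, s^beta: free and nudged equilibria.
F_\beta(s, W) = E(s, W) + \beta\, C(s),
\qquad
\frac{\mathrm{d}s}{\mathrm{d}t} = -\frac{\partial F_\beta}{\partial s}
\quad\text{(the output units feel the spring-like force } \beta\,(t - s_{\mathrm{out}})\text{)},
\qquad
\Delta W \;\propto\; \frac{1}{\beta}
\left( \left.\frac{\partial E}{\partial W}\right|_{s^{0}}
     - \left.\frac{\partial E}{\partial W}\right|_{s^{\beta}} \right).
```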
We will induce a high resilience to imperfections in these networks through self-adaptation and digitization. We will demonstrate by experiments and simulations that our physical neural networks made of variable elements compute with an accuracy similar to software neural networks trained with backpropagation. We will produce a chip integrating nanosynaptic devices on CMOS and achieve state-of-the-art recognition rates on AI image benchmarks.
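As an indication of the kind of simulation this involves, the hypothetical sketch below perturbs a toy weight matrix with device-to-device variability, digitizes it to a few levels, and compares the resulting classification accuracies. The noise model, bit width, and linear readout are all assumptions made for illustration, not Grenadyn's actual devices or networks.

```python
# Hypothetical resilience test: inject device variability and digitize weights,
# then compare accuracy (noise model, bit width and network are all assumptions).
import numpy as np

rng = np.random.default_rng(1)

def accuracy(W, X, labels):
    """Accuracy of a toy linear readout y = argmax(W x)."""
    preds = np.argmax(X @ W.T, axis=1)
    return np.mean(preds == labels)

def perturb(W, variability=0.1):
    """Multiplicative device-to-device variability (assumed Gaussian spread)."""
    return W * (1.0 + variability * rng.standard_normal(W.shape))

def digitize(W, bits=4):
    """Quantize weights to 2**bits uniform levels between their min and max."""
    levels = 2 ** bits
    lo, hi = W.min(), W.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((W - lo) / step) * step

# toy data and toy "trained" weights, for illustration only
X = rng.standard_normal((500, 64))
W_true = rng.standard_normal((10, 64))
labels = np.argmax(X @ W_true.T, axis=1)         # ideal readout is 100% by construction

print("ideal     :", accuracy(W_true, X, labels))
print("variable  :", accuracy(perturb(W_true), X, labels))
print("digitized :", accuracy(digitize(perturb(W_true)), X, labels))
```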
We will enhance the network functionalities by leveraging their dynamical properties through synchronization and time-delayed feedback. Finally, we will extend Grenadyn's in-materio self-learning to any assembly of coupled dynamical nanodevices, providing novel horizons for multifunctional materials and devices.
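As a generic hint of what synchronization and time-delayed feedback can look like in coupled dynamical elements, here is a standard Kuramoto-style sketch with a delayed feedback term. The model and every parameter in it are illustrative assumptions, not Grenadyn's nanodevices.

```python
# Generic sketch of synchronization with time-delayed feedback in coupled
# phase oscillators (Kuramoto-style); model and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, K, delay_steps, dt, T = 8, 1.5, 50, 0.01, 4000
omega = rng.normal(1.0, 0.1, N)                   # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)              # initial phases
history = [theta.copy() for _ in range(delay_steps)]  # buffer for delayed phases
feedback_gain = 0.5

for step in range(T):
    theta_delayed = history[step % delay_steps]   # phases from delay_steps ago
    coupling = K / N * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    feedback = feedback_gain * np.sin(theta_delayed - theta)  # delayed self-feedback
    theta = theta + dt * (omega + coupling + feedback)
    history[step % delay_steps] = theta.copy()

# order parameter r close to 1 indicates synchronization of the oscillators
r = np.abs(np.mean(np.exp(1j * theta)))
print("Kuramoto order parameter r =", round(float(r), 3))
```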
Grenadyn’s scientific advances in condensed-matter physics, non-linear dynamics, electronics, and AI will lay the foundations for deep network chips that contain billions of nano-synapses and nano-neurons and self-learn with state-of-the-art accuracy.