
Neural Gradient Evaluation through Nanodevice Dynamics

Periodic Reporting for period 2 - Grenadyn (Neural Gradient Evaluation through Nanodevice Dynamics)

Reporting period: 2024-04-01 to 2025-09-30

The electricity demands of artificial intelligence (AI) prevent training systems on embedded hardware for edge applications, and cause environmental concerns when AI runs in data centers. Neuromorphic computing takes inspiration from the brain to build low-energy hardware for AI. Energy savings by factors of 100 to 1000 can be obtained by incorporating emerging nanodevices as synapses and neurons within standard electronic systems. The issue is that such devices display extremely complex dynamics and are very noisy. The Grenadyn project proposes to exploit these rich dynamics as a principle for computing, with neural trajectories determining synaptic evolution. The project also investigates how biological synapses deal with inherent noise, in order to achieve high accuracy at pattern recognition and classification despite nanodevice imperfections.
The goal of Grenadyn is to develop nanodevice-based dynamical neural networks that self-learn through their physics. The project is divided into three work packages. WP1: creating a self-learning neural network using the dynamics of nanodevices; WP2: building a self-learning system resilient to nanodevice imperfections; WP3: extending the functionalities of self-learning dynamical systems to different types of nanodevices.
In this reporting period, we obtained five major achievements:

1) Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, Julie Grollier, "Unsupervised End-to-End Training with a Self-Defined Target," Neuromorphic Computing and Engineering 4(4), 044005 (2024) arXiv:2403.12116
We demonstrate that a hardware neural network originally designed for supervised learning can also learn in an unsupervised or semi-supervised manner, simply by adding a minimal circuit at its output. This approach is highly promising for edge computing applications, where devices can be trained directly on data sensed from their environment, without the need for labeled examples.
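For illustration, here is a minimal Python sketch of the idea, assuming a winner-take-all rule that turns the network's own strongest output into a one-hot target; the layer sizes, learning rate, and function names are placeholders, not the circuit of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-layer softmax classifier; sizes are placeholders.
W = rng.normal(scale=0.1, size=(10, 784))   # 10 output units, 784-pixel inputs

def forward(x):
    z = W @ x
    e = np.exp(z - z.max())                 # numerically stable softmax
    return e / e.sum()

def self_defined_target(p):
    """Winner-take-all: the network's own strongest output becomes
    the one-hot target, so no external label is required."""
    t = np.zeros_like(p)
    t[np.argmax(p)] = 1.0
    return t

def unsupervised_step(x, lr=0.01):
    global W
    p = forward(x)
    t = self_defined_target(p)              # target defined by the network itself
    W += lr * np.outer(t - p, x)            # softmax cross-entropy gradient

unsupervised_step(rng.random(784))
```

The key design point is that the extra circuitry is minimal: only the target-generation step differs from a supervised pipeline, so the same hardware can switch between supervised and unsupervised operation.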

2) Jérémie Laydevant, Danijela Marković, Julie Grollier, "Training an Ising Machine with Equilibrium Propagation," Nature Communications 15, 3671 (2024) arXiv:2305.18321
We show that a hardware Ising machine based on emerging nanodevices—specifically the Josephson junctions used in D-Wave circuits—can be trained using its own physical dynamics. Weight updates are determined by comparing spin values after different phases of energy minimization. This work demonstrates that relatively large-scale systems can learn directly from their physics, opening the door to training Ising machines, which were previously used primarily for solving combinatorial problems with fixed weights.
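As an illustration, the following Python toy model shows the Equilibrium Propagation update rule in which couplings change according to the difference of spin-spin correlations between the free and nudged phases. A greedy single-spin-flip minimizer stands in for the hardware's energy minimization, and the network size, fields, and hyperparameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8                                        # toy network size
J = rng.normal(scale=0.1, size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

def minimize(s, J, h, sweeps=100):
    """Greedy single-spin flips until the Ising energy stops decreasing
    (a stand-in for the hardware's energy minimization)."""
    for _ in range(sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if J[i] @ s + h[i] > 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

h = 0.1 * rng.choice([-1, 1], size=n)        # input encoded as local fields (toy)
target = np.array([1, 1])                    # desired states of two output spins
beta, lr = 0.5, 0.05

# Free phase: relax with the inputs only.
s_free = minimize(rng.choice([-1, 1], size=n), J, h)

# Nudged phase: pull the output spins toward the target with strength beta.
h_nudged = h.copy()
h_nudged[-2:] += beta * target
s_nudged = minimize(s_free.copy(), J, h_nudged)

# Equilibrium Propagation update: difference of spin-spin correlations.
J += (lr / beta) * (np.outer(s_nudged, s_nudged) - np.outer(s_free, s_free))
np.fill_diagonal(J, 0)
```

Because the update needs only the spin states measured after each minimization, the learning rule uses exactly the quantities the hardware already produces.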

3) Tristan da Câmara Santa Clara Gomes, Yanis Sassi, Dédalo Sanz-Hernández, Sachin Krishnia, Sophie Collin, Marie-Blandine Martin, Pierre Seneor, Vincent Cros, Julie Grollier, Nicolas Reyren, "Neuromorphic Weighted Sums with Magnetic Skyrmions," Nature Electronics 8, 204 (2025) arXiv:2310.16909
We present a skyrmionic synapse based on skyrmion nucleation that can perform weighted sum operations—fundamental to all neural network computations—in a biologically inspired manner. Our results show that this approach is scalable and well-suited for self-learning systems, offering a promising path for future neuromorphic hardware.
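A hedged Python sketch of one possible encoding follows, assuming each weight is set by the number of nucleated skyrmions and each skyrmion contributes a fixed signal quantum; the constant SIGNAL_QUANTUM and the variability figures are illustrative, not measured device values:

```python
import numpy as np

rng = np.random.default_rng(2)

SIGNAL_QUANTUM = 0.05        # assumed per-skyrmion contribution, arbitrary units
N_SYNAPSES = 16

# Each weight is the number of skyrmions nucleated in the corresponding track.
skyrmion_counts = rng.integers(0, 20, size=N_SYNAPSES)
weights = skyrmion_counts * SIGNAL_QUANTUM

x = rng.random(N_SYNAPSES)   # input activations

# The weighted sum the device array performs physically.
y = weights @ x

# Device-to-device variability: each skyrmion's contribution jitters slightly,
# the kind of imperfection a self-learning system must absorb.
noisy = skyrmion_counts * rng.normal(SIGNAL_QUANTUM, 0.005, size=N_SYNAPSES)
y_noisy = noisy @ x
```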

4) Xing Chen, Dongshu Liu, Jérémie Laydevant, Julie Grollier, "Self-Contrastive Forward-Forward Algorithm," Nature Communications 16, 5978 (2025) arXiv:2409.11593
We show that the Forward-Forward algorithm can achieve state-of-the-art accuracy on unsupervised learning tasks despite its simplicity. This success relies on an elegant, flexible method for generating the “negative examples” required to train such networks—enabling efficient, label-free learning.
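Below is a minimal Python sketch of one layer trained this way, assuming the standard Forward-Forward "goodness" objective (sum of squared activities) and a self-contrastive pairing rule in which a sample paired with itself serves as a positive example and two different samples as a negative; the layer size, threshold, and details of the pairing are placeholders rather than the paper's exact recipe:

```python
import numpy as np

rng = np.random.default_rng(3)

D = 784                                        # per-sample input size (placeholder)
W = rng.normal(scale=0.01, size=(128, 2 * D))  # the layer sees concatenated pairs
theta = 10.0                                   # goodness threshold (placeholder)

def layer(x):
    return np.maximum(0.0, W @ x)              # ReLU layer

def goodness(h):
    return np.sum(h ** 2)                      # Forward-Forward "goodness"

def scff_step(x1, x2, lr=1e-3):
    """One Forward-Forward update with self-generated examples:
    a sample paired with itself is positive, two different samples negative."""
    global W
    pos = np.concatenate([x1, x1])
    neg = np.concatenate([x1, x2])
    for x, sign in ((pos, +1.0), (neg, -1.0)):
        h = layer(x)
        g = goodness(h)
        # Logistic loss on (goodness - theta), pushed up or down by `sign`.
        coeff = sign / (1.0 + np.exp(sign * (g - theta)))
        W += lr * coeff * np.outer(2.0 * h, x)  # dg/dW through the ReLU

scff_step(rng.random(D), rng.random(D))
```

Since each layer is trained by its own local objective, no backward pass through the network is needed, which is what makes the approach attractive for physical hardware.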

5) Théophile Rageau, Julie Grollier, "Training and Synchronizing Oscillator Networks with Equilibrium Propagation," Neuromorphic Computing and Engineering (2025) arXiv:2504.11884
We show that Equilibrium Propagation can be applied to coupled oscillator networks, enabling both synchronization and learning to recognize images. This opens the path to training hardware neural networks composed of oscillators with intrinsic frequency dispersion.
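For illustration, here is a toy Python model of Equilibrium Propagation on a small Kuramoto oscillator network with intrinsic frequency dispersion; the couplings are updated from the change in pairwise phase alignment between the free and nudged steady states (sizes, targets, and hyperparameters are invented, and the simple integrator stands in for the physical dynamics):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
omega = rng.normal(0.0, 0.05, n)         # intrinsic frequency dispersion
K = rng.normal(0.0, 0.2, (n, n))
K = (K + K.T) / 2
np.fill_diagonal(K, 0)

def relax(theta, K, nudge=None, beta=0.0, steps=2000, dt=0.05):
    """Integrate Kuramoto dynamics toward a steady state; optionally pull
    the two output phases toward target values with strength beta."""
    theta = theta.copy()
    for _ in range(steps):
        dtheta = omega + (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        if nudge is not None:
            dtheta[-2:] += beta * np.sin(nudge - theta[-2:])
        theta += dt * dtheta
    return theta

target = np.array([0.0, np.pi])          # desired output phases (toy)
beta, lr = 0.2, 0.1

th_free = relax(rng.uniform(0, 2 * np.pi, n), K)
th_nudged = relax(th_free, K, nudge=target, beta=beta)

# EP update: the energy is -sum_ij K_ij cos(theta_i - theta_j), so learning
# follows the change in pairwise phase alignment between the two phases.
def alignment(th):
    return np.cos(th[:, None] - th[None, :])

K += (lr / beta) * (alignment(th_nudged) - alignment(th_free))
np.fill_diagonal(K, 0)
```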
We will assemble memristive as well as spintronic nanocomponents in neural networks that perform pattern recognition through Equilibrium Propagation. We will show that these dynamical networks learn by nudging their outputs towards the desired solution through a spring-like force, and letting nano-synapses and neurons reorganize themselves towards equilibrium. We will show that they can also learn directly from the data, without supervision.
We will induce a high resilience to imperfections in these networks through self-adaptation and digitization. We will demonstrate by experiments and simulations that our physical neural networks made of variable elements compute with an accuracy similar to software neural networks trained with backpropagation. We will produce a chip integrating nanosynaptic devices on CMOS and achieve state-of-the-art recognition rates on AI image benchmarks.
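As one illustration of how digitization can coexist with learning, the following hedged Python sketch uses stochastic rounding onto a limited number of device levels, so that updates smaller than one programming step still accumulate on average; the number of levels, weight range, and toy task are assumptions for illustration, not the project's chip design:

```python
import numpy as np

rng = np.random.default_rng(5)

LEVELS = 16                  # assumed number of programmable device states
W_MAX = 1.0                  # assumed weight range [-W_MAX, +W_MAX]

def program_stochastic(w):
    """Round weights onto the device levels stochastically, so updates
    smaller than one programming step still accumulate on average."""
    step = 2 * W_MAX / (LEVELS - 1)
    scaled = (np.clip(w, -W_MAX, W_MAX) + W_MAX) / step
    low = np.floor(scaled)
    up = rng.random(w.shape) < (scaled - low)   # round up with prob = fraction
    return (low + up) * step - W_MAX

# Toy task: digitized devices learn to reproduce a target linear map.
target_W = rng.normal(scale=0.3, size=(2, 4))
W = np.zeros((2, 4))
for _ in range(2000):
    x = rng.random(4)
    err = target_W @ x - W @ x                  # toy error signal
    W = program_stochastic(W + 0.05 * np.outer(err, x))
```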
We will enhance the network functionalities by leveraging their dynamical properties through synchronization and time-delayed feedback. Finally, we will extend Grenadyn’s in-materio self-learning to any assembly of coupled dynamical nanodevices, providing novel horizons for multifunctional materials and devices.
Grenadyn’s scientific advances in condensed-matter physics, non-linear dynamics, electronics, and AI will lay the foundations for deep network chips that contain billions of nano-synapses and nano-neurons and self-learn with state-of-the-art accuracy.