Periodic Reporting for period 2 - Grenadyn (Neural Gradient Evaluation through Nanodevice Dynamics)
Reporting period: 2024-04-01 to 2025-09-30
In this reporting period, we report five major achievements:
1) Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, Julie Grollier, "Unsupervised End-to-End Training with a Self-Defined Target," Neuromorphic Computing and Engineering 4(4), 044005 (2024) arXiv:2403.12116
We demonstrate that a hardware neural network originally designed for supervised learning can also learn in an unsupervised or semi-supervised manner, simply by adding a minimal circuit at its output. This approach is highly promising for edge computing applications, where devices can be trained directly on data sensed from their environment, without the need for labeled examples.
2) Jérémie Laydevant, Danijela Marković, Julie Grollier, "Training an Ising Machine with Equilibrium Propagation," Nature Communications 15, 3671 (2024) arXiv:2305.18321
We show that a hardware Ising machine based on emerging nanodevices—specifically the Josephson junctions used in D-Wave circuits—can be trained using its own physical dynamics. Weight updates are determined by comparing spin values after different phases of energy minimization. This work demonstrates that relatively large-scale systems can learn directly from their physics, opening the door to training Ising machines, which were previously used primarily for solving combinatorial problems with fixed weights.
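The contrastive update described above can be sketched in a few lines. This is an illustrative toy, assuming a simple greedy spin relaxation, a nudging strength `beta`, and a target encoding that are placeholders, not the protocol of the paper:

```python
import numpy as np

# Illustrative Equilibrium Propagation weight update on a small Ising-like
# network. `beta`, the relaxation schedule, and the target encoding are
# assumptions for illustration only.

rng = np.random.default_rng(0)
n = 8
W = rng.normal(0.0, 0.1, (n, n))
W = (W + W.T) / 2.0                      # symmetric couplings
np.fill_diagonal(W, 0.0)

def relax(W, s, h, steps=50):
    """Greedy single-spin updates that lower the Ising energy."""
    s = s.copy()
    for _ in range(steps):
        for i in range(s.size):
            s[i] = 1 if W[i] @ s + h[i] > 0 else -1
    return s

s0 = rng.choice([-1, 1], n)

# Free phase: relax with no nudging field.
s_free = relax(W, s0, np.zeros(n))

# Nudged phase: weakly bias the spins toward a target configuration.
beta = 0.5
target = rng.choice([-1, 1], n)
s_nudged = relax(W, s_free, beta * target)

# EP rule: contrast spin-spin correlations between the two phases.
lr = 0.01
dW = (lr / beta) * (np.outer(s_nudged, s_nudged) - np.outer(s_free, s_free))
np.fill_diagonal(dW, 0.0)
W += dW
```

The key point is that `dW` depends only on spin configurations the hardware itself settles into, so the same physical dynamics that minimize the energy also provide the learning signal.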
3) Tristan da Câmara Santa Clara Gomes, Yanis Sassi, Dédalo Sanz-Hernández, Sachin Krishnia, Sophie Collin, Marie-Blandine Martin, Pierre Seneor, Vincent Cros, Julie Grollier, Nicolas Reyren, "Neuromorphic Weighted Sums with Magnetic Skyrmions," Nature Electronics 8, 204 (2025) arXiv:2310.16909
We present a skyrmionic synapse based on skyrmion nucleation that can perform weighted sum operations—fundamental to all neural network computations—in a biologically inspired manner. Our results show that this approach is scalable and well-suited for self-learning systems, offering a promising path for future neuromorphic hardware.
4) Xing Chen, Dongshu Liu, Jérémie Laydevant, Julie Grollier, "Self-Contrastive Forward-Forward Algorithm," Nature Communications 16, 5978 (2025) arXiv:2409.11593
We show that the Forward-Forward algorithm can achieve state-of-the-art accuracy on unsupervised learning tasks despite its simplicity. This success relies on an elegant, flexible method for generating the “negative examples” required to train such networks—enabling efficient, label-free learning.
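A single Forward-Forward layer update can be sketched as follows. The "goodness" is the sum of squared activations, trained locally to exceed a threshold on positive data and fall below it on negatives; the negatives here are simply feature-shuffled copies of the positives, a placeholder stand-in rather than the self-contrastive negative-generation method of the paper:

```python
import numpy as np

# Minimal single-layer Forward-Forward sketch. Positive inputs are driven to
# high goodness, negatives (here: per-sample feature shuffles, a placeholder)
# to low goodness, with no backpropagation across layers.

rng = np.random.default_rng(1)
d_in, d_out, batch = 16, 8, 32
W = rng.normal(0.0, 0.1, (d_in, d_out))
theta = 2.0                               # goodness threshold
lr = 0.01

def goodness(x, W):
    h = np.maximum(x @ W, 0.0)            # ReLU activations
    return (h ** 2).sum(axis=1), h

x_pos = rng.normal(0.0, 1.0, (batch, d_in))
x_neg = rng.permuted(x_pos, axis=1)       # placeholder negative examples

for _ in range(200):
    for x, sign in ((x_pos, 1.0), (x_neg, -1.0)):
        g, h = goodness(x, W)
        # Logistic loss pushes positive goodness above theta, negative below.
        p = 1.0 / (1.0 + np.exp(-sign * (g - theta)))
        dL_dg = -sign * (1.0 - p)
        W -= lr * x.T @ (2.0 * h * dL_dg[:, None])

g_pos, _ = goodness(x_pos, W)
g_neg, _ = goodness(x_neg, W)
```

Because each layer trains on its own local objective, no error signal needs to travel backward through the network, which is what makes the scheme attractive for physical hardware.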
5) Théophile Rageau, Julie Grollier, "Training and Synchronizing Oscillator Networks with Equilibrium Propagation," Neuromorphic Computing and Engineering (2025) arXiv:2504.11884
We show that Equilibrium Propagation can be applied to coupled oscillator networks, enabling both synchronization and learning to recognize images. This opens the path to training hardware neural networks composed of oscillators with intrinsic frequency dispersion.
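The dynamical substrate can be illustrated with a Kuramoto-style model: coupled phase oscillators with dispersed natural frequencies synchronize once the coupling is strong enough. The parameters (`K`, `dt`, the frequency spread) are illustrative assumptions, and the EP training loop of the paper is not included:

```python
import numpy as np

# Hedged sketch: a Kuramoto network of coupled oscillators with intrinsic
# frequency dispersion. Strong global coupling pulls the phases together.

rng = np.random.default_rng(2)
n = 16
omega = rng.normal(0.0, 0.5, n)          # intrinsic frequency dispersion
K = 4.0                                   # global coupling strength
theta = rng.uniform(0.0, 2.0 * np.pi, n)

dt = 0.01
for _ in range(5000):
    # Each phase is pulled toward the mean field of the others.
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta = theta + dt * (omega + K * coupling)

# Kuramoto order parameter r in [0, 1]: r close to 1 means synchronized.
r = np.abs(np.exp(1j * theta).mean())
```

In an EP-trained oscillator network, the couplings would play the role of trainable weights, so that synchronization and learning emerge from the same physical dynamics.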
We will build high resilience to imperfections into these networks through self-adaptation and digitization. We will demonstrate, through experiments and simulations, that our physical neural networks made of variable elements compute with an accuracy comparable to software neural networks trained with backpropagation. We will produce a chip integrating nanosynaptic devices on CMOS and achieve state-of-the-art recognition rates on AI image benchmarks.
We will enhance the network functionalities by leveraging their dynamical properties through synchronization and time-delayed feedback. Finally, we will extend Grenadyn’s in-materio self-learning to any assembly of coupled dynamical nanodevices, providing novel horizons for multifunctional materials and devices.
Grenadyn’s scientific advances in condensed-matter physics, non-linear dynamics, electronics, and AI will lay the foundations for deep network chips that contain billions of nano-synapses and nano-neurons and self-learn with state-of-the-art accuracy.