Periodic Reporting for period 4 - NANOINFER (Intelligent Memories that Perform Inference with the Physics of Nanodevices)
Reporting period: 2021-09-01 to 2023-08-31
As the intelligent memories developed within NANOINFER are very low power, they can be used directly within smart devices, reducing those devices' reliance on data centers. This can bring massive energy savings by avoiding the energy costs associated with communication between smart devices and data centers. Major applications include systems that process sensory-motor information and systems that fuse information from diverse sensors and/or prior knowledge, such as smart sensors, biomedical chips, and vehicle driver assistance or automation. The possibility of learning within the intelligent memory can also make smart devices highly adaptable.
We designed complete natively intelligent memories associating nanodevices with conventional transistors. This design work involved the co-design of adapted machine-learning architectures, circuits, and systems, while simultaneously testing the appropriateness of the ideas on real arrays of nanodevices. We focused on three concepts:
- Bayesian machines that perform Bayesian inference
- binarized neural networks, which are specially adapted to natively intelligent memories
- and Bayesian neural networks, which provide uncertainty quantification.
These designs achieve outstanding energy efficiency through three mechanisms: they collocate logic and memory, requiring very little data movement within the system; they rely on simplified arithmetic; and they exploit nanodevices in adapted regimes where they are not deterministic. Using a hybrid CMOS/memristor technology, we fabricated multiple demonstrators of natively intelligent memories, incorporating between 1,024 and 16,384 memristors, which demonstrated the viability of the natively intelligent memory concept and gave the project high visibility. The demonstrators can be programmed with proof-of-concept applications, e.g. gesture recognition, sleep cycle classification, or arrhythmia classification.
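As an illustration of the simplified arithmetic and of the tolerance to non-deterministic device behaviour behind the Bayesian machine concept, the sketch below emulates naive Bayesian fusion with stochastic bitstreams: each probability is encoded as a random bitstream whose fraction of ones equals that probability, and independent streams are combined with AND operations, so that every multiplication reduces to a single logic gate. This is a minimal software sketch of the principle only; the class names, sensor likelihoods, and bitstream length are arbitrary illustrative choices, and the fabricated machines store the likelihoods in memristor arrays with on-chip random bit generation rather than in NumPy arrays.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def to_bitstream(p, n_bits):
    """Encode a probability p in [0, 1] as a random bitstream with mean p."""
    return rng.random(n_bits) < p

def stochastic_product(probs, n_bits=4096):
    """Multiply probabilities by AND-ing independent bitstreams:
    the fraction of ones in the resulting stream estimates the product."""
    stream = np.ones(n_bits, dtype=bool)
    for p in probs:
        stream &= to_bitstream(p, n_bits)
    return stream.mean()

# Toy fusion: two hypothetical classes, three sensors reporting p(observation | class)
likelihoods = {
    "gesture_A": [0.9, 0.7, 0.8],
    "gesture_B": [0.4, 0.5, 0.3],
}
priors = {"gesture_A": 0.5, "gesture_B": 0.5}

scores = {c: priors[c] * stochastic_product(lk) for c, lk in likelihoods.items()}
total = sum(scores.values())
print({c: round(s / total, 3) for c, s in scores.items()})  # approximate posterior
```

Because only the average fraction of ones in a stream matters, the result degrades gracefully with the randomness of the individual bits, which is why operating the nanodevices in non-deterministic regimes remains acceptable.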
We have also proposed several approaches to provide these memories with learning features, each exploiting the device physics in a different yet highly efficient manner:
- hybrid analog/digital training of binarized networks, which exploits the analog physics of memory devices while requiring only digital CMOS circuitry (illustrated in the sketch after this list)
- Bayesian learning, which exploits the intrinsic stochastic effects of memory devices to implement a sampling operation
- Equilibrium Propagation, which incorporates the physics of devices and circuits into the learning process.
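To make the first of these approaches concrete, the toy sketch below trains a single binarized unit with a hidden real-valued state per synapse: inference only ever uses the sign of that state, i.e. a purely binary weight, while a digitally computed error signal is gradually accumulated into the hidden state, which plays the role of the analog device state in the hybrid scheme. The task, dimensions, and learning rate are illustrative assumptions, not the project's actual experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binarized unit learning a linearly separable rule defined by binary weights.
n_in, lr, epochs = 9, 0.05, 200
hidden_w = rng.normal(0, 0.1, n_in)      # hidden "analog" state per synapse
true_w = rng.choice([-1.0, 1.0], n_in)   # target binary rule to recover

X = rng.choice([-1.0, 1.0], size=(256, n_in))
y = np.sign(X @ true_w)                  # n_in is odd, so the sum is never zero

for _ in range(epochs):
    w_bin = np.sign(hidden_w)            # only binary weights are used for inference
    out = np.sign(X @ w_bin)
    err = y - out                        # purely digital error signal
    # The update computed with binary weights is accumulated into the hidden state,
    # mimicking the gradual analog adjustment of the memory device.
    hidden_w += lr * (X.T @ err) / len(X)
    hidden_w = np.clip(hidden_w, -1.0, 1.0)

accuracy = np.mean(np.sign(X @ np.sign(hidden_w)) == y)
print(f"training accuracy with purely binary weights: {accuracy:.2f}")
```

The point of the scheme is that the CMOS periphery stays fully digital: the forward pass and the error are computed with binary values, while the gradual, analog-like accumulation is delegated to the memory device itself.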
The project results were disseminated in 19 peer-reviewed journal publications (including 8 in Nature-family journals) and 18 international conference proceedings (including A+ conferences in several fields: IEDM, ESSCIRC, NeurIPS, and DATE). They were also presented in 49 invited and keynote presentations at international conferences, schools, and workshops. The results led to several more applied follow-up projects. Our Bayesian machine demonstrator attracted significant interest from the press and was covered by, e.g., Nature, IEEE Spectrum, and the French edition of Scientific American. The concept of natively intelligent memory, which is closer to brain memory than to conventional computer memory, was presented at several outreach events.
We also designed, fabricated, and tested several pioneering hybrid CMOS/nanodevice natively intelligent memory systems. Among these, our two Bayesian machines stand out as the first memristor-based systems fabricated specifically for Bayesian inference; they have demonstrated proficiency in tasks such as gesture and sleep-cycle recognition. Another major achievement is our memristor-based Bayesian neural network, which leverages memristor imperfections to model probability distributions and shows potential in applications such as arrhythmia recognition with integrated uncertainty assessment.
Finally, we explored various learning methodologies and their applicability to our designs. A remarkable achievement in this domain is the successful experimental training of 16,384 memristors for cancerous-mammogram recognition using a Bayesian learning approach that capitalizes on memristor imperfections; this marks the largest "nanoBayesian" experiment conducted to date. Additionally, we have shown that equilibrium propagation, a learning algorithm inspired by physical principles and particularly suited to natively intelligent memories, can be scaled effectively to tackle complex problems.
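As a schematic view of the Bayesian learning principle used in that experiment, the sketch below runs Metropolis sampling over the weights of a tiny logistic classifier: a random "reprogramming" proposal perturbs the weights, and the perturbation is kept or discarded depending on how well the new weights explain the data and the prior. In the hardware experiments the randomness comes from intrinsic memristor programming variability; here a pseudo-random number generator stands in for it, and the synthetic dataset and hyperparameters are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data standing in for a small classification task
n = 200
X = np.vstack([rng.normal(-1, 1, (n // 2, 2)), rng.normal(+1, 1, (n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

def log_posterior(w):
    """Bernoulli likelihood of a logistic classifier plus a Gaussian prior."""
    logits = X @ w[:2] + w[2]
    log_lik = np.sum(y * logits - np.log1p(np.exp(logits)))
    return log_lik - 0.5 * np.sum(w ** 2)

# Metropolis sampling: the random-walk proposal plays the role that
# device programming variability plays in the hardware experiments.
w, samples = np.zeros(3), []
for _ in range(5000):
    proposal = w + rng.normal(0, 0.2, size=3)   # "noisy reprogramming" of the weights
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(w):
        w = proposal                            # keep the new weight configuration
    samples.append(w.copy())

posterior = np.array(samples[1000:])            # discard burn-in samples
x_test = np.array([0.3, -0.2])
p = 1 / (1 + np.exp(-(posterior[:, :2] @ x_test + posterior[:, 2])))
print(f"p(class 1) = {p.mean():.2f} +/- {p.std():.2f}")  # prediction with uncertainty
```

Keeping an ensemble of sampled weight configurations rather than a single trained model is also what provides the uncertainty estimate reported alongside the prediction.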