Periodic Reporting for period 3 - k-NET (k-space Neural computation with magnEtic exciTations)
Reporting period: 2023-01-01 to 2024-06-30
To date, the prevailing methodologies for implementing deep neural networks (NN) have relied predominantly on digital architectures, using microprocessors (digital-NN) for computational emulation. Although exploratory research into analog hardware implementations, including memristive systems, integrated photonics, and spin-torque oscillators, has shown promise, these approaches remain constrained by the complexities of nanoscale integration. The most significant of these challenges is the connectivity bottleneck, arising from the rapid growth in the number of interconnects needed to replicate biological synaptic densities. Additional hurdles include signal degradation, thermal dissipation, and limitations in fabrication scalability. Overcoming these obstacles requires the development of advanced architectures that enhance interconnectivity and computational efficiency while remaining compatible with the constraints of current and future nanofabrication technologies.
With k-NET, we introduce a neural network architecture fundamentally distinct from existing models. In contrast to traditional frameworks, in which neurons and their interconnections (synapses) are configured in real space, k-NET leverages the high-dimensional reciprocal space (k-space): the nonlinear excitation spectrum of, for example, a magnetic microstructure is used to define and structure the neural elements and their interactions. This paradigm has the potential to encode intricate interconnectivity topologies within a compact, scalable framework, effectively circumventing the physical limitations inherent to real-space architectures. The k-NET approach also facilitates dynamic reconfigurability and topological optimization, significantly enhancing the system's adaptability and resilience when tackling complex computational tasks. As a result, k-NET opens a path toward substantial improvements in energy efficiency while establishing itself as a scalable and robust architecture capable of meeting the computational demands of next-generation AI applications.
Theoretical Advancements: A computational tool was created to analyze nonlinear magnetization dynamics in reciprocal space. The tool predicts mode interactions in nonlinear regimes by evaluating their coupling coefficients and supports the modeling of neural computation in k-space. Simulations validated the theoretical predictions and aided in labeling the magnetic modes.
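To illustrate the basic operation behind such a tool, the sketch below decomposes a magnetization map m(x, t) into k-space mode amplitudes and identifies each mode's frequency. It is a minimal, self-contained example with a synthetic signal and assumed values, not the project's actual code.

```python
# Minimal sketch (not the project's actual tool): decompose a magnetization
# map m(x, t) into k-space mode amplitudes a_k(t) and identify each mode's
# oscillation frequency, the basic step behind mode labeling in reciprocal
# space. The signal and all parameter values are illustrative.
import numpy as np

nx, nt = 256, 4096              # spatial cells, time samples
dx, dt = 20e-9, 50e-12          # 20 nm cell, 50 ps step (assumed values)
x = np.arange(nx) * dx
t = np.arange(nt) * dt
L = nx * dx

# Stand-in dynamics: two spatial modes (k indices 3 and 7) oscillating at 2.1 and 3.4 GHz.
m = (0.010 * np.cos(2 * np.pi * 3 * x[:, None] / L) * np.cos(2 * np.pi * 2.1e9 * t)
     + 0.005 * np.cos(2 * np.pi * 7 * x[:, None] / L) * np.cos(2 * np.pi * 3.4e9 * t))

a_k = np.fft.rfft(m, axis=0) / nx                # spatial FFT -> mode amplitudes a_k(t)
spectrum = np.abs(np.fft.fft(a_k, axis=1)) ** 2  # temporal FFT -> power vs (k, frequency)
freqs = np.fft.fftfreq(nt, dt)

# Label each k index by its dominant oscillation frequency.
dominant = np.abs(freqs[np.argmax(spectrum, axis=1)])
for k_idx in (3, 7):
    print(f"k index {k_idx}: dominant frequency {dominant[k_idx] / 1e9:.2f} GHz")
```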
Experimental Progress: Ultra-low-damping garnet films were integrated into radiofrequency devices to study spin-wave (SW) modes. Nonlinear regimes were reached using harmonic and parametric excitation, and mode mapping enabled the selective population of specific modes via time-gated RF signals.
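As a simple illustration of the two drive schemes named above, the sketch below builds a time-gated resonant (harmonic) tone at an assumed mode frequency and a parametric pump near twice that frequency; all values are illustrative and are not the experimental parameters.

```python
# Minimal sketch: a harmonic drive at f_mode and a parametric pump near
# 2*f_mode, both confined to a rectangular time gate so that a chosen mode is
# fed only during the gate window. All values are illustrative assumptions.
import numpy as np

fs = 40e9                               # assumed waveform-generator sample rate
f_mode = 2.1e9                          # assumed SW mode frequency
t = np.arange(int(1e-6 * fs)) / fs      # 1 us record
gate = ((t > 0.2e-6) & (t < 0.8e-6)).astype(float)

harmonic_drive = gate * np.sin(2 * np.pi * f_mode * t)        # resonant drive at f_mode
parametric_pump = gate * np.sin(2 * np.pi * 2 * f_mode * t)   # pump near 2*f_mode
```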
A transverse parametric pumping scheme enabled the generation of non-degenerate parametric processes. Key outcomes included: (i) addressing single modes with multiple independent pumping frequencies, allowing controlled manipulation of their nonlinear properties and demonstrating behaviors such as non-commutativity and bistability; (ii) selectively exciting mode pairs with a single frequency, establishing 1-to-1 connections that resemble artificial synapses.
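For reference, these two pumping regimes follow the standard frequency-matching conditions of parametric excitation; the relations below are textbook material in generic notation (omega_p is the pump frequency; omega_k, omega_1 and omega_2 are SW mode frequencies), not a project-specific result.

```latex
% Standard parametric-excitation frequency-matching conditions (generic notation)
\[
  \text{degenerate (parallel) pumping:}\qquad \omega_p \simeq 2\,\omega_k ,
\]
\[
  \text{non-degenerate (transverse) pumping:}\qquad \omega_p \simeq \omega_1 + \omega_2 ,
  \quad \omega_1 \neq \omega_2 .
\]
```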
Parametric excitation of single modes by parallel pumping with RF fields near twice their natural frequencies was modeled with the Landau-Lifshitz-Gilbert equation. Analytical solutions for the excitation thresholds and steady-state amplitudes were derived, highlighting their dependence on the RF field amplitude and frequency. Coupled-mode equations for two-mode excitation were validated experimentally, confirming the predicted nonlinear coupling dynamics.
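To make the threshold behavior concrete, the sketch below integrates a textbook single-mode parametric equation in the frame rotating at half the pump frequency. The equation form is standard, but the parameter values and the nonlinear-damping term used to saturate the growth are illustrative assumptions, not the project's actual model.

```python
# Minimal sketch of a standard single-mode parametric model in the frame
# rotating at omega_p / 2:
#   db/dt = (i*Delta - Gamma - eta*|b|^2) * b + i*V*h*conj(b)
# with relaxation rate Gamma, detuning Delta = omega_p/2 - omega_k, pump
# coupling V*h and a phenomenological nonlinear damping eta for saturation.
# All parameter values are assumptions for illustration.
import numpy as np
from scipy.integrate import solve_ivp

Gamma = 2 * np.pi * 1e6                  # relaxation rate (rad/s), assumed
Delta = 0.0                              # detuning (rad/s)
eta = 1e9                                # nonlinear damping coefficient, assumed
Vh_values = [0.5 * Gamma, 2.0 * Gamma]   # below / above the threshold V*h = sqrt(Gamma^2 + Delta^2)

def rhs(t, y, Vh):
    b = y[0]
    return [(1j * Delta - Gamma - eta * abs(b) ** 2) * b + 1j * Vh * np.conj(b)]

for Vh in Vh_values:
    sol = solve_ivp(rhs, (0.0, 20e-6), [1e-4 + 0j], args=(Vh,), max_step=1e-8)
    print(f"V*h / Gamma = {Vh / Gamma:.1f}: final |b| = {abs(sol.y[0, -1]):.3e}")

# Below threshold the mode decays back to zero; above it, |b| saturates near
# |b|^2 = (sqrt((V*h)^2 - Delta^2) - Gamma) / eta.
```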
Characterization of the k-NET device demonstrated the effective excitation of multiple modes via frequency multiplexing. Double-tone spectroscopy identified nonlinear coupling and non-commutative behaviors influenced by pulse sequencing. The device functioned as a recurrent neural network (RNN).
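One common way such a recurrently evolving physical system is used for computation is to treat the measured mode amplitudes as the RNN state and train only a linear readout (a reservoir-computing-style recipe). The sketch below illustrates that idea with a random numerical stand-in for the device; it is an assumption made for illustration, not necessarily the k-NET training protocol.

```python
# Minimal sketch: time-evolving "mode amplitudes" serve as the RNN state and
# only a linear (ridge-regression) readout is trained. The recurrent state
# here is a random numerical stand-in, not data from the device.
import numpy as np

rng = np.random.default_rng(0)
n_modes, n_steps, n_train = 20, 600, 450

u = rng.uniform(-1, 1, n_steps)                                   # scalar input sequence
W = 0.9 * rng.normal(size=(n_modes, n_modes)) / np.sqrt(n_modes)  # recurrent coupling
w_in = rng.normal(size=n_modes)                                   # input coupling

X = np.zeros((n_steps, n_modes))                                  # "mode amplitude" states
for k in range(1, n_steps):
    X[k] = np.tanh(W @ X[k - 1] + w_in * u[k])

# Benchmark-style target: a delayed nonlinear function of the input.
y = np.concatenate([np.zeros(3), u[:-3] ** 2])

# Ridge-regression readout trained on the first n_train steps.
lam = 1e-3
A, b = X[:n_train], y[:n_train]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(n_modes), A.T @ b)
pred = X[n_train:] @ w_out
print("test NMSE:", np.mean((pred - y[n_train:]) ** 2) / np.var(y[n_train:]))
```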
To realize this vision, the consortium successfully achieved four key objectives:
1. Creation of Neurons: Defined and engineered the necessary physical properties of the magnetic system to establish effective SW modes as neurons.
2. Control of Synapses: Achieved precise control over the nonlinear interactions of SWs through external stimuli, enabling dynamic tuning of synaptic connections.
3. System Training: Developed robust protocols to train the neural network, ensuring it performed the desired computational functions.
4. Technology Validation: Constructed a working proof-of-concept device that is all-electrical, highly integrated, and suitable for conceptual task testing.
Unlike conventional neural networks, the primary challenge in k-NET was not the construction of the physical system or its interconnections, but the precise addressing and control of the ensemble of SW modes. This approach proved highly flexible, as the programmed functions (i.e. synaptic weights) were encoded through the temporal and frequency modulation of RF control signals, rendering the process hardware-independent. This methodology simplifies fabrication, minimizes device-to-device variations—a common challenge in nano-electronic circuits—and enables a single magnetic element to perform multiple functions.
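To make the hardware-independent programming point concrete, the sketch below maps a hypothetical weight vector onto the amplitudes and gate timings of frequency-multiplexed pump tones. The mapping, frequencies, and values are illustrative assumptions, not the project's actual encoding scheme.

```python
# Minimal sketch: "programming" the network means reshaping the RF control
# waveform, not modifying the hardware. A hypothetical weight vector sets the
# amplitude and gate timing of one pump tone per "synapse". All values are
# illustrative assumptions.
import numpy as np

fs = 40e9                                        # assumed waveform-generator sample rate
t = np.arange(int(2e-6 * fs)) / fs
pump_freqs = np.array([3.8e9, 4.2e9, 4.6e9])     # one pump tone per synapse (illustrative)
weights = np.array([0.2, 0.9, 0.5])              # hypothetical synaptic weights in [0, 1]
gate_starts = np.array([0.1e-6, 0.4e-6, 0.7e-6]) # per-tone gate timing
gate_len = 0.5e-6

waveform = np.zeros_like(t)
for w, f, t0 in zip(weights, pump_freqs, gate_starts):
    gate = (t >= t0) & (t < t0 + gate_len)
    waveform += w * gate * np.sin(2 * np.pi * f * t)   # amplitude encodes the weight

# Reprogramming the network = regenerating this waveform with new weights,
# frequencies, or timings; the magnetic element itself is left unchanged.
```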
The successful realization of k-NET marks a significant advancement in neuromorphic computing (NC). It establishes the foundation for k-space-based neuromorphic computing (k-NC), a field that had not been envisioned prior to this project. Real-life applications for autonomous, integrated, CMOS-compatible, and highly efficient systems can now be envisioned, paving the way for innovative technological advancements.