Periodic Reporting for period 3 - NeuraViPeR (Neural Active Visual Prosthetics for Restoring Function)
Reporting period: 2023-09-01 to 2025-02-28
In the project, we developed innovative approaches to high-electrode-count stimulation interfaces with the visual cortex. The work includes thin, flexible probes that cause minimal tissue damage; new electrode coatings designed to remain stable under long-term repeated electrical stimulation; and novel microchip methods for routing stimulation currents online to many thousands of electrodes. It also combines stimulation with the monitoring of neuronal activity in higher cortical areas. In addition, we developed new deep learning algorithms that transform camera footage into stimulation patterns for the cortex and that use feedback from recorded brain states and eye tracking to improve perception in a closed-loop approach. The algorithms were mapped onto a low-latency, power-efficient hardware platform to create a future neuroprosthesis system that is robust and portable.
We designed, simulated and fabricated a CMOS ASIC for neural recording and stimulation, featuring 8 stimulation units, 128 output stages and 64 recording channels. The stimulation units offer flexible programming of pulse amplitude, frequency, duration and polarity. The recording channels provide low-noise, low-power amplification, filtering and digitization of multi-unit neural activity. To support in vivo experiments with the flexible probes, we developed a modular data acquisition system with stackable, lightweight headstages and a base station unit (Viper Box) that supports simultaneous stimulation on more than 1000 channels.
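The programmable pulse parameters described above can be illustrated with a small sketch. The field names, units and the charge-balanced biphasic pulse shape are illustrative assumptions, not the ASIC's actual register map:

```python
from dataclasses import dataclass

@dataclass
class StimPulseConfig:
    # Hypothetical parameter set mirroring the ASIC's programmable fields
    amplitude_ua: float    # pulse amplitude in microamps
    frequency_hz: float    # pulse-train repetition rate
    phase_width_us: float  # duration of each phase in microseconds
    cathodic_first: bool   # polarity of the leading phase

    def biphasic_phases(self):
        """Return the (current, duration) pairs of one charge-balanced
        biphasic pulse: two equal-and-opposite phases."""
        lead = -self.amplitude_ua if self.cathodic_first else self.amplitude_ua
        return [(lead, self.phase_width_us), (-lead, self.phase_width_us)]

cfg = StimPulseConfig(amplitude_ua=10.0, frequency_hz=50.0,
                      phase_width_us=170.0, cathodic_first=True)
phases = cfg.biphasic_phases()
# Net injected charge over the two phases is zero (charge balance),
# a standard safety requirement for chronic microstimulation.
net_charge = sum(current * duration for current, duration in phases)
```

Charge balance is what makes such a pulse safe for long-term repeated stimulation, which is why amplitude, duration and polarity are exposed as coupled parameters rather than set independently per phase.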
We completed the design of VPDNN, a neural network FPGA accelerator that supports the required features of the phosphene stimulation network from SKU. Weight pruning and bit quantization of the network parameters were carried out to match the bit precision supported by the hardware. The system takes inputs from a camera, and the network output was transmitted successfully to the recording and stimulation CMOS ASIC. Additional features were added to support the closed-loop setup with the NeuraViPeR probes in an animal system.
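The two compression steps mentioned here, pruning and bit quantization, can be sketched generically. This is a minimal illustration assuming magnitude-based pruning and symmetric uniform quantization; the actual VPDNN tool flow, sparsity targets and bit widths are not specified in this report:

```python
import numpy as np

def prune_and_quantize(weights, sparsity=0.5, n_bits=8):
    """Magnitude-based pruning followed by symmetric uniform quantization,
    a generic sketch of compressing weights to fixed-point hardware precision."""
    w = weights.copy()
    # 1. Prune: zero out the smallest-magnitude weights.
    threshold = np.quantile(np.abs(w), sparsity)
    w[np.abs(w) < threshold] = 0.0
    # 2. Quantize: map surviving weights onto n_bits signed integer levels.
    scale = np.max(np.abs(w)) / (2 ** (n_bits - 1) - 1)
    if scale == 0:
        return w
    q = np.round(w / scale)
    # Return dequantized values, i.e. what the fixed-point hardware represents.
    return q * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
wq = prune_and_quantize(w, sparsity=0.5, n_bits=8)
```

Pruning reduces the number of multiply-accumulates the accelerator must perform, while quantization matches the weights to the narrow fixed-point datapaths that make the FPGA implementation low-latency and power-efficient.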
To validate the NeuraViPeR probe technology, mice were implanted with flexible polyimide probes in the primary visual cortex (area V1). Microstimulation through the electrodes successfully evoked a behavioral response in mice trained on a go/no-go stimulation detection task. We monitored the stimulation threshold for up to 13 months, revealing a stable perceptual threshold. To test the direct efficacy of microstimulation on neuronal recruitment, we implanted mice with the multi-shank NeuraViPeR probe and combined microstimulation with two-photon imaging.
A reinforcement learning (RL) based framework was developed to optimize phosphene generation for navigation tasks while maintaining performance under dynamic perturbations. An in-silico RL-based prosthesis controller for neural system restoration was designed using recurrent neural networks. A biologically plausible phosphene simulator, modeling perceptual characteristics, was used to optimize stimulation parameters for naturalistic vision. A high-dimensional Bayesian optimization framework enabled patient-specific multi-electrode stimulation based on direct feedback. Gaze-contingent processing and eye-movement integration were explored for neuroprosthetic vision in VR-based tasks. Two biologically inspired RL algorithms improved training efficiency using predictive processing and input decorrelation.
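The Bayesian-optimization idea above can be sketched as a closed loop: propose stimulation parameter vectors, collect a scalar feedback score, fit a surrogate model, and pick the next setting via an acquisition function. The Gaussian-process surrogate, RBF kernel, UCB acquisition and the synthetic feedback function below are generic stand-ins, not the project's actual framework:

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two sets of points
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Gaussian-process posterior mean and variance at candidates Xs
    K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ y
    var = np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ K_inv @ Ks), 1e-12, None)
    return mu, var

def feedback(x):
    # Hypothetical stand-in for patient feedback: higher is better,
    # peaked at an unknown optimal parameter setting (here 0.6).
    return -np.sum((x - 0.6) ** 2)

rng = np.random.default_rng(1)
dim = 4                              # e.g. four stimulation parameters
X = rng.uniform(0, 1, (5, dim))      # initial random probes
y = np.array([feedback(x) for x in X])

for _ in range(20):
    cand = rng.uniform(0, 1, (256, dim))   # random candidate settings
    mu, var = gp_posterior(X, y, cand)
    ucb = mu + 2.0 * np.sqrt(var)          # upper-confidence-bound acquisition
    x_next = cand[np.argmax(ucb)]          # explore/exploit trade-off
    X = np.vstack([X, x_next])
    y = np.append(y, feedback(x_next))

best = X[np.argmax(y)]
```

The appeal of this loop for prosthetic tuning is sample efficiency: each evaluation costs a stimulation trial and a patient report, so the surrogate model must squeeze as much information as possible out of few trials, which becomes harder as the number of electrodes (the dimensionality) grows.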
We conducted simultaneous recording and stimulation experiments over a 6-month period in the primary visual cortex of three blind subjects. We tested advanced stimulation protocols and refined methods to improve phosphene perception, adaptability, and usability. By incorporating real-time adjustments based on brain responses and patient feedback, we optimized stimulation settings for clearer, more stable perceptions. To ensure practical use, we tailored tasks to each subject's interests and integrated eye-tracking technology to enhance usability.