By observing one another’s behaviour, humans mentally simulate the likely outcomes of a sequence of actions, enabling long-term planning. If computers could ‘read’ this planning sequence from brain activity, knowing what humans intend to do next would enable them to plan ahead as well, making interactions with us less reflexive and more natural. “Think of team sports where players work together intuitively, responding to the subtle cues of colleagues and planning their own actions accordingly,” says Florentin Wörgötter of the University of Göttingen, the project host. “We knew that planning intentions can be traced to brain signals, so we wanted to link devices to those signals.” The EU-supported Plan4Act project succeeded in developing an integrated system that decodes neural signals and uses them to generate control signals for operating smart devices. Key to the system’s success was the creation of a class of neural network-based decoders for predictive neural signals, embedded in a field-programmable gate array (FPGA), an edge computing device. Additionally, the team developed a method that simplifies FPGA programming, allowing decoder variants to be implemented ‘on the fly’.
Harnessing brain signals with an artificial neural network
The team started by collating information about predictive coding in different brain areas of monkeys. As signals in the brain consist of a sequence of electrical pulses, they had to be decoded to make them compatible with the type of signals (such as voltages) required by target devices. The understanding was that, as the monkey brain is analogous to that of humans, the findings would be transferable to the neural pulse sequences which encode the external world for humans. The predictive brain signals were first recorded, then decoded, enabling a control device to predict, and so produce, the required responses before being prompted, rather than only after a prompt, as in conventional ‘on-time’ systems. While Wörgötter reports that the underlying process itself was relatively straightforward, building a system that integrates the various components proved challenging. Additionally, a lot of effort went into identifying a powerful method for decoding the neural signals. “A simple perceptron with very few layers was sufficient for decoding purposes. This neural network was able to translate the brain signals into control signals for devices with a high degree of reliability. More complex networks did not improve on this,” remarks Wörgötter.

The team tested a full installation, using signals from a monkey in the German Primate Center to operate a food dispenser in a smart house in Madrid. This smart device was controlled by a sequence of up to three physical actions. “By decoding the monkey’s predictive brain activity, the device could ‘predict’ the final steps of the monkey’s actions while still responding to the first step in the monkey’s plan,” explains Wörgötter.
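To make the decoding idea concrete, the shallow-perceptron approach described above can be sketched as follows. This is an illustrative toy, not the project’s actual code: the channel counts, layer sizes, number of actions and the use of randomly initialised weights (standing in for weights trained on recorded, labelled neural data) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 32   # hypothetical number of recording electrodes
N_HIDDEN = 16     # a single small hidden layer ("very few layers")
N_ACTIONS = 3     # e.g. a sequence of up to three physical actions

# Randomly initialised weights stand in for parameters that would be
# learned from recorded neural activity paired with observed actions.
W1 = rng.normal(0.0, 0.1, (N_CHANNELS, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, N_ACTIONS))
b2 = np.zeros(N_ACTIONS)

def decode(spike_counts: np.ndarray) -> int:
    """Forward pass: one time bin of per-channel spike counts
    -> index of the predicted (planned) device action."""
    h = np.maximum(0.0, spike_counts @ W1 + b1)  # ReLU hidden layer
    logits = h @ W2 + b2                         # one score per action
    return int(np.argmax(logits))                # most likely action

# Usage: decode a simulated bin of spike counts into a control command.
x = rng.poisson(5.0, N_CHANNELS).astype(float)
action = decode(x)
print(action)
```

In a deployed system this forward pass would run continuously on the FPGA, emitting a control signal (e.g. a voltage level selecting a dispenser action) as soon as the planned action is predicted, ahead of the movement itself.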
Towards minimally invasive technology
The ability to control devices predictively, with little or no physical effort, would likely prove attractive to people with disabilities. However, this is still far from becoming a reality as the technology currently relies on invasive procedures, such as surgically implanted electrodes. “Obviously, for humans this is ethically problematic, so first we will have to develop novel minimally invasive electrodes. It could be 10 to 20 years before reaching clinical or home use,” says Wörgötter. Currently, the team is working on a higher degree of integration of the different system components while also tweaking proactive machine response prompts, such as predictive movement patterns.
Plan4Act, monkey, brain, neural signals, robot, machine, electrode, human, neural network, perceptron