Researchers at Chiba University in Japan have developed a deep learning model that significantly improves the interpretation of brain signals recorded via electroencephalography (EEG). The Topology-Aware Multiscale Feature Fusion (TA-MFF) network marks a major step forward in decoding so-called motor imagery signals: the brain activity that occurs when someone visualises or imagines a movement without actually performing it.
EEG is a non-invasive method that measures the electrical activity of the brain via electrodes placed on the scalp. The technique is widely used, from neurological diagnostics to research into brain-computer interfaces (BCI) and robotic prostheses. Decoding these signals remains a challenge, however: brain activity is complex, sensitive to noise and variable over time.
Smarter AI for brain signals
To overcome these obstacles, PhD student Chaowen Shen and Professor Akio Namiki developed a deep learning network that takes the topological structure of brain signals into account. Whereas existing AI models mainly recognise spatial and temporal patterns, TA-MFF goes further by also analysing spectral (frequency-related) and topological relationships between electrodes.
The system consists of several modules that work together:
- The Spectral-Topological Data Analysis (S-TDA-P) module uses persistent homology – a mathematical method for recognising patterns in complex datasets – to find deeper connections between brain areas.
- The Inter-Spectral Recursive Attention (ISRA) module identifies important frequency patterns and filters out irrelevant information.
- The Spectral-Topological and Spatiotemporal Feature Fusion (SS-FF) module combines these insights into a single integrated brain profile.
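The paper's actual implementation is not reproduced here, but the general idea behind the three modules can be illustrated with a toy sketch. The NumPy code below (not the authors' code; every function name, feature choice and shape is a simplified assumption) builds a crude spectral feature vector (band powers per electrode), a crude topological summary (0-dimensional persistence lifetimes obtained by single-linkage merging of electrode-correlation distances, a much-simplified stand-in for persistent homology) and a trivial temporal feature, then concatenates them into one fused profile:

```python
import numpy as np

def band_powers(sig, fs, bands=((4, 8), (8, 13), (13, 30))):
    """Toy spectral features: mean power per channel in a few EEG bands."""
    freqs = np.fft.rfftfreq(sig.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(sig, axis=-1)) ** 2
    return np.stack([psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
                     for lo, hi in bands], axis=-1)   # (channels, bands)

def zero_dim_persistence(dist):
    """Crude stand-in for persistent homology: 0-dimensional 'death' scales,
    found by single-linkage merging of channels (union-find over sorted edges)."""
    n = dist.shape[0]
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    edges = sorted((dist[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)          # one connected component dies at scale d
    return np.array(deaths)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 256))             # 8 channels, 1 s at 256 Hz
spectral = band_powers(eeg, fs=256).ravel()     # 8 channels x 3 bands = 24 values
corr = np.corrcoef(eeg)
topo = zero_dim_persistence(1.0 - corr)         # correlation distance, 7 lifetimes
temporal = eeg.var(axis=-1)                     # trivial temporal feature, 8 values
fused = np.concatenate([spectral, topo, temporal])   # one fused brain profile
```

The real network learns these representations end to end and weights the frequency bands with attention rather than treating them equally; the sketch only shows why combining spectral, topological and spatiotemporal views yields a richer profile than any single view.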
The research was recently published in Knowledge-Based Systems.
Application in healthcare and rehabilitation
Initial test results show that TA-MFF performs significantly better than existing methods in decoding motor imagery EEG signals. In the long term, the technology could be used in neuroprostheses, wheelchairs, or computer interfaces that users with physical disabilities operate with their thoughts.
‘Our approach makes it possible to interpret brain signals more accurately, which opens the door to more natural interaction between humans and technology,’ says Professor Namiki. ‘This can help people with limited mobility to live more independently.’
The researchers see the TA-MFF model as an important step towards smarter and more reliable brain-computer interfaces that can better recognise not only movements, but also emotions and intentions. This innovation brings the future of brain-controlled technology considerably closer.
AI and EEG
Back in 2019, a neurologist at MST stated that AI technology could make neurological diagnoses faster, more accurate and more efficient. With the help of AI, EEG measurements can be analysed automatically, saving time and producing results that are at least as reliable as those of human experts.
The neurologist also conducted research into how long-term EEG measurements can help in the treatment of epilepsy. Together with the Danish company UNEEG Medical, the neurologist worked on introducing a mini-electrode that measures brain activity for months at a time.