Progress Report

Liberation from Biological Limitations via Physical, Cognitive and Perceptual Augmentation[2] IoB Middleware

Progress until FY2024

1. Outline of the project

Aiming to build an "AI-supported Trusted brain-machine interface (BMI)-CA,"* six teams are collaborating to develop next-generation communication technologies for individuals with communication difficulties. In FY2024, we advanced our research along three pillars and achieved significant results. The first pillar, "Advanced Decoding of Brain Information and Establishment of a Mathematical Foundation," covers speech decoding from electroencephalography (EEG), transmission of abstract concepts, and theoretical modeling of brain activity (Sasai, Hayashi, and Oizumi teams). The second, "Collaborative Interaction Technology via BMI," enables multiple people to control a robot cooperatively using their brainwaves (Arulkumaran team). The third, "Expansion of Interaction in Real and Virtual Worlds," uses remote neural synchronization, body sensors, and VR technology to overcome physical constraints (Rekimoto and Koike teams).

*AI-supported Trusted BMI-CA: a Cybernetic Avatar (CA) that applies AI-based machine learning to combinations of heterogeneous BMIs to accurately decode the words and actions that users imagine.

2. Outcome so far

2.1 Advanced Decoding of Brain Information and Establishment of a Mathematical Foundation

The Sasai team dramatically improved speech decoding accuracy using large-scale EEG data and AI, recognizing patients' unspoken speech with 54.5% accuracy (Fig. 1). The Hayashi team established the foundation for "X-Communication," a method that uses AI to transmit abstract concepts that are difficult to verbalize. The Oizumi team constructed a groundbreaking mathematical theory that quantifies the control cost of brain activity, and their findings were published in the prestigious academic journal "Physical Review X."
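For readers unfamiliar with imagined-speech decoding, the task can be framed as classifying short multi-channel EEG epochs into candidate words. The sketch below is purely illustrative: the channel count, log band-power features, nearest-centroid classifier, and synthetic data are all assumptions for a minimal runnable example, not the Sasai team's actual method, which relies on large-scale data and modern machine learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 64-channel EEG epochs, 5 candidate words.
N_CHANNELS, N_SAMPLES, N_WORDS = 64, 256, 5

def features(epoch):
    """Per-channel log power, a common minimal EEG feature."""
    return np.log(np.mean(epoch ** 2, axis=1) + 1e-12)

# Synthetic data model: each word slightly modulates channel gains.
templates = rng.normal(0.0, 0.1, (N_WORDS, N_CHANNELS))

def make_epoch(word):
    gain = 1.0 + templates[word]
    return gain[:, None] * rng.normal(0.0, 1.0, (N_CHANNELS, N_SAMPLES))

# "Training": one feature-space centroid per candidate word.
train_X = np.array([features(make_epoch(w))
                    for w in range(N_WORDS) for _ in range(40)])
train_y = np.repeat(np.arange(N_WORDS), 40)
centroids = np.array([train_X[train_y == w].mean(axis=0)
                      for w in range(N_WORDS)])

def decode(epoch):
    """Predict the imagined word as the nearest centroid."""
    f = features(epoch)
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

trials = [(w, decode(make_epoch(w)))
          for w in range(N_WORDS) for _ in range(20)]
accuracy = np.mean([true == pred for true, pred in trials])
print(f"synthetic decoding accuracy: {accuracy:.2f}")
```

On this deliberately easy synthetic task the classifier scores far above the 20% chance level; real imagined-speech EEG is much noisier, which is why the reported 54.5% on patient data is a notable result.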

Fig. 1 High-precision language decoding with large-scale data

2.2 Development of Collaborative Interaction Technology via BMI

The Arulkumaran team realized a technology in which multiple people cooperatively operate a robot through a BMI. They developed a system that allows two people to perform tasks together using their brainwaves, demonstrating a new form of collaboration between humans and robots.

2.3 Expansion of Interaction in Real and Virtual Worlds

The Rekimoto team became the first in the world to elucidate the phenomenon of remote neural synchronization, demonstrating that the brain activity of people in physically separate locations can synchronize if the communication delay is 450 milliseconds or less. They also developed "Silent Speech" technology using throat vibrations and the "GazeLLM" system, which combines gaze information with AI (Fig. 2).
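The 450 ms bound suggests a simple operational check: estimate the lag between two recorded activity traces and compare it against the threshold. The snippet below is a hedged illustration on a synthetic broadband signal; the sampling rate, signal model, and cross-correlation lag estimator are assumptions for the example, not the Rekimoto team's analysis pipeline.

```python
import numpy as np

FS = 100                 # sampling rate in Hz (assumed)
SYNC_LIMIT_S = 0.450     # delay bound reported for remote neural synchronization

rng = np.random.default_rng(1)
base = rng.normal(size=10 * FS)              # broadband surrogate activity trace
delay_samples = 30                           # simulate a 300 ms transmission delay
a = base
b = np.roll(base, delay_samples) + 0.3 * rng.normal(size=base.size)

def estimated_lag_s(x, y, fs):
    """Lag of y relative to x in seconds, from the full cross-correlation peak."""
    xc = np.correlate(y - y.mean(), x - x.mean(), mode="full")
    return (int(np.argmax(xc)) - (len(x) - 1)) / fs

lag = estimated_lag_s(a, b, FS)
print(f"estimated lag: {lag * 1000:.0f} ms; "
      f"within synchronization range: {abs(lag) <= SYNC_LIMIT_S}")
```

Here the 300 ms simulated delay is recovered and falls inside the 450 ms window; a delay above the threshold would fail the same check.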

Fig. 2 GazeLLM: Generating task descriptions using only the pixels around the line of sight

The Koike team developed an innovative system that estimates 3D posture from foot pressure sensors, identifying dangerous postures during luggage transport with high accuracy and issuing real-time warnings. Furthermore, they developed a new method that streamlines the acquisition of motor skills, such as a golf swing, using spatiotemporal distortion technology in a VR space (Fig. 3).
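To illustrate the sensing-to-warning idea, consider a pressure-sensing insole whose centre of pressure (CoP) shifts toward the toes when a worker leans forward while lifting. The grid size, CoP rule, and threshold below are hypothetical; the team's actual system estimates full 3D posture with learned models rather than a hand-made cutoff.

```python
import numpy as np

# Hypothetical insole: an 8x4 grid of pressure cells (rows run heel -> toe).
GRID_ROWS, GRID_COLS = 8, 4

def center_of_pressure(grid):
    """Pressure-weighted mean position along the heel-toe axis, scaled to 0..1."""
    total = grid.sum()
    if total <= 0:
        return 0.5
    rows = np.arange(GRID_ROWS)[:, None]
    return float((grid * rows).sum() / total / (GRID_ROWS - 1))

def warn_if_leaning(grid, threshold=0.75):
    """Flag a posture when the CoP is far toward the toes (hypothetical cutoff)."""
    return center_of_pressure(grid) > threshold

upright = np.ones((GRID_ROWS, GRID_COLS))   # load spread evenly over the foot
leaning = np.zeros((GRID_ROWS, GRID_COLS))
leaning[6:, :] = 5.0                        # load concentrated on the toe rows

print(warn_if_leaning(upright))   # False: CoP sits near mid-foot (0.5)
print(warn_if_leaning(leaning))   # True: CoP shifted to ~0.93
```

In a real-time system this check would run on each sensor frame, with the warning delivered before a risky lift is completed.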

Fig. 3 A VR environment for improving golf swing efficiency

3. Future plans

These research results have the potential to improve the quality of life of people with speech or motor disabilities. However, many challenges remain before practical implementation. For the EEG-based speech decoding system, the current 54.5% recognition accuracy for patients must be improved further, and individual differences and usability in daily life must be addressed. The collaborative work system, the posture warning system using foot pressure sensors, and the remote neural synchronization technology are still at the laboratory demonstration stage, and verifying their effectiveness in actual work environments remains a future challenge. Privacy and ethical considerations are also important.
In FY2025, we will work on solving these technical challenges while carefully advancing safety and ethical considerations. We also plan to begin basic verification toward practical implementation through collaboration with medical institutions and companies. We will proceed with research and development so that these technologies can be accepted by society and delivered to those who need them.