AI Mind-Reading To Help Paralysed People Speak


Facebook-funded university project uses AI to read the minds of paralysed patients and allow them to communicate

There is hope that paralysed patients may be able to better communicate in the future after a breakthrough in a Facebook-funded research project.

Doctors at the University of California, San Francisco, who carried out the project, have apparently developed machine-learning algorithms that are essentially capable of turning brain activity into speech.

Earlier this month Tesla boss Elon Musk outlined plans to connect human brains directly to computers, using a tiny chip implanted in the brain.


Computer interface

His Neuralink company said it planned to connect the human brain directly to a computer, allowing the user to control that computer.

The idea is to help patients with severe neurological conditions, with the computer analysing recordings of brain activity using machine learning.

And the Facebook project is developing something similar.

Essentially, doctors at the University of California worked with epilepsy patients who had already had recording electrodes placed on their brains to assess the origins of their seizures, ahead of surgery.

They then, according to the Guardian newspaper, turned the brain signals for speech into written sentences.

The breakthrough part of this is that they were the first to demonstrate how the intention to say specific words can be extracted from brain activity and converted into text rapidly enough to keep pace with natural conversation.

At the moment, however, the brain-reading software reportedly works only for certain sentences it has been trained on, but scientists believe it is a stepping stone towards a more powerful system that can decode in real time the words a person intends to say.

“To date there is no speech prosthetic system that allows users to have interactions on the rapid timescale of a human conversation,” the Guardian quoted Edward Chang as saying.

Chang is a neurosurgeon and lead researcher on the study published in the journal Nature.

Epilepsy patients

The study essentially took three epilepsy patients who were about to have neurosurgery for their condition. Before their operations went ahead, all three had a small patch of tiny electrodes placed directly on the brain for at least a week to map the origins of their seizures.

During their stay in hospital, the patients, all of whom could speak normally, agreed to take part in Chang’s research. He reportedly used the electrodes to record brain activity while each patient was asked nine set questions and asked to read a list of 24 potential responses.

With the recordings in hand, Chang and his team built computer models that learned to match particular patterns of brain activity to the questions the patients heard and the answers they spoke.

Once trained, the software could identify almost instantly, and from brain signals alone, what question a patient heard and what response they gave, with an accuracy of 76 percent and 61 percent respectively.
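To illustrate the general idea of matching brain-activity patterns to a fixed set of responses, the toy sketch below trains a nearest-centroid classifier on synthetic feature vectors. This is purely illustrative: the answer labels, the `synthetic_recording` helper, and the classifier itself are assumptions for demonstration, not the actual models or data used in Chang’s study.

```python
import random

random.seed(0)

# Hypothetical stand-ins for a few of the 24 set responses in the study.
ANSWERS = ["yes", "no", "cold", "hot"]

def synthetic_recording(label, noise=0.3):
    """Simulate a brain-activity feature vector for a given answer (fake data)."""
    base = [1.0 if a == label else 0.0 for a in ANSWERS]
    return [v + random.gauss(0, noise) for v in base]

def train(samples):
    """Average the feature vectors for each label (a nearest-centroid model)."""
    centroids = {}
    for label in ANSWERS:
        vecs = [x for x, y in samples if y == label]
        centroids[label] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return centroids

def decode(centroids, recording):
    """Return the label whose centroid is closest to the recording."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], recording))

# Build a training set of noisy synthetic recordings, then decode a new one.
training = [(synthetic_recording(lbl), lbl) for lbl in ANSWERS for _ in range(50)]
model = train(training)
print(decode(model, synthetic_recording("cold")))
```

A real decoder would of course work on high-dimensional electrode recordings and far more sophisticated models, but the shape of the task is the same: learn a pattern per response during training, then pick the closest match for each new recording.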

Facebook back in 2017 said it planned to connect human brains to computers in the near future.

The firm back then highlighted how brain interface systems currently used in medical cases allow people to type around eight words per minute, but that such a system requires electrodes implanted in the brain, and “that simply won’t scale”.

Instead Facebook was seeking to develop non-invasive sensors that can read the brain through hair, skin and skull.


Author: Tom Jowitt