AI Helps Paralysed Woman Speak Via Digital Avatar

A woman who was severely paralysed for 18 years has been able to speak with the help of artificial intelligence (AI) through a digital avatar. The technology translated her brain signals into speech and facial expressions, allowing her to communicate.

Until now, people with severe paralysis have relied on slow speech synthesisers that require spelling out words using eye tracking or small facial movements, making natural conversation impossible.

In the new technology, a paper-thin rectangle containing 253 tiny electrodes is implanted on the surface of the patient's brain. The implant detects electrical activity in the region of the brain that controls speech and facial movements. These signals are translated directly into a digital avatar's speech and facial expressions such as smiling, frowning, or surprise.

After the device was implanted, the 47-year-old patient worked with the team to train the system's AI algorithm to recognise her unique brain signals for various speech sounds by repeating different phrases. The computer learned to distinguish 39 distinct sounds, and a ChatGPT-style language model was used to assemble sentences from those signals. The avatar was then trained to sound like her, based on a recording of her voice made before her injury.
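To make the idea concrete, here is a minimal, purely illustrative sketch of one common approach to this kind of decoding: classifying brain activity into one phoneme-like unit per time frame, then collapsing repeated and silent frames into a clean sound sequence before a language model assembles words. All names, data, and probabilities below are invented for illustration; the actual system's architecture is not described in this article.

```python
# Illustrative sketch only: decoding per-frame phoneme probabilities
# into a sound sequence. The phoneme set, frame data, and collapsing
# step are assumptions, not details from the study.

PHONEMES = ["_", "HH", "AH", "L", "OW"]  # "_" marks a blank/no-speech frame

def collapse(frames):
    """Collapse repeated frame labels and drop blanks (a CTC-style step)."""
    out, prev = [], None
    for label in frames:
        if label != prev and label != "_":
            out.append(label)
        prev = label
    return out

def decode(frame_probs):
    """Pick the most likely phoneme in each frame, then collapse."""
    frames = [max(p, key=p.get) for p in frame_probs]
    return collapse(frames)

# Simulated per-frame probabilities for the word "hello"
frames = [
    {"HH": 0.9, "_": 0.1},
    {"HH": 0.8, "_": 0.2},
    {"AH": 0.7, "_": 0.3},
    {"L": 0.9, "_": 0.1},
    {"_": 0.9, "L": 0.1},
    {"OW": 0.95, "_": 0.05},
]
print(decode(frames))  # ['HH', 'AH', 'L', 'OW']
```

In a full system, a language model would then score candidate word sequences for these sounds, which is how sentence-level text emerges from noisy frame-by-frame classifications.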

In a test run involving more than 500 phrases, words were decoded incorrectly 28% of the time. The system converted brain signals to text at a rate of 78 words per minute, compared with the 110-150 words per minute typical of natural conversation.