How A Brain Implant & AI Gave A Paralysed Woman A Voice
This new research helped Ann Johnson
Ann Johnson. Credit: Noah Berger
If you have ever awoken in a state of sleep paralysis, momentarily unable to move, you know how frightening it is to be locked inside your body, unable to communicate. Now imagine that being your reality for nearly 20 years. For Ann Johnson, it was, until AI drastically changed her life.
In 2005, Ann was a teacher, a volleyball coach, and a mother to a 2-year-old daughter when she suffered a stroke that left her totally paralysed and unable to speak. Five years after the stroke, she was able to smile on command. Ten years on, she could eat some solid food again.
Thankfully, she could slowly communicate with her husband at home using a special letter board. She would look at the letters of a word she wanted to spell, and her husband would decipher it.
She could also use a set of eyeglasses that helped her compose emails. These work like the letter board: she looks towards specific areas, and a computer system types out the words she signals with her eyes.
This is similar technology to what Stephen Hawking used, and with these glasses, she could type out about 14 words a minute.
Then, in recent years, Ann and her family heard about research that allowed another stroke survivor to communicate more freely again.
Dr. Edward Chang, at the University of California, San Francisco, led this research, and after a 2021 article about his work, Ann and her husband reached out to him.
Ann was persistent, and even though she lived all the way in Canada, she was soon added as a participant in the San Francisco studies.
Since last September, Ann has regularly made the journey of over 1,700 miles for three-day research sessions.
It all began with an operation that placed an implant on top of her skull, with 253 electrodes that pick up the neural activity in Ann’s brain. From there, this data is transmitted to an AI model that attempts to decode the signals. And it’s working; Ann’s ability to communicate has gone from the 14-18 words per minute she managed with eye tracking to 80 words per minute.
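To make the idea concrete, here is a minimal toy sketch of what "decoding" means in this context. Everything here is hypothetical and vastly simplified: the word list, the noise level, and the nearest-centroid classifier are stand-ins for the real trained neural network, which works on actual electrode recordings rather than simulated ones. Only the electrode count (253) comes from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ELECTRODES = 253                       # electrode count from the article
WORDS = ["hello", "love", "yes", "no"]   # hypothetical mini-vocabulary

# Pretend each attempted word produces a characteristic pattern of
# activity across the electrodes (learned from training data in reality).
centroids = {w: rng.normal(size=N_ELECTRODES) for w in WORDS}

def simulate_activity(word: str, noise: float = 0.3) -> np.ndarray:
    """Simulate one noisy window of electrode readings for an attempted word."""
    return centroids[word] + rng.normal(scale=noise, size=N_ELECTRODES)

def decode(activity: np.ndarray) -> str:
    """Map a window of activity to the closest known word pattern."""
    return min(WORDS, key=lambda w: np.linalg.norm(activity - centroids[w]))

decoded = [decode(simulate_activity(w)) for w in ["hello", "love", "yes"]]
print(decoded)  # -> ['hello', 'love', 'yes']
```

The real system faces the hard version of this problem: patterns overlap, vocabularies are large, and signals drift, which is why accuracy below 100% is still a genuine breakthrough.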
Ann also partnered with a team that uses sensors to pick up subtle movements in her face and replicate them on an onscreen avatar, giving her the ability to emote with her face again. When Ann wants to say a sentence, the nearby avatar speaks the words aloud with her desired facial expressions.
The results were published last month in the journal Nature. The most amazing part of the research is just how accurately the AI decodes Ann’s intended speech: about 75%!
This level of accuracy might not seem like a great starting point to some, but it is leaps and bounds ahead for someone unable to communicate. As we always say, AI will never be this bad again, and 75% is a big jump from the 50% accuracy the research achieved two years ago.
Ann’s long-term goal is to become a counsellor. In 2018, she started taking online courses toward that path. Ann told Dr. Chang and his team that this was her “shot at the moon goal”. The reality is that it might no longer be quite so impossible.
When we talk about AI in the media, we often fear it. But for all the fear that surrounds this topic, it can provide hope for people like Ann Johnson. It provides a means to help people communicate again. It provides a way for her new voice to tell her husband she loves him. This technology will allow Ann to help others in the future. AI is helping.
💬 For the Group Chat!
Copy and paste it — we won’t tell anyone.