At its developer conference in 2017, Facebook announced plans to develop a brain-computer interface (BCI) that could let you type simply by thinking. Now, researchers from the University of California, San Francisco (UCSF) working under this program have published a study showing that their algorithm can decode spoken words from brain activity in real time.
The team attached high-density electrocorticography (ECoG) arrays to the brains of three epilepsy patients to record their brain activity. It then asked the patients straightforward questions and had them answer aloud.
The researchers said the algorithm recorded brain activity while the patients spoke, and the model decoded their words with accuracy as high as 76%.
Facebook said it doesn’t expect this technology to be available anytime soon, but it could eventually make interacting with AR and VR hardware far easier:
“We don’t expect this system to solve the problem of input for AR anytime soon. It’s currently bulky, slow, and unreliable. But the potential is significant, so we believe it’s worthwhile to keep improving this state-of-the-art technology over time.”
“And while measuring oxygenation may never allow us to decode imagined sentences, being able to recognize even a handful of imagined commands, like ‘home,’ ‘select,’ and ‘delete,’ would provide entirely new ways of interacting with today’s VR systems — and tomorrow’s AR glasses.”
Earlier this month, Elon Musk’s Neuralink also announced a project that would let you control your iPhone through a device connected to your brain. While these devices may not hit the market for a few years, this will be an exciting space to watch.