Days later, Facebook CEO Mark Zuckerberg declared that “Direct brain interfaces [are] going to, eventually, let you communicate only with your mind.” The company says it has 60 engineers working on the problem.
Makoto Fukushima, a fellow at the National Institutes of Health who has used brain interfaces to study the simpler grunts and coos made by monkeys, says the richer range of birdsong is why the new results have “important implications for application in human speech.”
Brain interfaces tried in humans so far mostly track neural signals that reflect a person’s imagined arm movements, which can be co-opted to steer a robot or direct a cursor to peck out letters, very slowly.
So the idea of a helmet or brain implant that can effortlessly pick up what you’re trying to say remains pretty far from being realized.
The team at UCSD used silicon electrodes in awake birds to measure the electrical chatter of neurons in a part of the brain called the sensory-motor nucleus, where “commands that shape the production of learned song” originate.
[Audio: the same song, as predicted from neural recordings inside the finch’s brain.]
A device able to read the commands your brain sends out to your muscles while you are engaged in subvocal speech is probably a lot more realistic than one that reads “thoughts.”