The brain-reading devices helping paralysed people to move, talk and touch

  • 📰 Nature


James Johnson is almost completely paralysed below the shoulders, but using a brain–computer interface, he can operate a computer and play video games – just by thinking. “I am always stunned at what we are able to do,” he says, “and it’s frigging awesome.”

So far, the vast majority of implants for recording long-term from individual neurons have been made by a single company: Blackrock Neurotech, a medical-device developer based in Salt Lake City, Utah. But in the past seven years, commercial interest in BCIs has surged. Most notably, in 2016, entrepreneur Elon Musk launched Neuralink in San Francisco, California, with the goal of connecting humans and computers. The company has raised US$363 million.

“We have neural information; we know what that person who is generating the neural data is attempting to do; and we’re asking the algorithms to create a map between the two,” says Hochberg. “That turns out to be a remarkably powerful technique.”

Asked what they want from assistive neurotechnology, people with paralysis most often answer “independence”. For people who are unable to move their limbs, this typically means restoring movement.
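To make that mapping concrete, here is a minimal Python sketch of the idea in its simplest form: a plain linear model fitted to synthetic “firing rates” and intended cursor velocities. It is not any group’s actual decoder (real systems use Kalman filters or neural networks, among other methods), and every number in it is invented; it only illustrates the kind of map Hochberg describes.

```python
# Illustrative sketch: a linear decoder that maps neural firing rates to
# intended 2-D cursor velocity. All data here are synthetic; the array size
# and the linear model are assumptions made for illustration only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples, n_channels = 2000, 96                 # e.g. a 96-channel microelectrode array
true_tuning = rng.normal(size=(n_channels, 2))   # hypothetical tuning of each channel

intended_velocity = rng.normal(size=(n_samples, 2))             # what the user is trying to do
firing_rates = intended_velocity @ true_tuning.T                # activity driven by that intent
firing_rates += rng.normal(scale=0.5, size=firing_rates.shape)  # measurement noise

# Fit the "map" from neural data to intended movement on a calibration block
decoder = Ridge(alpha=1.0).fit(firing_rates[:1500], intended_velocity[:1500])

# Decode held-out activity into cursor velocities
r2 = decoder.score(firing_rates[1500:], intended_velocity[1500:])
print(f"held-out R^2 of the decoded velocities: {r2:.2f}")
```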

Gaunt then joined forces with Pittsburgh colleague Jennifer Collinger, a neuroscientist advancing the control of robotic arms by BCIs. Together, they fashioned a robotic arm with pressure sensors embedded in its fingertips, which fed into electrodes implanted in the somatosensory cortex to evoke a synthetic sense of touch. It was not an entirely natural feeling — sometimes it felt like pressure or being prodded, other times it was more like a buzzing, Gaunt explains.
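As a rough illustration of that feedback path, the sketch below turns a fingertip pressure reading into a stimulation amplitude for electrodes in somatosensory cortex. The thresholds, units and limits are placeholders chosen for the example, not the Pittsburgh team’s parameters.

```python
# Illustrative sketch of the feedback path described above: fingertip pressure
# from the robotic hand becomes a microstimulation amplitude. All numbers are
# made-up placeholders, not values from the published work.
def pressure_to_stim_amplitude(pressure_kpa: float,
                               threshold_kpa: float = 5.0,
                               max_pressure_kpa: float = 100.0,
                               max_amplitude_ua: float = 60.0) -> float:
    """Map a pressure reading (kPa) to a stimulation amplitude (uA)."""
    if pressure_kpa <= threshold_kpa:
        return 0.0                      # light contact: no stimulation
    # Linear scaling between threshold and saturation, clamped for safety
    frac = min((pressure_kpa - threshold_kpa) / (max_pressure_kpa - threshold_kpa), 1.0)
    return frac * max_amplitude_ua

# Example: a moderate grasp produces a mid-range amplitude (~22 uA here)
print(pressure_to_stim_amplitude(40.0))
```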

Shenoy’s colleague Frank Willett suggested to DeGray that he imagine handwriting while they recorded from his motor cortex. The system sometimes struggled to parse signals relating to letters that are handwritten in a similar way, such as r, n and h, but generally it could distinguish the letters easily. The decoding algorithms were 95% accurate at baseline, but when autocorrected using statistical language models similar to the predictive text in smartphones, accuracy jumped to 99%.
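That autocorrection step works much like predictive text: the classifier’s uncertain guess is re-weighted by how likely each letter is in context. The toy example below, with invented probabilities and a crude bigram model standing in for the much richer language models used in the published work, shows how a near-tie between r, n and h can be resolved.

```python
# Illustrative sketch of language-model autocorrection: a character classifier
# is uncertain between similar-looking letters, and a simple bigram prior
# re-weights the options. All probabilities are invented for illustration.
import numpy as np

letters = ["r", "n", "h"]

# Classifier output for the letter after "te" -- nearly a three-way tie
classifier_probs = np.array([0.36, 0.33, 0.31])

# Hypothetical bigram model: probability of each letter following "te"
bigram_probs = np.array([0.05, 0.80, 0.15])

# Combine evidence: posterior proportional to classifier output times prior
posterior = classifier_probs * bigram_probs
posterior /= posterior.sum()

best = letters[int(np.argmax(posterior))]
print(dict(zip(letters, posterior.round(2))), "->", best)   # 'n' wins after rescoring
```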

Chang’s group first worked on characterizing the part of the brain that generates phonemes and, thereby, speech — an ill-defined region called the dorsal laryngeal cortex. Then, the researchers applied these insights to create a speech-decoding system that displayed the user’s intended speech as text on a screen.

These case studies suggest that the field is maturing rapidly, says Amy Orsborn, who researches BCIs in non-human primates at the University of Washington in Seattle. “There’s been a noticeable uptick in both the number of clinical studies and of the leaps that they’re making in the clinical space,” she says. “What comes along with that is the industrial interest”.

 
