[Image: 3D-printed model of participant T5's brain, with implant locations indicated]
Welcome to the Neural Prosthetics Translational Lab (NPTL)

"The man who controls computers with his mind" (cover story), New York Times Magazine, 2022

"The brain-reading devices helping paralysed people to move, talk and touch," subtitled "Implants are becoming more sophisticated and are attracting commercial interest," Nature, News Feature, 2022

Handwriting BCI, Tencent WE Summit, 2021

Implantable BCIs, MIT Technology Review, 2021

Cursor BCI, PBS NewsHour, 2017

Dr. Greg Dunn's deeply neurally inspired artwork appears here (licensed 2022), at Caltech, and in connection with Carnegie Mellon University's Mind & Brain Prize, including the 2018 prize awarded to Prof. Shenoy (BMI microetching; news and pictures).

Jaimie Henderson, Co-director and Co-PI of NPTL, BrainGate Site PI

Krishna Shenoy, Co-director and Co-PI of NPTL, BrainGate Site Co-PI


We conduct neuroscience, neuroengineering and translational research to better understand how the brain controls movement and to then design medical systems to assist people with paralysis. These medical systems are referred to as brain-computer interfaces (BCIs), brain-machine interfaces (BMIs) and intra-cortical neural prostheses. We conduct this research in our Neural Prosthetics Translational Lab (NPTL) which focuses on fundamental human neuroscience and translational research with people with paralysis. Prof. Jaimie Henderson and Prof. Krishna Shenoy are the PIs of the NPTL.

Our modeling and computational work is done in close collaboration with Assistant Prof. Shaul Druckmann (Department of Neurobiology, Stanford) and Adjunct Prof. David Sussillo (Director, CTRL-Labs West Coast, a division of Reality Labs / Meta Platforms).  This is done as part of various NIH BRAIN, NINDS and NIDCD awards, a Simons Foundation Collaboration on the Global Brain (SCGB) program and the BrainGate2 clinical trial. We also conduct this research in our Neural Prosthetic Systems Lab (NPSL) which focuses on fundamental computational and systems neuroscience, neuroengineering and electrical engineering. Prof. Krishna Shenoy is the PI of the NPSL.

[Figure: human brain with relevant cortical areas indicated]

Neuroscience     We investigate the neural basis of movement preparation and generation using Utah electrode arrays, Neuropixels probes, behavioral measurements (e.g., eye and arm tracking, EMG) and computational and theoretical methods (e.g., dimensionality reduction, dynamical systems, single-trial neural trajectory analysis, recurrent neural networks, deep neural networks). Questions include:

1. How do neurons in premotor and primary motor cortex prepare and generate arm, hand and finger movements? This includes attempted movements, which form the foundation for new classes of BCIs: for example, the attempted-handwriting BCI reported in Willett et al. (2021) Nature (see concept sketch below).

[Concept sketch of the attempted-handwriting BCI, from Willett et al., Nature (2021)]

2. How do neurons in ventral premotor and primary motor cortex and inferior frontal gyrus (IFG) / Broca's area prepare and generate speech movements? This includes attempted speech movements, which form the foundation for new classes of BCIs: for example, the attempted-speech BCI reported in Stavisky et al. (2019) eLife (see concept sketch below).

[Concept sketch of the attempted-speech BCI]
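As a minimal sketch of the dimensionality-reduction and neural-trajectory methods mentioned above, the following Python example applies PCA to simulated population activity. The data, its three-latent structure, and all sizes are fabricated for illustration; real analyses operate on recorded spiking data.

```python
import numpy as np

# Simulated population activity: 100 neurons x 500 time samples.
# Each neuron's firing rate is a noisy mixture of 3 shared latent
# signals, mimicking the low-dimensional structure often reported
# in motor cortical populations.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 500)
latents = np.stack([np.sin(t), np.cos(2 * t), np.sin(3 * t)])   # (3, 500)
mixing = rng.normal(size=(100, 3))                               # neuron loadings
rates = mixing @ latents + 0.1 * rng.normal(size=(100, 500))     # (100, 500)

# PCA via SVD of the mean-centered data: project the 100-D activity
# onto a few principal components to recover "neural trajectories".
centered = rates - rates.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
variance_explained = (S ** 2) / np.sum(S ** 2)
trajectories = Vt[:3]   # time courses of the top-3 principal components

print(f"Top-3 PCs explain {variance_explained[:3].sum():.1%} of variance")
```

Because the simulated activity was built from three latents, the top three principal components capture nearly all of its variance; the same analysis on real recordings reveals how many dimensions the population actually explores.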

Neuroengineering     We investigate the design of high-performance, highly robust BCIs. These systems decode neural activity from the brain into control signals for restoring lost motor and communication abilities (e.g., Principles of Neural Science, 6th Edition, Chapter 39). This work includes statistical signal processing, machine learning and real-time system design. Questions include how to design BCIs rivaling the communication rate of spoken language.
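As a toy illustration of the core decoding step, the sketch below fits a linear map from binned spike counts to 2-D velocity using ridge regression on simulated data. All numbers (96 channels, Poisson rates, noise level, the regularizer) are assumptions for illustration only; the lab's published decoders (e.g., Kalman-filter variants) are substantially more sophisticated.

```python
import numpy as np

# Hypothetical linear velocity decoder: map binned spike counts
# (96 channels, as on a Utah array) to 2-D cursor velocity.
rng = np.random.default_rng(1)
n_bins, n_channels = 2000, 96
true_weights = rng.normal(size=(n_channels, 2))           # unknown tuning
spikes = rng.poisson(3.0, size=(n_bins, n_channels))      # training spike counts
velocity = spikes @ true_weights + rng.normal(scale=5.0, size=(n_bins, 2))

# Ridge regression on mean-centered data: W = (X'X + lam*I)^(-1) X'Y
X = spikes - spikes.mean(axis=0)
Y = velocity - velocity.mean(axis=0)
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

# Decode velocity for a new bin of spike counts.
new_bin = rng.poisson(3.0, size=n_channels)
v_decoded = (new_bin - spikes.mean(axis=0)) @ W
print("decoded velocity (vx, vy):", v_decoded)
```

In a real-time system this regression runs every few tens of milliseconds, so the fit must be both accurate and robust to day-to-day changes in the recorded signals.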

Translational     We investigate BCI clinical translation and closely related human neuroscience questions through the multi-site clinical trial that we are part of (BrainGate2; NCT00912041 on clinicaltrials.gov). Questions include how best to bring neuroscience discoveries into clinically viable BCIs that help people with paralysis in real-world settings (i.e., translation), and, conversely, how BCI technology and questions arising from observations of human cortical processing can feed back into the fundamental, pre-clinical research realm (i.e., reverse translation). See concept sketch below.

[Concept sketch: the natural motor system and the BCI system]

Here is a brief PBS NewsHour video by Cat Wise and Judy Woodruff describing our computer cursor BCI, which translates neural activity from the motor cortex of volunteer clinical trial participants into control signals that guide an on-screen cursor in two dimensions, along with a click signal used to make selections (e.g., selecting a key on a keyboard). We reported this system in Pandarinath*, Nuyujukian*, et al., eLife (2017) (pdf).
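The closed-loop cursor control described above can be sketched schematically as follows. This is a hypothetical update loop, not the published implementation: on each bin, a decoded velocity moves the cursor and a separate decoded click signal triggers selections. The bin width and threshold are illustrative assumptions.

```python
import numpy as np

DT = 0.02              # 20 ms update interval, a typical BCI bin width
CLICK_THRESHOLD = 0.9  # assumed threshold on the decoded click signal

def update_cursor(pos, v_decoded, p_click):
    """Integrate decoded velocity into a new cursor position (clamped
    to a unit screen) and report a click when the decoded click
    probability crosses threshold."""
    new_pos = np.clip(pos + DT * np.asarray(v_decoded), 0.0, 1.0)
    return new_pos, p_click > CLICK_THRESHOLD

pos = np.array([0.5, 0.5])
pos, clicked = update_cursor(pos, v_decoded=(1.0, -0.5), p_click=0.95)
print(pos, clicked)   # cursor nudged right and down, click registered
```

Separating the continuous velocity decode from the discrete click decode lets the user both steer the cursor and make selections, which is what turns cursor control into a usable typing interface.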

Judy Woodruff: For decades, researchers have worked to create a better and more direct connection between a human brain and a computer to improve the lives of people who are paralyzed or have severe limb weakness from diseases like ALS. Those advances have been notable, but now the work is yielding groundbreaking results. Special correspondent Cat Wise has the story. It's part of our Breakthroughs reporting and for our weekly segment about the Leading Edge of science and technology.

Cat Wise: Dennis Degray is a 64-year-old quadriplegic who is writing a sentence on the computer screen in front of him using only his brain. A former volunteer firefighter, Degray had a bad fall 10 years ago which severed his spinal cord. As part of an early stage clinical research study led by Stanford University, Degray and two other volunteer participants with ALS had small sensors implanted in their brains in an area called the motor cortex, which controls movement. Even though Degray can no longer physically move his arms, the neurons in that part of his brain, and in the brains of many other paralyzed individuals, remain active. The sensors in his brain listen in to those neurons, which emit different electrical signals depending on the direction Degray thinks about moving his hand. ...

Another brief video, by Megan Rosen at HHMI, describes how an attempted-handwriting BCI translates neural activity from the motor cortex of volunteer clinical trial participants into the most probable character (one of 26 English letters or 5 special characters) in order to type words and sentences, thereby restoring the ability to communicate. Because any word can be typed, this is an open-vocabulary system. We reported this system in Willett et al., Nature (2021) (pdf).
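One common way to quantify the communication rate of such a character decoder is the Wolpaw information-transfer-rate formula, which converts the number of classes and the classification accuracy into bits per selection. The sketch below applies it to a 31-class (26 letters + 5 special characters) decoder; the accuracy and typing-speed numbers are illustrative assumptions, not figures from the paper.

```python
import math

def bits_per_selection(n_classes, accuracy):
    """Wolpaw ITR: bits conveyed per N-way classification decision."""
    p, n = accuracy, n_classes
    if p >= 1.0:
        return math.log2(n)
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

b = bits_per_selection(31, 0.94)   # 31 classes, assumed 94% accuracy
chars_per_min = 90                 # assumed typing speed, for illustration
print(f"{b:.2f} bits/char -> {b * chars_per_min / 60:.2f} bits/s")
```

Metrics like this make it possible to compare very different interfaces (cursor keyboards, handwriting decoders, speech decoders) against the benchmark of spoken language on a single bits-per-second scale.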


Rosen M (5/12/21) Brain computer interface turns mental handwriting into text on screen. Howard Hughes Medical Institute (HHMI). News article. pdf url