
Biomimetic Somatosensory Feedback through Intracortical Microstimulation


Overview

Abstract
Spinal cord injury causes both paralysis and loss of sensation from the limbs. The past 15 years have seen remarkable advances in "brain-machine interfaces" (BMIs) that allow paralyzed persons to move anthropomorphic limbs using signals recorded directly from their brains. However, these movements remain slow, clumsy, and effortful, looking remarkably like those of individuals who have lost sensation from their arms due to peripheral neuropathy. Brain-controlled prosthetic limbs are unlikely to achieve high levels of performance in the absence of artificial sensory feedback. Early attempts at restoring somatosensation used intracortical microstimulation (ICMS) to activate somatosensory cortex (S1), requiring animals to learn largely arbitrary patterns of stimulation to represent two or three virtual objects or to navigate in two-dimensional space. While an important beginning, this approach seems unlikely to scale to the broad range of limb movements and interactions with objects that we experience in daily life. To move the field past this hurdle, we propose to replace both touch and proprioception by using multi-electrode ICMS to produce naturalistic patterns of neuronal activity in S1 of monkeys.

In Aim 1, we will develop model-optimized mappings between limb state (pressure on the fingertip, or motion of the limb) and the patterns of ICMS required to evoke S1 activation that mimics that of natural inputs. These maps will account for both the dynamics of neural responses and the biophysics of ICMS. We anticipate that this biomimetic approach will evoke intuitive sensations that require little or no training to interpret. We will validate the maps by comparing natural and ICMS-evoked S1 activity using novel hardware that allows for concurrent ICMS and neural recording.

In Aim 2, we will test the ability of monkeys to recognize objects using artificial touch. Having learned to identify real objects by touch, animals will explore virtual objects with an avatar that shadows their own hand movements, receiving artificial touch sensations when the avatar contacts objects. We will test their initial performance on the virtual stereognosis task without training, as well as their improvement in performance over time.

Aim 3 will be similar, but will focus on proprioception. We will train monkeys to report the direction of brief force bumps applied to their hand. After training, we will replace the actual bumps with virtual bumps created by patterned ICMS, again asking the monkeys to report the perceived direction and magnitude of the perturbation.

Finally, in Aim 4, we will temporarily paralyze the monkey's arm, thereby removing both touch and proprioception and mimicking the essential characteristics of a paralyzed patient. The avatar will be controlled by recordings from motor cortex and guided by artificial somatosensation. The monkey will reach to a set of virtual objects, find one with a particular shape, grasp it, and move it to a new location. If we can demonstrate that this model-optimized, biomimetic feedback is informative and easy to learn, it should form the basis for robust, scalable somatosensory feedback for BMIs.
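To make the Aim 1 mapping concrete, the sketch below shows one minimal form the model-optimized inversion could take, assuming purely linear, static models for illustration: an encoding matrix E (limb state to natural S1 firing rates) and a stimulation matrix S (per-electrode ICMS amplitudes to evoked firing rates). The unit and electrode counts, both matrices, and the icms_pattern helper are hypothetical placeholders, not the project's actual models, which additionally capture response dynamics and ICMS biophysics as described above.

    # Minimal sketch of a model-optimized limb-state -> ICMS mapping.
    # Linear, static models are assumed for illustration only; the
    # dynamics of neural responses and the biophysics of ICMS are omitted.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    n_units = 50        # recorded S1 units
    n_electrodes = 16   # stimulating electrodes
    n_features = 3      # limb-state features, e.g., fingertip pressure components

    # Hypothetical fitted models (placeholders for fits to real data):
    # E maps limb state to natural S1 firing rates; S maps per-electrode
    # ICMS amplitudes to evoked S1 firing rates.
    E = rng.uniform(0.0, 1.0, (n_units, n_features))
    S = rng.uniform(0.0, 1.0, (n_units, n_electrodes))

    def icms_pattern(limb_state):
        """Solve for nonnegative stimulation amplitudes whose predicted
        S1 response best matches the predicted natural response."""
        target = E @ limb_state           # natural S1 activity to mimic
        amps, residual = nnls(S, target)  # least squares with amps >= 0
        return amps

    pressure = np.array([0.8, 0.1, 0.3])  # example limb-state vector
    print(icms_pattern(pressure))

The nonnegativity constraint stands in for the physical fact that stimulation amplitudes cannot be negative; richer models would replace the single least-squares solve with an optimization over stimulus waveforms through a dynamical response model.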
Sponsor Award ID
R01NS095251

Time
Start Date
2016-06-01
End Date
2021-05-31