Monkeys can now feel virtual objects using a brain implant.
This could be the first step towards virtual reality where you can feel the computer-generated world around you.
An international team of neuroengineers has developed a bidirectional brain-machine interface. That means the monkeys can use this brain implant not only to control a virtual hand, but also to receive feedback that tricks their brains into "feeling" the texture of virtual objects.
How it works:
When you're wearing a pair of big, bulky gloves and fishing for your keys, the sensory information usually provided to your brain by your fingertips is deadened by the barrier between your hand and the keys. The result is a one-way interface: your brain can tell your fingers what to do with the keys, but communication from your fingers back to your brain is effectively cut off. As a result, you have to rely on another sense, usually vision, to tell whether you're currently pinching one key, three keys, or no keys at all.
To really make the most of your fingertips, there needs to be a two-way interface between your brain and your hands. When your brain can receive tactile information from your hands about, say, the texture of the key you’re handling, it can make near-instantaneous adjustments that give you better dexterity, or help you choose the right key.
Brain-machine interfaces have come a long way in recent years, but with few exceptions these systems have relied exclusively on one-way communication.
To demonstrate the power of a two-way interface, a team of neuroengineers at Duke University designed a brain-machine-brain interface (BMBI) to test on monkeys.
“This is the first demonstration of a brain-machine-brain interface that establishes a direct, bidirectional link between a brain and a virtual body,” said Miguel Nicolelis, who led the study. “In this BMBI, the virtual body is controlled directly by the animal’s brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal’s cortex.”
Here's how it all works: the BMBI takes movement commands from 50–200 neurons in the monkey's motor cortex and uses them to control a virtual "avatar" hand, not unlike a classic one-way interface. But the new interface also adds a feedback channel, in which information about a virtual object's texture is delivered directly to the brain via intracortical microstimulation, or "ICMS" for short. When a monkey receives ICMS feedback, thousands of neurons in its brain (neurons that correspond to tactile sensation in the hands) receive electrical stimulation via carefully placed electrodes.
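The closed loop described above can be sketched in a few lines of Python. This is a toy simulation, not the study's actual decoder: the linear decoding weights, the simulated spike counts, and the mapping from texture to ICMS pulse rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_hand_velocity(spike_counts, weights):
    """Motor side (brain -> machine): linearly map motor-cortex
    spike counts to a 2-D avatar-hand velocity."""
    return weights @ spike_counts

def texture_to_icms_hz(coarseness, base_hz=50.0, gain_hz=150.0):
    """Sensory side (machine -> brain): map a texture value in [0, 1]
    to a hypothetical microstimulation pulse rate in Hz."""
    return base_hz + gain_hz * coarseness

# Hypothetical setup: 100 recorded neurons driving a 2-D avatar hand.
n_neurons = 100
weights = rng.normal(scale=0.01, size=(2, n_neurons))
hand_pos = np.zeros(2)

for step in range(10):
    spikes = rng.poisson(lam=5.0, size=n_neurons)      # simulated spike counts
    hand_pos += decode_hand_velocity(spikes, weights)  # brain controls the hand
    coarseness = 0.8 if hand_pos[0] > 0 else 0.2       # texture under the hand
    icms_hz = texture_to_icms_hz(coarseness)           # feedback stimulation rate
```

The key design point is that both directions run inside one loop: every decoding step that moves the hand is immediately followed by a stimulation step that reports what the hand is "touching."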
This two-way interface allows for the monkeys to engage in what the researchers call “active tactile exploration” of a virtual set of objects. Using only their brains, monkeys were able to direct their avatar hand over the surfaces of several virtual objects and differentiate between their textures.
To prove that the monkeys could pick out specific objects by touch alone, the researchers rewarded them for selecting objects with a particular texture: when a monkey held its virtual hand over the correct object, it received a reward. The study tracked two monkeys' performance on this task. One monkey learned to select the correct object in just four attempts; the second needed only nine.