Seconds after a brief smile of anticipation flashed across her face, Jan Scheuermann moved a bar of chocolate toward her mouth by controlling a robotic prosthetic arm. Finally, she took a bite. As she relished the taste, the team of neuroscientists and engineers in the room erupted in applause. It was no small feat for Scheuermann to feed herself. Fourteen years before this 2012 experiment, she had been diagnosed with spinocerebellar ataxia, which causes progressive and irreversible paralysis. In the intervening years, she had gradually lost the ability to move her arms and legs.
About three months before her chocolate-eating triumph, scientists at the University of Pittsburgh in Pennsylvania had implanted into her primary motor cortex two tiny arrays consisting of 100 electrodes, which enabled her to control the robotic arm—or Hector, as she had named it. Before the surgery, Scheuermann, who was 53 at the time, proclaimed that her goal would be to feed herself a sweet treat. In the following weeks, Scheuermann practiced using her brain to control Hector, moving the arm forward and backward, turning the wrist and closing the hand. Finally, she used Hector to feed herself a snack. As she noshed on her first glorious bite of the chocolate candy bar, she said, “One small nibble for a woman; one giant bite for BCI.”
A BCI, or brain-computer interface, sounds like something from science fiction. After someone becomes paralyzed, either through progressive illness such as amyotrophic lateral sclerosis (ALS) or because of a sudden accident that severely injures the spinal cord, a BCI offers the possibility of restoring movement and, eventually, some level of independence. The first BCI was used by a paralyzed person to play a computer game, serving as a proof of concept that these devices could translate thought into action; since then, neuroscientists and engineers have developed superior systems[1]. The most recent models enable a user to move a prosthetic arm with up to ten degrees of freedom, which include separating individual fingers and moving the thumb on a prosthetic hand[2].
As BCIs advance, however, it is becoming more of a challenge for these systems to interpret the increasingly extensive neurological output from users’ brains and translate it into motion. To make this interpretation more efficient, neuroscientists are studying the ways in which healthy individuals use different types of sensory feedback, including what they can see and feel, and how they sense their bodies in space, to control their actions. Ultimately, the goal is to give people with paralysis the ability to move to the fullest extent possible, either by using robotic prostheses or by restoring movement to their own limbs.
Any voluntary movement, such as reaching for a pen on a desk, requires neurons in the motor cortex to fire off electrical signals, each usually lasting about a thousandth of a second. In healthy individuals, neural activity from the motor cortex is transmitted down through the spinal cord to control the limbs. In people who have been paralyzed, however, the connection between the brain and the spinal cord has been disrupted; although the neurons in the brain are still functioning, their signals never reach the spinal cord or extremities. This is where BCIs can have an effect. They can interpret the patterns of electrical signals coming from the motor cortex to move external neuromotor prostheses and, one day, perhaps, even reanimate the body.
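As a rough illustration of how such pattern interpretation can work, one classic model is population-vector decoding: each motor-cortex neuron fires most for a "preferred" movement direction, and the decoder sums those directions weighted by observed firing rates. The sketch below is a toy model under invented assumptions (cosine tuning, the baseline and gain values, and the 96-neuron count are illustrative; this is not the decoder used in the trials described here):

```python
import numpy as np

# Toy population-vector decoder; all tuning parameters are illustrative.
rng = np.random.default_rng(0)
n_neurons = 96
preferred = rng.normal(size=(n_neurons, 2))            # each neuron's preferred direction
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def firing_rates(intended, baseline=10.0, gain=8.0):
    # cosine tuning: a neuron fires faster when the intended movement
    # aligns with its preferred direction
    return baseline + gain * preferred @ intended

def decode(rates, baseline=10.0):
    # weight each preferred direction by the rate above baseline,
    # then normalize to recover the intended movement direction
    v = (rates - baseline) @ preferred
    return v / np.linalg.norm(v)

intended = np.array([1.0, 0.0])                        # "move right"
decoded = decode(firing_rates(intended))
```

With enough neurons, the decoded direction closely matches the intended one even though no single neuron encodes it directly.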
The electrical array that is placed in the brain for use in BCIs in clinical trials is known as the Utah array, and it was first developed in 1994 by Richard Normann, a neurophysiologist at the University of Utah in Salt Lake City[3]. In subsequent years, Normann worked with John Donoghue, a neuroscientist at Brown University in Providence, Rhode Island, to make the array suitable for implantation into humans. Now, a Salt Lake City company called Blackrock Microsystems manufactures the Utah array, as well as electronics and tools to commercialize clinical systems.
The most commonly used Utah array consists of a 10 by 10 grid of electrodes, 96 of which record neural activity and four of which serve as references. The array is surgically implanted into the brain and connected to a wire bundle. These wires, each about a hair’s width, connect to a thimble-sized platform called a pedestal, which is mounted on the user’s skull. In most cases, a cable connects the pedestal to an amplifier, which digitizes the electrical signals from the brain and sends them to computers. With training, computer programs learn to recognize the unique patterns that are generated for each intended movement.
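The "learning to recognize patterns" step can be pictured as template matching on firing-rate vectors: average several trials of each imagined movement into a template, then assign new recordings to the nearest template. A minimal, hypothetical sketch (the movement labels, rate ranges, and noise level are invented for illustration):

```python
import numpy as np

# Hypothetical pattern recognition on 96-channel firing rates; the
# movements, rate ranges and noise level below are invented.
rng = np.random.default_rng(1)
n_channels = 96
true_patterns = {m: rng.uniform(5.0, 30.0, n_channels)   # mean rates per movement
                 for m in ("open_hand", "close_hand")}

def record_trial(movement, noise=2.0):
    # one simulated trial: the movement's pattern plus recording noise
    return true_patterns[movement] + rng.normal(0.0, noise, n_channels)

# "training": average repeated trials of each imagined movement
centroids = {m: np.mean([record_trial(m) for _ in range(20)], axis=0)
             for m in true_patterns}

def classify(rates):
    # nearest-centroid decoding: pick the movement whose template is closest
    return min(centroids, key=lambda m: float(np.linalg.norm(rates - centroids[m])))
```

Real decoders are more sophisticated, but the core idea of matching observed activity against learned movement-specific patterns is the same.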
Whereas unmodulated signals from the motor cortex can be enough to prompt robotic prostheses to perform simple actions, sophisticated movements such as grasping an object while moving the arm up or down require the integration of sensory feedback in the brain. Sight, sensory neurons in the skin and proprioception—the brain’s ability to sense the body’s position in space—“are the three major feedback sources that all come together and play different roles at different times in controlling movement,” says Robert Gaunt, a biomedical engineer working with neuromotor prostheses at the University of Pittsburgh. Each of these types of feedback is important for a different reason. Visual feedback can help to assess how much more the arm needs to move to make contact with an object; tactile feedback can show how much force the arm and hand need to lift it; and proprioception can indicate how the arm is moving. Most movements require combinations of these types of feedback in order to be executed with grace, and they are thus difficult to recreate with BCIs.
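One common way to model how such feedback streams combine is inverse-variance weighting: each sense contributes its own estimate of a quantity, and the more reliable a sense is, the more weight its estimate gets. A toy sketch (the readings and variances below are invented numbers, not measured data):

```python
def fuse(estimates):
    # inverse-variance weighting: estimates with small variance
    # (i.e. reliable senses) dominate the combined result
    weights = [1.0 / var for _, var in estimates]
    total = sum(w * value for w, (value, _) in zip(weights, estimates))
    return total / sum(weights)

# hand-to-object distance (cm) as judged by each sense; values invented
readings = [(5.0, 0.5),   # vision: precise
            (6.0, 2.0),   # proprioception: coarser
            (4.5, 1.0)]   # touch-derived estimate
fused = fuse(readings)
```

When one stream is missing, as with BCI users who have only vision, the remaining weights must carry the whole estimate, which is part of why purely visual control is clumsy.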
“If we really want to be able to restore dexterous and complex movements through robotic prosthetic limbs, we’re going to need to be able to restore all of these sensation components to a person as well,” Gaunt says. The difficulty is that the neurological signaling patterns that represent possible movements, including these additional types of feedback, are extremely complex, and they cannot be easily recognized. But scientists have not given up. They are busy equipping BCIs to better distinguish these complicated patterns to create more natural movement for people who have become paralyzed.
Sensations of pressure: Sliman Bensmaia has used BCI to stimulate tactility in monkeys. (Image: University of Chicago)
Andrew Schwartz, a neuroscientist also at the University of Pittsburgh, has been working to create a BCI that will enable the most dexterous movement seen yet with a robotic prosthesis. He and his team are conducting a clinical trial with the NeuroPort array, which uses multiple Utah arrays with a specific pedestal and amplifier to interpret motor cortex activity. This system enabled Scheuermann in 2012 to move a prosthetic arm with seven degrees of freedom in order to pick up the piece of chocolate: she could open and close the prosthetic hand and move its arm and wrist forward and backward, side to side and up and down[4]. In 2014, Schwartz’s team reported that Scheuermann could successfully move a robotic prosthetic arm with ten degrees of freedom. Her new capabilities included the ability to spread its fingers, to move the thumb independently from the other digits and to make a scooping motion[2].
Meanwhile, Leigh Hochberg, a clinical neuroscientist at Brown University, and his team are testing the new BrainGate2 system, which also uses the Utah array. In the clinical trial of this new device, individuals who are paralyzed or who have locked-in syndrome, in which they are conscious but can move only their eyes, will have either one or two Utah arrays implanted into their motor cortex. The arrays can translate signals from the motor cortex to move either a prosthetic arm or a cursor, which would enable them to access e-mail or to use other software on a computer. Hochberg says that he hopes a possible advantage of recording from multiple arrays in the motor cortex will be to make more sophisticated control of a prosthesis possible, including better grasping movements. So far, he and his colleagues have recruited seven individuals for this work; they will recruit up to 15 people before the trial concludes. They have enabled their participants to use robotic prosthetics to hold on to objects, similarly to Schwartz’s work, and to move a computer cursor with the highest degree of control achieved so far[5,6]. They have also worked to create programs that automatically recalibrate as users’ neural signals change over time, to allow for faster typing for longer periods of time[7]. Ideally, these advances will help people who have lost the ability to speak to communicate more efficiently with caretakers.
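The automatic recalibration described above can be pictured as slowly nudging the decoder's stored pattern toward what the user's neurons are producing now, so that gradual signal drift does not break decoding. A minimal sketch (the exponential-moving-average update rule and the numbers are illustrative assumptions, not BrainGate2's actual algorithm):

```python
def recalibrate(template, new_pattern, alpha=0.1):
    # exponential moving average: blend the stored template with the
    # newly observed pattern so the decoder tracks slow neural drift
    return [(1 - alpha) * t + alpha * x for t, x in zip(template, new_pattern)]

# a stored two-channel template drifts toward the newly observed pattern
template = [10.0, 20.0]
for _ in range(50):
    template = recalibrate(template, [12.0, 18.0])
```

The trade-off is the learning rate: a larger `alpha` adapts faster but also chases noise, while a smaller one is stable but slow to follow genuine drift.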
Making movements: Leigh Hochberg, who leads the BrainGate2 clinical trial.
In both of these examples, the systems execute activity that arises from a user’s motor cortex as he or she imagines moving his or her actual limbs. Participants can control movement of the robotic limb on the basis of what they see; if they notice that the limb is moving too far to the right or to the left, they can make minor adjustments. Reliance only on visual feedback with no access to the two other types of feedback mechanisms, however, has its limitations. When Scheuermann moved a prosthetic arm with ten degrees of freedom, Schwartz and his team discovered a peculiar problem: when she tried to grasp an object that she was looking at, she could not close her prosthetic fingers around it—she could only close her hand when she closed her eyes or when the research team took the object away[2].
Schwartz views this challenge as a failure of his team’s technology. As Scheuermann attempted to use her arm to interact with objects in more specific ways, such as reaching for an object to bring toward herself, the computer did not recognize the electrical pattern fired by her neurons, and as a result, the robotic fingers failed to close. “These issues can be considered cognitive—they require making fairly complex predictions of what is going to happen in the near future,” Schwartz told Nature Medicine. In other words, as Scheuermann began to think about how she was going to handle each of the objects she picked up, or how the objects would be affected by her own movement, the neural patterns were not recognized properly and thus caused the BCI to move improperly. “We don’t have enough of an idea of how these complex problems are handled in the brain to build decoders for predicting what happens after the hand grasps the object.”
In the future, Schwartz and his team plan to continue to test how neurons in the motor cortex change their patterns of electrical activity when a person who is paralyzed intends to use tools in various ways, and how information is transmitted through the motor system during these more sophisticated movements.
Although visual feedback has enabled people to move robotic limbs with some success, adding other modes of feedback will help these systems to achieve more precise movement control. One idea is to stimulate the somatosensory cortex to generate a sense of proprioception to improve the ways in which quadriplegic users can carry out movements with prostheses. Proprioception is registered only on the subconscious level, yet it helps us to guide all of our daily physical movements.
“When patients lose this sense of proprioception, their ability to make movements becomes profoundly disrupted,” explains Lee Miller, a neuroscientist and director of the Limb Motor Control lab at Northwestern University. Miller is currently working toward using the Utah array in the primary somatosensory cortex, the area responsible for proprioception, to electrically stimulate neurons in ways that generate a sensation of limb movement.
Recently, Miller and his team used a Utah array to stimulate the primary somatosensory cortex with low levels of electricity—about 100,000 times less than the electricity used to turn on a light bulb—to create the sensation of directed movement in a single male rhesus monkey[8]. They used a separate robotic limb to push the animal’s arm left or right while tracking the electrical activity in his primary somatosensory cortex. They trained the monkey to indicate the direction in which he felt his arm moving. The team stimulated neurons that seemed to correspond to specific directional movement while nudging the monkey’s arm in a different direction. In this case, the monkey demonstrated that he felt his arm move in a direction midway between the two stimuli. According to Miller, these results indicate that electrical stimulation in the corresponding areas could generate a feeling similar to the idea that the arm is moving.
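The "midway between the two stimuli" report is consistent with a simple vector-averaging account of the percept: represent each cue (the mechanical push and the electrically evoked direction) as a unit vector and report the direction of their sum. A toy model, not the study's actual analysis:

```python
import math

def felt_direction(push_deg, stim_deg):
    # vector average: the reported direction is the direction of the
    # sum of the two cue vectors (a toy model of the "midway" percept)
    ax, ay = math.cos(math.radians(push_deg)), math.sin(math.radians(push_deg))
    bx, by = math.cos(math.radians(stim_deg)), math.sin(math.radians(stim_deg))
    return math.degrees(math.atan2(ay + by, ax + bx)) % 360.0

midway = felt_direction(0.0, 90.0)   # push right, stimulate "up"
```

Averaging on the circle rather than on raw angles matters: for cues at 350° and 10°, this model correctly reports 0°, not 180°.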
Another tack for improving the control of robotic prostheses is to recreate a sense of touch in individuals controlling those devices. “When you’re grasping an object, you need to know whether you’re touching it or not, where you’re touching it and how much pressure you’re exerting on it,” says Sliman Bensmaia, a neuroscientist at the University of Chicago.
In 2013, Bensmaia and his team were able to train rhesus monkeys to respond to direct cortical stimulation that simulated sensation in the animals’ individual fingers[9]. To do so, the researchers first mapped which neurons in the sensory cortex became active as they stimulated each finger. They then used electrical arrays to stimulate those same neurons. The monkeys indicated that they felt some kind of stimulation in the fingers when the corresponding neurons were stimulated with electricity. In further experiments, the team connected the arrays to a prosthetic finger, fitted with mechanical pressure sensors. The monkeys could detect differences in pressure just as well with the prosthetic finger as they could with their own fingers.
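Conceptually, such a pipeline maps each prosthetic finger's pressure sensor to the cortical channels found to represent that finger during the mapping step, then scales sensed pressure into stimulation amplitude. A hypothetical sketch (the channel numbers, pressure range, and current cap are invented for illustration):

```python
# Hypothetical finger-to-channel map produced by the mapping step;
# the channel numbers, pressure range and current cap are invented.
finger_channels = {"thumb": [3, 7], "index": [12, 15], "middle": [21]}

def stimulation_commands(finger, pressure_n, max_pressure=5.0, max_current_ua=80.0):
    # scale sensed pressure (newtons) linearly into a capped stimulation
    # current (microamps), delivered on every channel mapped to that finger
    amplitude = min(pressure_n / max_pressure, 1.0) * max_current_ua
    return [(channel, amplitude) for channel in finger_channels[finger]]

commands = stimulation_commands("index", 2.5)
```

The cap matters in practice: stimulation currents must stay within safe limits regardless of how hard the prosthetic finger is pressed.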
Pioneer in the field: John Donoghue, who helped adapt the Utah array for use in humans.
The creation of sensation involves electrically stimulating different parts of the brain, not passively recording from the organ, so it is still early days for this research in humans. Nonetheless, in September 2015, researchers from the US Defense Advanced Research Projects Agency (DARPA) announced that they had created a neuroprosthetic device that enabled a 28-year-old man who had become paralyzed as a result of a spinal cord injury to feel pressure on individual fingers of his prosthetic arm through sensory cortex stimulation. The work has yet to be published, but researchers say that the person was able to distinguish between fingers with near-perfect accuracy. “The connection between the brain and the arm has, until now, run in only one direction,” Justin Sanchez, the acting director of the biological technologies office at DARPA, told Nature Medicine. Because of this, a paralyzed individual who is controlling a robotic prosthetic arm usually has to look at the arm to control it and track its movements. “By completing the circuit from limb to brain and providing a sense of touch to the arm system,” Sanchez explains, “we’re getting closer to being able to restore near-natural function to users of prosthetic limbs.”
Restoring original limb movement
The ultimate goal of these technologies is to restore movement and independence in people who have become paralyzed by enabling them to move their own bodies rather than a prosthetic. To this end, Hochberg and others have been combining forces with researchers at Case Western Reserve University in Cleveland, Ohio, as part of the BrainGate2 clinical trial to try to enable a man to move his paralyzed arm through BCI and functional electrical stimulation (FES), which uses currents to stimulate nerves in the arms or legs.
Bionic limbs: The prosthetic arm used by Scheuermann. (Image: University of Pittsburgh Medical Center)
At the Society for Neuroscience conference in October 2015, the team presented preliminary work on one participant who had a severe upper-level spinal cord injury that resulted in his inability to move his arms and legs. The team implanted two electrode arrays into his motor cortex, and he has since been able to control a virtual arm. Once his BCI was connected to an FES system that generated muscle contractions in his actual arm, he was able to move either his wrist or elbow from side to side, and open and close his hand.
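A decoded single-joint command can be turned into muscle stimulation by driving an antagonist pair: positive commands stimulate the flexor, negative ones the extensor. A toy sketch of this mapping (the intensity scale and structure are illustrative assumptions, not the trial's actual FES controller):

```python
def fes_pattern(command, max_intensity=100.0):
    # command in [-1, 1] from the BCI decoder; positive flexes the joint,
    # negative extends it; only one muscle of the pair is driven at a time
    command = max(-1.0, min(1.0, command))
    return {"flexor": max(command, 0.0) * max_intensity,
            "extensor": max(-command, 0.0) * max_intensity}

pattern = fes_pattern(0.5)   # a half-strength flexion command
```

Coordinated reaching would require running such controllers for several joints at once and sequencing them properly, which hints at why multi-joint movement remains the hard next step.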
“These are very simple, very preliminary results. Where we hope to go in the future is to have him be able to combine control of these multi-joint movements to be able to do a coordinated reaching movement,” says Bolu Ajiboye, who is part of the team from Case Western. Ajiboye underscores that there is a long road ahead before the person being treated can learn to combine some of these single-joint movements to create the complicated arm motions performed routinely by healthy individuals.
To improve these technologies, researchers are continuing to tease apart the ways in which various types of sensory feedback modulate the signaling output from motor neurons. Gaunt stresses that different types of feedback can be useful to control prostheses in varying circumstances. “The goal is to combine [them] so that all of these things can work together.”