Brain-computer interface

A brain-computer interface (BCI), or direct neural interface, is a direct technological interface between a brain and a computer that requires no motor output from the user: neural impulses in the brain are intercepted and used to control an electronic device. The term is broad and loosely defined, covering many conventional and theoretical interfaces. For purposes of this term, 'brain' means the physical brain of an organic life form, and 'computer' means a mechanical or electronic processing device. These semantic distinctions are crucial when contemplating a direct brain-computer interface, as there is great debate in the philosophy of mind over whether consciousness and mind can be reduced to the physical properties of the brain. Because of cortical plasticity, the brain is likely to adapt as its user learns to operate a BCI.

Neuroprosthetics
Simple brain-computer interfaces already exist in the form of neuroprosthetics, and a great deal of neuroscience, robotics, and computer science research is currently dedicated to furthering these technologies. Recent achievements show that it is possible to implement crude brain-computer interfaces ("brain dishes") in which in vitro neuronal clusters directly control computers. Laboratories led by Andrew Schwartz (University of Pittsburgh), Richard Andersen (Caltech), Miguel Nicolelis (Duke University), and John Donoghue (Brown University) have used a variety of algorithms, including the vector sum of motor cortical neuron spiking, to decode signals recorded directly from the cortex of monkeys and use them as a BCI. This approach allowed a monkey to navigate a computer cursor on screen, and to command a robotic arm to perform simple tasks, simply by thinking about moving the cursor, without any motor output.
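The vector-sum decoding mentioned above, often called the population vector algorithm, can be sketched briefly. In this hypothetical illustration each recorded neuron is assigned a "preferred direction" with cosine tuning, and the decoded movement direction is the firing-rate-weighted sum of those preferred directions. The neuron count, tuning parameters, and noiseless rates below are invented for the sketch, not taken from the studies cited.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assign each simulated motor cortical neuron a random unit "preferred direction".
n_neurons = 100
preferred = rng.normal(size=(n_neurons, 2))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

BASELINE, DEPTH = 10.0, 8.0  # spikes/s; illustrative tuning parameters

def firing_rates(movement_dir):
    """Cosine tuning: a neuron fires most when the movement
    aligns with its preferred direction."""
    return BASELINE + DEPTH * preferred @ movement_dir

def population_vector(rates):
    """Decode the movement as the rate-weighted vector sum
    of the neurons' preferred directions."""
    weights = (rates - BASELINE) / DEPTH
    vec = weights @ preferred
    return vec / np.linalg.norm(vec)

true_dir = np.array([1.0, 0.0])
decoded = population_vector(firing_rates(true_dir))
# With enough neurons, `decoded` points close to `true_dir`.
```

With preferred directions spread roughly uniformly, the weighted sum averages out each neuron's off-axis contribution, which is why the estimate improves as more neurons are recorded.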

BCIs in monkeys


Studies that developed algorithms for reconstructing movements from the activity of motor cortex neurons date back to the 1980s. Work by groups led by Schmidt, Fetz, and Baker in the 1970s had already established that monkeys could quickly achieve voluntary control over the firing rate of individual neurons in primary motor cortex using a closed-loop operant conditioning paradigm. Phil Kennedy and colleagues built the first wireless, intracortical brain-computer interface by implanting neurotrophic cone electrodes first in monkeys and then in the brains of paralyzed patients. Real-time reconstruction of more complex motor parameters from recordings of neural ensembles has been explored by several groups, including those led by Miguel Nicolelis, John Donoghue, Andrew Schwartz, Richard Andersen, and more recently Krishna Shenoy, Nicho Hatsopoulos, Ad Aertsen, Eilon Vaadia, Lee Miller, Dawn Taylor, and Eric Leuthardt.

During the last ten years of explosive BCI development, Miguel Nicolelis has been the most prominent proponent of multiunit, multiarea recordings from neural ensembles as the technique for obtaining neuronal signals of sufficient quality to drive a BCI. After initial studies in rats during the 1990s, Nicolelis and his colleagues developed BCIs that decoded brain activity in monkeys and used it to reproduce the monkeys' movements in artificial actuators. Monkeys have advanced reaching and grasping abilities and good hand manipulation skills. Nicolelis' group first conducted experiments in owl monkeys. By 2000 they had accumulated experience implanting owl monkeys with electrode arrays in multiple brain areas, and were able to construct a BCI that reproduced monkey movements while the animals operated a joystick or reached for food (Wessberg et al., 2000). The BCI operated in real time and could control a robot. However, it was an open-loop BCI, because the monkeys received no feedback about the BCI's performance.
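The real-time predictions in such studies are typically made by linear models that map ensemble firing rates onto limb kinematics. The sketch below illustrates the idea on synthetic data; the dimensions, the Poisson-distributed rates, and the plain least-squares fit are illustrative assumptions, not the exact published method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic session: 1000 time bins, 50 recorded neurons, 2-D hand position.
T, n_neurons = 1000, 50
rates = rng.poisson(5.0, size=(T, n_neurons)).astype(float)

# Pretend the hand position is a noisy linear readout of the ensemble rates.
true_W = rng.normal(scale=0.1, size=(n_neurons, 2))
hand = rates @ true_W + rng.normal(scale=0.05, size=(T, 2))

# Fit a linear decoder by least squares on a training segment...
X_train, y_train = rates[:800], hand[:800]
W, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# ...then predict the held-out trajectory bin by bin, as a
# real-time decoder would.
pred = rates[800:] @ W
err = np.mean(np.linalg.norm(pred - hand[800:], axis=1))
```

In a closed-loop setting the same fitted weights `W` would be applied to each new bin of spike counts as it arrives, so prediction cost is a single matrix-vector product per time step.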

Other leading laboratories developing neuroprosthetic decoding algorithms include those of John Donoghue at Brown University, Andrew Schwartz at the University of Pittsburgh, and Richard Andersen at Caltech. These researchers conducted experiments in rhesus macaques, which are considered better models for human neurophysiology than owl monkeys. Although they initially could not record from as many neurons as Nicolelis and coworkers (15-30 neurons versus 50-200), they made important advances by expanding BCI capabilities. A study from the Donoghue group (Serruya et al., 2002) reported a BCI in which monkeys, without any training on the BCI, tracked visual targets on a computer screen (a closed-loop BCI). Schwartz's group created a BCI for three-dimensional tracking (Taylor et al., 2002). Andersen's group incorporated into its BMI design cognitive signals recorded in posterior parietal cortex, such as the encoding of reach targets and anticipated reward (Musallam et al., 2004). John Donoghue and Nicho Hatsopoulos also took this research into the business arena by founding Cyberkinetics, a company whose major goal is the development of practical BCIs for humans.

The group of Miguel Nicolelis also moved to experiments in rhesus monkeys. In 2003, Jose Carmena and Mikhail Lebedev in the Nicolelis lab conducted experiments in monkeys trained to reach for and grasp objects on a computer screen by manipulating a joystick (Carmena et al., 2003; Lebedev et al., 2005). Their BCI used velocity predictions to control reaching movements and simultaneously predicted hand gripping force. Reaching and grasping were performed by a robot that remained invisible to the monkeys (direct interaction with the robot has yet to be demonstrated), with feedback on the robot's performance provided by a visual display.

In addition to predicting kinematic and kinetic parameters of limb movements, BCIs that predict the electromyographic activity of muscles are being developed (Santucci et al., 2005). Such BCIs could be used in neuroprosthetic devices that restore mobility to paralyzed limbs by electrical stimulation of the muscles.

References
 * Carmena, J.M., Lebedev, M.A., Crist, R.E., O’Doherty, J.E., Santucci, D.M., Dimitrov, D.F., Patil, P.G., Henriquez, C.S., Nicolelis, M.A.L. (2003) Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biology, 1: 193-208.
 * Fetz E.E., Baker M.A. (1973) Operantly conditioned patterns of precentral unit activity and correlated responses in adjacent cells and contralateral muscles. J Neurophysiol. Mar;36(2):179-204.
 * Kennedy, P.R., Bakay R.A. (1998) Restoration of neural output from a paralyzed patient by a direct brain connection. Neuroreport. Jun 1;9(8):1707-11.
 * Lebedev, M.A., Carmena, J.M., O’Doherty, J.E., Zacksenhouse, M., Henriquez, C.S., Principe, J.C., Nicolelis, M.A.L. (2005) Cortical ensemble adaptation to represent actuators controlled by a brain machine interface. J. Neurosci. 25: 4681-4693.
 * Musallam S, Corneil BD, Greger B, Scherberger H, Andersen RA (2004) Cognitive control signals for neural prosthetics. Science 305: 258-262.
 * Santucci, D.M., Kralik, J.D., Lebedev, M.A., Nicolelis, M.A.L. (2005) Frontal and parietal cortical ensembles predict single-trial muscle activity during reaching movements. Eur. J. Neurosci., 22: 1529-1540.
 * Serruya MD, Hatsopoulos NG, Paninski L, Fellows MR, Donoghue JP (2002) Instant neural control of a movement signal. Nature 416: 141-142.
 * Taylor DM, Tillery SI, Schwartz AB (2002) Direct cortical control of 3D neuroprosthetic devices. Science 296: 1829-1832.
 * Wessberg J, Stambaugh CR, Kralik JD, Beck PD, Laubach M, Chapin JK, Kim J, Biggs SJ, Srinivasan MA, Nicolelis MA. (2000) Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature 408: 361-365.

Human BCI research
There have also been experiments in humans using modern invasive and non-invasive neuroimaging technologies as interfaces. The most commonly studied potential interface for humans is electroencephalography (EEG), mainly because of its fine temporal resolution, ease of use, portability, and low set-up cost. However, practical use of EEG as a BCI requires a great deal of user training and is highly susceptible to noise. In 2004, scientists at the Fraunhofer Society used neural networks to shift the burden of the learning phase from the user to the computer, achieving noticeable results within 30 minutes of training. Magnetoencephalography (MEG) and even functional magnetic resonance imaging (fMRI) have both been used successfully as rudimentary BCIs; in the latter case, two users scanned in real time played Pong against one another by altering their haemodynamic responses through various biofeedback techniques.
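The machine-learning approach described above can be illustrated schematically: rather than the user spending weeks learning to produce a fixed EEG signal, a classifier is trained on labeled trials of that user's own activity. Everything below is a toy sketch; the synthetic band-power "features" and the nearest-centroid classifier stand in for the actual features and neural networks used in the published work.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-trial EEG band-power features for two imagined movements.
# Motor imagery typically suppresses rhythmic power over contralateral
# sensorimotor cortex; electrode labels and all numbers are made up here.
left = rng.normal([1.0, 0.6], 0.15, size=(100, 2))    # [C3, C4] power
right = rng.normal([0.6, 1.0], 0.15, size=(100, 2))

X = np.vstack([left, right])
y = np.array([0] * 100 + [1] * 100)

# Train a nearest-centroid classifier on the labeled trials:
# the "learning" is done by the computer, not the user.
centroids = np.array([X[y == k].mean(axis=0) for k in (0, 1)])

def classify(trial):
    """Assign a trial to the class with the nearest centroid."""
    return int(np.argmin(np.linalg.norm(centroids - trial, axis=1)))

accuracy = np.mean([classify(t) == label for t, label in zip(X, y)])
```

Once the centroids are fit from a short calibration session, each new trial is classified with two distance computations, which is why such calibrated systems can become usable within minutes rather than after weeks of user training.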

Matt Nagle is one of the first people to use a direct brain-computer interface to restore functionality lost due to paralysis.

Practical BCIs
There has been great success in using cochlear implants in humans as a treatment for non-congenital deafness. There is also promising research in vision science indicating retinal implants may some day prove to be similarly successful in treating non-congenital blindness. Outside the realm of treatment, there has also been success in reconstructing an individual's view of a visual scene from the electrical impulses recorded from that person's visual cortex. One purpose of motor neuroprosthetics would be to restore independent control of the body and assistive devices to individuals with paralysis due to a variety of causes including spinal cord injury, amyotrophic lateral sclerosis, muscular dystrophy, multiple sclerosis, spinal muscular atrophy, certain cerebellar disorders and certain types of stroke.

Theme in fiction
Direct neural interface devices were a prominent feature of the popular Matrix film series, in which humanity is enslaved by artificially intelligent machines in a virtual world piped directly into people's brains by an immersive virtual-reality apparatus. Such interfaces are an extremely common element of cyberpunk fiction, often characterised as "control of hardware at the speed of thought". In the meme wars novels of John Barnes, a direct mind-computer interface permits the emergence of a hegemonic collective mind called One True.

Brain-computer interfacing is a particularly notable feature of the following science fiction and anime works:
 * Gridlinked by Neal Asher
 * the "Culture" novels and stories of Iain M. Banks
 * Eon by Greg Bear (and sequels)
 * "Learning to Be Me" (short story) by Greg Egan
 * This Alien Shore by C.S. Friedman
 * Neuromancer by William Gibson, and other Cyberpunk novels
 * The Night's Dawn Trilogy by Peter F. Hamilton
 * Altered Carbon by Richard Morgan
 * Accelerando by Charles Stross (see also exocortex)
 * Aristoi by Walter Jon Williams
 * Angelic Layer
 * The Ghost in the Shell manga and related anime
 * Neon Genesis Evangelion
 * Serial Experiments Lain