Translating Thoughts Into Actions

Our society is one that focuses on the needs of the few rather than the needs of the many. Nowadays, people become fixated on what they cannot do instead of appreciating what they can. For a short individual, this may mean a time-consuming climb just to reach a book on the top shelf of a library. However, such frustrations seem trivial when one remembers that there are people who are unable to complete everyday tasks such as feeding themselves, typing, and walking. It is all too easy to take these “basic” abilities for granted and forget how vital it is to be able to engage with the surrounding environment.

Unfortunately, many people suffer from serious neuromuscular impairments that limit the body’s ability to send electrophysiological signals from the brain to various limbs and organs, and vice versa (Leuthardt et al., 2009). Fortunately, devices known as motor neuroprosthetics, or brain-computer interfaces (BCIs), have been developed to help these individuals perform motor tasks independently by supporting the motor functions of the nervous system (Munih and Ichie, 2001). There are many types of BCIs, including electroencephalography (EEG)-based systems, electrocorticography (ECoG)-based systems, and single-unit recording systems (Leuthardt et al., 2006). All BCIs share the same crucial components: signal acquisition, signal processing, and device output (Schwartz, 2004).
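To make that shared structure concrete, here is a minimal sketch in Python of how the three stages could be organized in software. The function names and loop are purely illustrative assumptions and are not an API described in the cited papers.

```python
# Hypothetical skeleton of the three BCI stages described above
# (illustrative only; not taken from the cited literature).

def acquire_signal():
    """Signal acquisition: read a window of raw samples from the electrodes."""
    raise NotImplementedError  # hardware-specific

def process_signal(raw_window):
    """Signal processing: decode the electrophysiological activity
    into a device command (e.g. a cursor velocity)."""
    raise NotImplementedError  # decoder-specific

def drive_device(command):
    """Device output: transmit the command to the motor neuroprosthetic."""
    raise NotImplementedError  # device-specific

def bci_step():
    """One pass through the closed loop shared by all BCIs."""
    raw = acquire_signal()
    command = process_signal(raw)
    drive_device(command)
```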

Figure 1: Each system is classified as invasive or non-invasive, together with the approximate size of the recording region required for proper signal recording and the area that is recorded. EEG signals are recorded over large areas of the scalp, so no signal illustration is shown; ECoG recordings are made over a network of neurons, and single-unit systems record from individual neurons (Leuthardt et al., 2006).

Signal acquisition involves the real-time recording of the brain’s electrophysiological signals via electrodes (Leuthardt et al., 2006). As seen in Figure 1, in EEG-based systems the electrodes are typically placed on the scalp above the sensorimotor cortex, allowing electrical signals (specifically lower-frequency sensorimotor rhythms) to be recorded over a broad cortical region (Schwartz, 2004). Conversely, ECoG-based systems involve placing electrodes directly on the surface of the cerebral cortex, providing better spatial and temporal resolution than EEG systems and allowing a broader range of sensorimotor rhythms, including higher-frequency rhythms, to be detected (Leuthardt et al., 2006). Single-unit recording systems possess the highest spatial and temporal resolution and involve placing electrodes within the cerebral cortex, which allows the recording of action potentials from individual neurons shown to be involved in a specific motor activity (Leuthardt et al., 2009). Monitoring the activity of 50–200 individual neurons during a repeated motor task provides an accurate prediction of which neurons are involved, allowing researchers to determine where electrodes should be placed.

The signal-processing component involves decoding the brain’s activity, using mathematical algorithms and statistical analyses to correlate and convert the electrophysiological signals into useful device commands (Leuthardt et al., 2006). The command is then wirelessly transmitted to the neighboring motor neuroprosthetic, which carries it out and performs the physical task.
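As a rough illustration of the signal-processing stage, the sketch below band-pass filters a synthetic “EEG” channel around the 8–12 Hz sensorimotor (mu) rhythm and maps its power to a one-dimensional cursor command with a toy linear decoder. The sampling rate, band edges, decoder weights, and function names are assumptions chosen for illustration; real systems fit their decoders to recorded neural data rather than using fixed numbers like these.

```python
# Minimal sketch of a BCI signal-processing stage (illustrative only).
# The data are synthetic; band edges, weights, and names are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz

def bandpass(signal, low, high, fs=FS, order=4):
    """Band-pass filter to isolate a sensorimotor rhythm (e.g. the 8-12 Hz mu band)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def band_power(signal):
    """Average power of the filtered signal over the window."""
    return np.mean(signal ** 2)

def decode_command(power, weight=-0.5, bias=1.0):
    """Toy linear decoder: maps mu-band power to a 1-D cursor velocity.
    Real decoders estimate these parameters from recorded data."""
    return weight * power + bias

# Example: one second of synthetic "EEG" (a 10 Hz rhythm plus noise).
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(FS)

mu = bandpass(eeg, 8, 12)                   # feature extraction
velocity = decode_command(band_power(mu))   # feature -> device command
print(f"decoded cursor velocity: {velocity:+.2f}")
```

In a deployed system, the decoded command would then be transmitted to the neuroprosthetic, as described above.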


Figure 2: Illustration of one of the motor neuroprosthetics currently being used by patients. In this image, the motor neuroprosthetic allows the user to control a robotic arm with their thoughts (HowStuffWorks, 2007).

For a long time, devices such as motor neuroprosthetics were inconceivable. Nowadays, motor neuroprosthetics can support both conscious and unconscious motor functions of the nervous system (Leuthardt et al., 2006). In fact, users are now able to operate electrical devices, robots, and prosthetics to feed themselves, grasp and move objects (as seen in Figure 2), move cursors, type on keyboards, or surf the web with nothing but their thoughts (Leuthardt et al., 2006; Velliste et al., 2008). Thanks to advancements in nanotechnology, biomedical engineering, and robotics, controlling robots with one’s thoughts has made the successful transition from fiction to reality. This allows a level of patient independence that was not possible before, enabling the once impaired to translate their thoughts into actions.

 

References:

HowStuffWorks, 2007. How Brain-computer Interfaces Work [images online]. Available at <http://computer.howstuffworks.com/brain-computer-interface.htm>.

Leuthardt, E.C., Schalk, G., Moran, D. and Ojemann, J.G., 2006. The emerging world of motor neuroprosthetics: a neurosurgical perspective. Neurosurgery, [online] 59(1), pp.1–14.

Leuthardt, E.C., Schalk, G., Roland, J. and Moran, D.W., 2009. Evolution of brain-computer interfaces: going beyond classic motor physiology. Neurosurgical Focus, 27(1), pp.1–21.

Munih, M. and Ichie, M., 2001. Current status and future prospects for upper and lower extremity motor system neuroprostheses. Neuromodulation, [online] 4(4), pp.176–86.

Schwartz, A.B., 2004. Cortical neural prosthetics. Annual Review of Neuroscience, [online] 27, pp.487–507.

Velliste, M., Perel, S., Spalding, M.C., Whitford, A.S. and Schwartz, A.B., 2008. Cortical control of a prosthetic arm for self-feeding. Nature, [online] 453(7198), pp.1098–1101.