Brain Computer Interfaces: The Key To Self-Expression

Sophia Tang
5 min read · Sep 20, 2021

Imagine if translating our deepest thoughts and emotions into words was intuitive. Crazy, right? Well, believe it or not, the technology that will soon allow us to type our thoughts with only our minds already exists, and it’s called the Brain Computer Interface (BCI).

“If only the ability to express oneself was easy. Imagine how much fuller the world would become.” — Written by Me

A brain communicating with a computer. (Source: nature.com)

You’re probably wondering: What is a Brain Computer Interface? And, how can a piece of technology possibly decipher the mass of jumbled thoughts that travels through my mind at every moment of the day? If I guessed what you were thinking, then you have just experienced a preview of the potential of a Brain Computer Interface.

What Is a Brain Computer Interface?

In the 1970s, the original definition of a BCI was developed. It stated:

“The goal of BCI technology is to give severely paralyzed people another way to communicate, a way that does not depend on muscle control.”

Fast-forward to 2021: the Brain Computer Interface has become not only a way for paralyzed people to communicate, but a technology that can and will revolutionize human society.

In a nutshell, a Brain Computer Interface is a system that records signals from your brain and transfers them to a computer, where they are interpreted (or given meaning) by an algorithm.

A simple BCI works as follows:

1. Electrical signals are sent between neurons in the brain

The human brain consists of approximately 86 billion neurons. Each of these neurons consists of dendrites, a soma (cell body), and an axon. The brain is like the internet of the human body, where individual neurons are the devices that send and receive messages across a network. Neurons form this “network” through axon-dendrite junctions called synapses. Chemical signals, called neurotransmitters, are received by a cell’s dendrites and nudge the cell’s charge toward positive. Once that charge crosses a threshold, the neuron fires what is called an action potential: an electrical impulse that travels down the axon and triggers the release of neurotransmitters onto the dendrites of the adjacent neuron.

The structure of a neuron. (Source: Medium)
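To make that threshold idea a bit more concrete, here is a toy simulation of a single neuron, using a heavily simplified “leaky integrate-and-fire” model rather than real biology: incoming signals nudge the cell’s voltage upward, and once it crosses a threshold the cell “fires” and resets. Every number below is made up purely for illustration.

```python
import numpy as np

# Toy "leaky integrate-and-fire" neuron: incoming signals push the
# membrane voltage up; when it crosses a threshold, the cell fires
# an action potential and resets. Values are illustrative, not real units.
rng = np.random.default_rng(0)

threshold = 1.0      # voltage at which the neuron fires
reset = 0.0          # voltage right after a spike
leak = 0.95          # voltage slowly decays back toward rest
voltage = 0.0
spike_times = []

inputs = rng.random(200) * 0.12   # 200 time steps of random incoming signal

for t, stimulus in enumerate(inputs):
    voltage = voltage * leak + stimulus   # leak a little, add new input
    if voltage >= threshold:              # threshold crossed -> action potential
        spike_times.append(t)
        voltage = reset                   # cell resets and starts over

print(f"The toy neuron fired {len(spike_times)} times, at steps {spike_times}")
```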

2. A recording device (EEG, ECoG, Neural Dust, fMRI, etc.) detects the signals and sends them to an external computer

Apologies for throwing a bunch of acronyms at you, but believe me, the actual terms get really weird.

There are several types of recording devices that can record brain activity as computer-readable data. The most common non-invasive (aka no surgery) recording device is EEG, or electroencephalography (long, I know), which measures the post-synaptic potentials that occur when neurotransmitters reach a cell’s dendrites. Since EEG electrodes sit on the scalp, they can only detect the activity of large groups of neurons that are active at the same time.

Electroencephalography (EEG) (Source: imotions.com)
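To give a feel for what “computer-readable data” from an EEG looks like, here is a minimal sketch that turns a raw one-channel trace into the kind of summary numbers (power in the classic alpha and beta frequency bands) an algorithm can actually work with. The signal below is synthetic; with a real headset you would read samples from its own software instead.

```python
import numpy as np
from scipy.signal import welch

fs = 256                      # sampling rate in Hz (typical for consumer EEG)
t = np.arange(0, 4, 1 / fs)   # 4 seconds of data

# Synthetic one-channel "EEG": a 10 Hz alpha rhythm plus noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

# Estimate the power spectrum, then sum power inside each classic band.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(low, high):
    mask = (freqs >= low) & (freqs < high)
    return np.trapz(psd[mask], freqs[mask])

features = {
    "alpha (8-13 Hz)": band_power(8, 13),
    "beta (13-30 Hz)": band_power(13, 30),
}
print(features)   # numbers like these are what a BCI algorithm actually sees
```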

A more advanced recording device under development is called Neural Dust. As the name suggests, Neural Dust consists of tiny sensors that are only 3 millimeters long! These sensors are not only able to record the electrical activity of a single nerve in the nervous system, but can also stimulate activity in neurons. Don’t worry, we are far from achieving mind control…for now.

Neural Dust (Source: UC Berkeley)

Researchers predict that, in the near future, these devices will be shrunk down to a width half that of a strand of human hair! At that size, they will be able to attach directly to the axon of a neuron and record its electrical activity. Because of Neural Dust’s ability to record the activity of a single neuron in a specific area of the brain, its applications are endless.

Some incredible applications of Neural Dust include:

  • Epilepsy treatment — by sensing abnormal brain activity before a seizure and using electrical pulses to stop the seizure-inducing activity before the patient ever feels it (a simplified sketch of this idea follows this list).
  • Brain-controlled prosthetics — by placing the sensor within the motor-function area of the brain, it can record movement signals and send the interpreted commands to a prosthetic, controlling its movement as if it were part of the body.
  • Mental health treatment — by placing neural sensors in the brain, researchers can better understand the specific brain activity associated with a patient’s psychiatric illness and regulate that activity using electrical stimulation.
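
To make the epilepsy item a little more concrete, here is a heavily simplified sketch of the closed-loop idea: watch a running measure of brain activity and trigger a stimulation routine the moment it turns abnormal. Real responsive-neurostimulation systems are far more sophisticated; the threshold, the synthetic signal, and the stimulate() stand-in below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic activity readings: mostly calm, with a burst of abnormal
# high-amplitude activity injected partway through.
activity = rng.normal(0.0, 1.0, 600)
activity[400:430] += rng.normal(6.0, 1.0, 30)   # the "pre-seizure" burst

THRESHOLD = 4.0   # made-up cutoff for "abnormal" activity
WINDOW = 10       # look at a short rolling window of recent samples

def stimulate(step):
    # Stand-in for delivering a corrective electrical pulse.
    print(f"step {step}: abnormal activity detected -> stimulation triggered")

for i in range(WINDOW, len(activity)):
    recent = activity[i - WINDOW:i]
    if np.mean(np.abs(recent)) > THRESHOLD:   # rolling average crosses threshold
        stimulate(i)
        break   # in a real system, monitoring would continue afterwards
```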

3. The computer interprets the signals using a machine learning algorithm

How does a computer derive meaning from invisible signals, you might ask? Great question! BCIs use a special type of algorithm called a machine learning algorithm. Simply put, a machine learning algorithm is a trained identifier of patterns in brain activity. For example, you can train a computer to detect a blink simply by blinking several times while wearing an EEG and telling your computer that each of those actions was a blink. The computer will detect a pattern in your repetitive actions and, after enough training, will be able to identify a blink by looking at only your brainwaves!

The differences in brain waves during different events. (Source: NeuroTechX on Medium)

Machine learning algorithms can be trained to detect the brain activity associated with almost anything, including emotions, mental illnesses, and speech! Their only major limitation is that they require lots of data points for maximum accuracy.
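
Here is a rough sketch of the blink example from above, with the EEG replaced by made-up numbers so it runs anywhere: each short “recording” is reduced to a couple of simple features, labelled blink or not-blink, and an off-the-shelf classifier (scikit-learn’s logistic regression) learns the pattern.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def fake_window(blink):
    """Stand-in for a short EEG window; real data would come from a headset."""
    signal = rng.normal(0.0, 1.0, 256)
    if blink:
        signal[100:120] += 8.0          # blinks show up as a large deflection
    # Reduce the window to two simple features the classifier can learn from.
    return [signal.max() - signal.min(), np.abs(signal).mean()]

# 200 labelled examples: half blinks, half ordinary background activity.
X = np.array([fake_window(i % 2 == 0) for i in range(200)])
y = np.array([i % 2 == 0 for i in range(200)], dtype=int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print("accuracy on held-out windows:", model.score(X_test, y_test))
```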

Conclusion

So, how exactly will we be able to type our thoughts with only our minds? The short answer is: using a machine learning algorithm. But, since you’ve made it this far through my article, I assume you want to hear the long version. Here it goes! A machine learning algorithm can be trained by recording the brain waves of our thoughts as we speak. After recording a lot (and I mean a lot) of speech samples, it can start to make connections between words and the brain waves associated with them, allowing the algorithm to identify the words behind one’s thoughts. There is still no clear way to train computers to interpret humankind’s most complicated emotions; however, with how far BCIs have come, a future where self-expression and empathy are intuitive is not as far beyond the horizon as you might think.
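
The same recipe scales, at least in sketch form, from blinks to words: collect many recordings per spoken word, reduce each to a feature vector, and train a classifier over the vocabulary. The word “signatures” below are pure fiction, invented only to show the shape of the training process; real thought-to-text decoding is far harder than this.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(7)
vocabulary = ["hello", "yes", "no", "water"]

# Pretend each word produces brain activity clustered around its own
# signature feature vector (fictional, just to show the training recipe).
signatures = {word: rng.normal(0, 1, 8) for word in vocabulary}

def fake_recording(word):
    return signatures[word] + rng.normal(0, 0.3, 8)   # signature + noise

# Many labelled samples per word, then a simple nearest-neighbour classifier.
X = np.array([fake_recording(w) for w in vocabulary for _ in range(50)])
y = np.array([w for w in vocabulary for _ in range(50)])

decoder = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(decoder.predict([fake_recording("water")]))   # -> ['water']
```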


Sophia Tang

17y/o exploring her passions and sharing her journey through writing ✍🏻 talks about biotech, emerging technologies, and personal growth 🧬 | sophiaytang.com