
What is a brain-computer interface? Everything you need to know about BCIs, neural interfaces and the future of mind-reading computers

Systems that allow humans to control or communicate with technology using only the electrical signals in their brains or muscles are fast becoming mainstream. Here’s what you need to know.

What is a brain-computer interface? It can’t be what it sounds like, surely?
Yep, brain-computer interfaces (BCIs) are precisely what they sound like — systems that connect the human brain to external technology.

It all sounds a bit sci-fi. Brain-computer interfaces aren’t really something that people are using now, are they?
People are indeed using BCIs today — all around you. At their most simple, a brain-computer interface can be used as a neuroprosthesis — that is, a piece of hardware that can replace or augment nerves that aren’t working properly. The most commonly used neuroprostheses are cochlear implants, which help people with damage to the internal anatomy of the ear to hear. Neuroprostheses to help replace damaged optic nerve function are less common, but a number of companies are developing them, and we’re likely to see widespread uptake of such devices in the coming years.

So why are brain-computer interfaces described as mind-reading technology?
That’s where this technology is heading. There are systems, currently being piloted, that can translate your brain activity — those electrical impulses — into signals that software can understand. That means your brain activity can be measured and interpreted: real-life mind-reading, of a sort. Or you can use your brain activity to control a remote device.

When we think, thoughts are transmitted within our brain and down into our body as a series of electrical impulses. Picking up such signals is nothing new: doctors already monitor electrical activity in the brain using EEG (electroencephalography) and in the muscles using EMG (electromyography), diagnosing diseases and other nerve problems by looking for too much, too little or unexpected electrical activity in a patient’s nerves.

Now, however, researchers and companies are looking at whether those electrical impulses could be decoded to give an insight into a person’s thoughts.
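To make that idea concrete, here is a minimal, hypothetical sketch in Python (using NumPy and SciPy) of the kind of pre-processing step many decoding pipelines start with: band-pass filtering a noisy signal to isolate the 8-12 Hz rhythm that motor-imagery BCIs often monitor. The sampling rate, the synthetic signal and the band edges are illustrative assumptions, not details taken from any particular product.

import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical parameters: a 250 Hz sampling rate and 4 seconds of one EEG channel.
FS = 250
t = np.arange(0, 4, 1 / FS)

# Synthetic stand-in for a raw scalp recording: a 10 Hz rhythm buried in
# broadband noise (real data would come from an EEG amplifier).
rng = np.random.default_rng(0)
raw = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 5, t.size)

def bandpass(signal, low, high, fs, order=4):
    # Butterworth band-pass filter, applied forwards and backwards
    # so the filtered signal is not shifted in time.
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)

# Isolate the 8-12 Hz band and compute its power, a crude "feature"
# describing how strong that rhythm currently is.
mu_band = bandpass(raw, 8, 12, FS)
band_power = np.mean(mu_band ** 2)
print(f"Estimated 8-12 Hz band power: {band_power:.2f}")

A feature like that band power is the sort of number a decoder can then map to an intention, which is what the rest of this piece is about.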

Can BCIs read minds? Would they be able to tell what I’m thinking right now?
At present, no. BCIs can’t read your thoughts precisely enough to know what you’re thinking at any given moment. Currently, they’re more about picking up emotional states or which movements you intend to make. A BCI could pick up when someone is thinking ‘yes’ or ‘no’, but detecting more specific thoughts, like knowing you fancy a cheese sandwich right now or that your boss has been really annoying you, is beyond the scope of most brain-computer interfaces.

OK, so give me an example of how BCIs are used.
A lot of the interest in BCIs comes from medicine. BCIs could potentially offer a way for people with nerve damage to recover lost function. For example, in some spinal injuries, the electrical connection between the brain and the muscles in the limbs has been broken, leaving people unable to move their arms or legs. BCIs could potentially help with such injuries either by passing the electrical signals on to the muscles, bypassing the broken connection and allowing people to move again, or by helping patients use their thoughts to control robotics or prosthetic limbs that make the movements for them.

They could also help people with conditions such as locked-in syndrome, who can’t speak or move but don’t have any cognitive problems, to make their wants and needs known.

What about the military and BCIs?
Like many new technologies, BCIs have attracted interest from the military, and US military emerging technology agency DARPA is investing tens of millions of dollars in developing a brain-computer interface for use by soldiers.

More broadly, it’s easy to see the appeal of BCIs for the military: soldiers in the field could patch in teams back at HQ for extra intelligence, for example, and communicate with each other without making a sound. Equally, there are darker uses that the military could put BCIs to, such as interrogation and espionage.

What about Facebook and BCIs?  
Facebook has been championing the use of BCIs and recently purchased a BCI company, CTRL-labs, for a reported $1bn. Facebook is looking at BCIs from two different perspectives: it’s working with researchers to translate thoughts to speech, and its CTRL-labs acquisition could help interpret what movements someone wants to make from their brain signals alone. The common thread between the two is developing the next hardware interface.

Facebook is already preparing for the way we interface with our devices to change. In the same way we’ve moved from keyboard to mouse to touchscreen and most recently to voice as a way of controlling technology around us, Facebook is betting that the next big interface will be our thoughts. Rather than type your next status update, you could think it; rather than touch a screen to toggle between windows, you could simply move your hands in the air.

I’m not sure I’m willing to have a chip put in my brain just to type a status update.
You may not need to: not all BCI systems require a direct interface to read your brain activity.

There are currently two approaches to BCIs: invasive and non-invasive. Invasive systems have hardware that’s in contact with the brain; non-invasive systems typically pick up the brain’s signals from the scalp, using head-worn sensors.

The two approaches each have their own benefits and disadvantages. With invasive BCI systems, because the electrode arrays are touching the brain, they can gather much more fine-grained and accurate signals. However, they involve brain surgery, and the brain isn’t always too happy about having electrode arrays attached to it: it reacts with a process called glial scarring, which in turn can make it harder for the array to pick up signals. Due to the risks involved, invasive systems are usually reserved for medical applications.

Non-invasive systems, however, are more consumer friendly, as there’s no surgery required: such systems record electrical impulses through the skin, either via sensor-equipped caps worn on the head or similar hardware worn on the wrist like a bracelet. It’s likely to be that in-your-face (or on-your-head) nature of the hardware that holds back adoption: early adopters may be happy to sport large and obvious caps, but most consumers won’t be keen to wear an electrode-studded hat that reads their brain waves.

There are, however, efforts to build less intrusive non-invasive systems: DARPA, for example, is funding research into non-surgical BCIs and one day the necessary hardware could be small enough to be inhaled or injected.

Why are BCIs becoming a thing now?
Researchers have been interested in the potential of BCIs for decades, but the technology has come on at a far faster pace than many had predicted, thanks largely to better artificial intelligence and machine-learning software. As such systems have become more sophisticated, they’ve been able to better interpret the signals coming from the brain, separate the signals from the noise, and correlate the brain’s electrical impulses with actual thoughts.
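As an illustration of that last step, here is a hedged sketch in Python (using scikit-learn) of how a simple decoder might learn to tell two imagined movements apart from band-power features. The numbers are randomly generated stand-ins; in a real system each row would be computed from filtered brain signals for one trial, and the two classes, ‘left’ and ‘right’, are assumptions made purely for the example.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic band-power features for two imagined movements ("left" vs "right").
n_trials = 200
left = rng.normal(loc=[1.0, 2.0], scale=0.5, size=(n_trials, 2))
right = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(n_trials, 2))
X = np.vstack([left, right])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = left, 1 = right

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear discriminant analysis: a simple, widely used decoder in BCI research.
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")

# Decoding a single new trial:
print("Predicted:", "left" if clf.predict([[1.1, 2.1]])[0] == 0 else "right")

The machine-learning part is not exotic; the hard problems are getting clean signals out of the head in the first place and collecting enough labelled examples of what a given person’s brain activity looks like when they intend a particular action.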

Should I worry about people reading my thoughts without my permission? What about mind control?
On a practical level, most BCIs are only unidirectional — that is, they can read thoughts, but can’t put any ideas into users’ minds. That said, experimental work is already being undertaken around how people can communicate through BCIs: one recent project from the University of Washington allowed three people to collaborate on a Tetris-like game using BCIs.

The pace of technology development being what it is, bidirectional interfaces will likely become more common before too long, especially if Elon Musk’s BCI outfit Neuralink has anything to do with it.

What is Neuralink? 
Elon Musk galvanised interest in BCIs when he launched Neuralink. As you’d expect from anything run by Musk, there’s an eye-watering level of both ambition and secrecy. The company’s website and Twitter feed revealed very little about what it was planning, although Musk occasionally shared hints, suggesting it was working on brain implants in the form of ‘neural lace’, a mesh of electrodes that would sit on the surface of the brain. The first serious information on Neuralink’s technology came with a presentation earlier this year, showing off a new array that can be implanted into the brain’s cortex by surgical robots.

Like a lot of BCIs, Neuralink’s was framed initially as a way to help people with neurological disorders, but Musk is looking further out, claiming that Neuralink could be used to allow humans a direct interface with artificial intelligence, so that humans are not eventually outpaced by AI. It might be that the only way to stop ourselves becoming outclassed by machines is to link up with them — if we can’t beat them, Musk’s thinking goes, we may have to join them.

 

from: https://www.zdnet.com/article/what-is-bci-everything-you-need-to-know-about-brain-computer-interfaces-and-the-future-of-mind-reading-computers/

see also: https://www.zdnet.com/article/musks-neuralink-uses-brain-threads-to-try-and-read-your-mind/