Yes, There’s Really A Neural Interface at CES That Reads Your Brain Signals
Imagine commanding a computer or playing a game without using your fingers, voice, or eyes. It sounds like science fiction, but it’s becoming a little more real every day thanks to a handful of companies making tech that detects neural activity and converts those measurements into signals computers can read.
One of those companies — NextMind — has been shipping its version of the mind-reading technology to developers for over a year. First unveiled at CES in Las Vegas, the company’s neural interface is a black disc that reads brain waves when strapped to the back of a user’s head. The device isn’t quite ready for primetime yet, but it’s bound to make its way into consumer goods sooner rather than later.
Neural interfaces are already here
Neural interfaces have the potential to support a wide range of activities in a variety of settings. A company called Mudra, for example, has developed a band for the Apple Watch that enables users to interact with the device by moving their fingers — or simply thinking about moving them. That means someone wearing the device can navigate music or place calls without having to interrupt whatever they’re doing at the time. It also opens tremendous opportunities for making tech available to people with disabilities who have trouble with other user interfaces.
NextMind has gone all in on virtual reality and augmented reality. Tech journalist Scott Stein paired the device, which sells for $399, with an Oculus Quest. He said the experience of using only his thoughts to make aliens’ heads explode in a game was “rough, but also mesmerizing.” NextMind indicates what a user can select by making sections of the screen gently flash. The user makes their selection by “clicking” on one of those flashing sections. In this case, clicking means focusing on a section, sometimes for quite some time.
Stein calls NextMind an “imperfect attempt at creating an input,” but he also says the device was pretty good at figuring out which section of the screen he was trying to select. “Out of a field of five or so on-screen flashing ‘buttons,’ this really did know what I was looking at,” he said.
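For readers curious how a device could tell which flashing button someone is focusing on: NextMind hasn’t published its algorithm, but the flashing-selection scheme described above resembles a well-known brain-computer-interface technique in which each target flickers at its own rate and the decoder looks for that rate in the measured signal. The sketch below is purely illustrative — the sample rate, flicker frequencies, and decoder are all assumptions, not details from NextMind.

```python
import numpy as np

# Hypothetical sketch of flicker-based "brain click" decoding.
# Assumption: each on-screen button flashes at a distinct rate, and the
# decoder picks the rate most strongly present in the measured signal.

FS = 250                                 # sample rate in Hz (assumed)
DURATION = 2.0                           # seconds per decision (assumed)
BUTTON_FREQS = [8.0, 10.0, 12.0, 15.0]   # one flicker rate per button (assumed)

def decode_selection(signal, fs=FS, freqs=BUTTON_FREQS):
    """Return the index of the flicker frequency whose oscillation
    best matches the signal (a crude matched-filter decoder)."""
    t = np.arange(len(signal)) / fs
    scores = []
    for f in freqs:
        ref_sin = np.sin(2 * np.pi * f * t)
        ref_cos = np.cos(2 * np.pi * f * t)
        # power of the signal projected onto this candidate oscillation
        scores.append(np.dot(signal, ref_sin) ** 2 +
                      np.dot(signal, ref_cos) ** 2)
    return int(np.argmax(scores))

# Simulate a user focusing on the 12 Hz button: a weak 12 Hz component
# buried in noise.
rng = np.random.default_rng(0)
t = np.arange(int(FS * DURATION)) / FS
signal = 0.5 * np.sin(2 * np.pi * 12.0 * t) + rng.normal(0, 1.0, t.size)
print(decode_selection(signal))  # index 2, the 12 Hz button
```

This also hints at why Stein had to focus “sometimes for quite some time”: the weaker the signal relative to the noise, the longer a window the decoder needs before one frequency clearly wins.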
Whether you like it or not, machines that can literally read human brains are coming to consumer electronics. What’s the worst that could happen?