CTRL-labs is Building a Non-Invasive, Unidirectional Neural Interface

DARPA. Neuralink. Kernel. These are just a few of the organizations betting heavily on the invention of a noninvasive neural interface. Another company, CTRL-labs, is working toward a similar-sounding goal. NYC-based CTRL-labs was founded in 2015 by Thomas Reardon, Patrick Kaifosh, and Tim Machado, all graduates of Columbia's neuroscience PhD program.

The company's mission is to develop noninvasive neural interface technology that lets humans interact with many kinds of computing machines and devices. CTRL-labs has raised three rounds of venture capital totaling USD 67 million since its founding. The product it demonstrates publicly is CTRL-kit, its developer kit and SDK.

During demonstrations, CTRL-kit decodes the neural signals produced when a user contracts the muscles of the arm, hand, or fingers. Using surface electromyography (sEMG), the system in effect "encodes the intention" of the user, creating a link between human and machine through a noninvasive interface.
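To make the idea concrete, here is a minimal sketch of one classic sEMG processing step: rectify the raw signal, smooth it into an "envelope," and threshold it to detect muscle activation. This is a textbook illustration only, not CTRL-labs' decoding pipeline; the sample values and threshold are invented for the example.

```typescript
/** Exponential moving average used as a simple low-pass filter over the rectified signal. */
function envelope(samples: number[], alpha = 0.05): number[] {
  const out: number[] = [];
  let smoothed = 0;
  for (const s of samples) {
    const rectified = Math.abs(s);                    // full-wave rectification
    smoothed = alpha * rectified + (1 - alpha) * smoothed;
    out.push(smoothed);
  }
  return out;
}

/** Flags each sample where the smoothed envelope exceeds a fixed threshold. */
function detectActivation(samples: number[], threshold = 0.2): boolean[] {
  return envelope(samples).map((e) => e > threshold);
}

// Illustrative usage with synthetic data: a quiet baseline followed by a burst of activity.
const quiet = Array.from({ length: 100 }, () => (Math.random() - 0.5) * 0.05);
const burst = Array.from({ length: 100 }, () => (Math.random() - 0.5) * 2.0);
const active = detectActivation([...quiet, ...burst]);
console.log("activation detected:", active.some(Boolean));
```

Real systems go much further, using many electrodes and machine learning to estimate which motor neurons are firing, but the rectify-smooth-threshold pattern is the standard starting point for turning raw muscle activity into a usable control signal.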

Here’s a demonstration by CEO Thomas Reardon:

Currently, the highest-resolution neural interfaces require that a patient undergo a craniotomy so microelectrodes can be placed directly into brain tissue. Research into effective neural interfaces has been performed on animals and on individuals with brain and spinal cord damage, but the risks associated with surgery make these procedures less than ideal. To match the performance of current microelectrode technology, DARPA's proposed nonsurgical neural interface must offer high spatiotemporal resolution and low latency. DARPA also requires that the noninvasive interface support both neural recording and neural stimulation.

That is the key difference between what DARPA is trying to create and what CTRL-labs is pursuing. DARPA's recent public request for proposals calls for a bidirectional neural interface, one that can both read and write, within three to four years, the agency's average project timeline. Its goal is to help the warfighter of the near future ward off dangerous threats, including lethal autonomous weapons systems (LAWS) controlled by AI-human hybrids.

CTRL-labs is not interested in creating a neural interface that can read and write, only one that can read. Its unidirectional, noninvasive interface is meant to give the mind a way to control computing interfaces and IoT devices. (Image courtesy of Lux Labs.)

CTRL-labs Acquired Myo Patents from North

CTRL-labs recently announced that it acquired the patents behind Myo, a gesture-and-motion-control armband created by North, a startup funded to the tune of USD 170 million in venture capital from investors including Amazon. When it was on the market, Myo retailed for USD 199. The armband enabled amputees to control a prosthetic hand and let surgeons navigate between screens during complicated surgery. In the fall of 2018, North pulled the armband from the market to focus its internal resources on a new product: Focals, a pair of holographic smart glasses. Focals cost USD 999 and come with an extra piece of hardware called Loop, a controller that fits around the user's finger like a ring.

Pictured here is North's second product, the USD 999 holographic Focals. Focals have a built-in display that shows information from your phone, such as weather and messages; you can also request information from Amazon's assistant Alexa or order an Uber. The glasses connect via Bluetooth to Android and iOS devices and are controlled with the Loop ring, which you click to interact with the display. North still provides customer care to customers who bought the Myo armband. (Image courtesy of North.)

CTRL-labs has released CTRL-kit, the SDK it has been working on for quite some time, and developers can now get hold of the technology to create applications with gesture- and motion-controlled user experiences. The noninvasive neural interface is intended to offer single-neuron resolution, and CTRL-kit allows Unity and JavaScript developers to "translate electrical activity from a person's muscles into digital control schemes." The kit comes with a wireless electromyography armband and provides access to joint APIs.
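For a sense of what building against such a kit might look like, here is a hypothetical sketch of an application consuming an EMG armband SDK. The names (ArmbandClient, "jointAngles", "gesture") are placeholders invented for illustration; they are not CTRL-kit's actual API.

```typescript
// Hypothetical types and interface for an EMG armband client; not CTRL-kit's real API.
type JointAngles = { wrist: number; fingers: number[] }; // degrees, one entry per finger

interface ArmbandClient {
  connect(): Promise<void>;
  on(event: "jointAngles", handler: (angles: JointAngles) => void): void;
  on(event: "gesture", handler: (name: string) => void): void;
}

async function run(client: ArmbandClient): Promise<void> {
  await client.connect();

  // Drive a cursor or an in-game hand from continuously decoded joint angles.
  client.on("jointAngles", (angles) => {
    console.log(`wrist angle: ${angles.wrist.toFixed(1)}°`);
  });

  // Map discrete decoded gestures to application commands.
  client.on("gesture", (name) => {
    if (name === "pinch") console.log("select");
    if (name === "fist") console.log("grab");
  });
}
```

The general pattern, a streaming connection plus event handlers for continuous pose data and discrete gestures, is common across motion-control SDKs, which is why the same application logic could plausibly sit behind a Unity script or a JavaScript web app.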

To increase the number of developers working with the technology, CTRL-labs joined the Khronos Group's OpenXR working group to help establish universal standards for integrating EMG-based neural interfaces with XR technologies such as augmented and virtual reality.

Bottom Line

As more companies and organizations enter the noninvasive neural interface space, expect major announcements from leaders like Neuralink. The Musk-founded company has alerted members of the press and public that it has big news coming this Tuesday.

Neuralink is developing ultra-high-bandwidth brain-machine interfaces (BMIs) to connect humans and computers. The announcement is likely to be a small device that helps treat or monitor the effects of serious brain injuries.

CTRL-labs is part of a growing engineering movement to merge the brain more directly with computing resources than it is today. With our smartphones on 24/7 for reference, guidance and entertainment, we can already think of ourselves as cyborgs with non-contiguous parts; perhaps in some way we are indeed part of a cybernetic collective. The pursuit of a working noninvasive bidirectional neural interface, the kind that could make an AI-human hybrid possible, poses hundreds of challenges.

It is nearly irresistible to speculate on the engineering challenges and outcomes. How does one account for minute differences in brain chemistry and neuronal signals? Can a universal directory of commands be achieved if users are engaged in a form of digital information telekinesis? With DeepMind's AlphaStar playing StarCraft, and Pluribus, the AI from Facebook and Carnegie Mellon, beating the world's best poker players, AI continues its slow and steady march of beating humans at their own games.