Facebook Finally Explains Its Mysterious Wrist Wearable
But Facebook has visions for this wrist tech beyond AR and VR, Bosworth says. “If you really had access to an interface that allowed you to type or use a mouse—without having to physically type or use a mouse, you could use this all over the place.” The keyboard is a prime example, he says; this wrist computer is just another means of intentional input, except you can carry it with you everywhere.
Bosworth also suggested the kitchen microwave as a use case—while clarifying that Facebook is not, in fact, building a microwave. Home appliance interfaces are all different, so why not program a device like this to understand, simply, when you want to cook something for 10 minutes on medium power?
In the virtual demo Facebook gave earlier this week, a gamer was shown wearing the wrist device and controlling a character in a rudimentary video game on a flat screen, without moving his fingers at all. These kinds of demos tend to (pardon the pun) gesture toward mind-reading technology, which Bosworth insisted this is not. In this case, he said, the brain is generating signals identical to the ones that would make the thumb move, but the thumb isn’t moving. The device is recording an expressed intention to move the thumb. “We don’t know what’s happening in the brain, which is full of thoughts, ideas, and notions. We don’t know what happens until someone sends a signal down the wire.”
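To make that idea concrete, here is a minimal, purely illustrative sketch of what "recording an expressed intention" could look like in code: a toy detector that flags a thumb press whenever the muscle signal at the wrist crosses a threshold, even if the thumb never visibly moves. The sample rate, window size, and threshold are invented for illustration and are not Facebook's actual pipeline.

```python
# Illustrative sketch only: a toy "intent detector" in the spirit of what
# Bosworth describes. Real wrist EMG decoding is far more sophisticated;
# the signal, threshold, and window size here are hypothetical.
import numpy as np

SAMPLE_RATE_HZ = 1000   # hypothetical sensor sample rate
WINDOW_MS = 50          # hypothetical analysis window
THRESHOLD = 0.3         # hypothetical activation threshold (arbitrary units)

def detect_intent(emg_samples: np.ndarray) -> bool:
    """Return True if the rectified, averaged EMG energy in the most recent
    window exceeds the threshold -- i.e., the motor nerves fired as if to
    press, whether or not the thumb actually moved."""
    window = emg_samples[-int(SAMPLE_RATE_HZ * WINDOW_MS / 1000):]
    energy = np.mean(np.abs(window))
    return energy > THRESHOLD

# Simulated signal: quiet rest, then a burst of motor-neuron activity.
rest = 0.05 * np.random.randn(500)
burst = 0.05 * np.random.randn(100) + 0.5
print(detect_intent(rest))                            # False: no intended press
print(detect_intent(np.concatenate([rest, burst])))   # True: intent detected
```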
Bosworth also emphasized that this wrist wearable is different from the invasive implants used in a 2019 brain-computer interface study that Facebook worked on with the University of California, San Francisco, and different from Elon Musk’s Neuralink, a wireless implant that could theoretically allow people to send neuroelectrical signals from their brains directly to digital devices. In other words, Facebook isn’t reading our minds, even if it already knows a heck of a lot about what’s going on in our heads.
Researchers say there’s still a lot of work to be done before EMG (electromyography) sensors can serve as reliable virtual input devices. Precision is a big challenge. Chris Harrison, the director of the Future Interfaces Group in the Human-Computer Interaction Lab at Carnegie Mellon University, points out that each person’s nerves are a little different, as are the shapes of our arms and wrists. “There’s always a calibration process that has to happen with any muscle-sensing system or BCI system. It really depends on where the computing intelligence is,” Harrison says.
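As a rough illustration of why that per-user calibration matters, the sketch below trains a simple gesture decoder on synthetic "calibration session" data for one user, then shows how the same decoder degrades on a second user whose signals are shifted by anatomical differences. Everything here, including the feature model and the scikit-learn classifier, is an assumption made for illustration, not anyone's actual system.

```python
# Illustrative sketch only: per-user calibration of an EMG gesture decoder.
# All data and features are synthetic; the "user offset" stands in for
# anatomical differences in nerves, arms, and wrists.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def calibration_session(user_offset: float, n_samples: int = 200):
    """Simulate labeled EMG feature vectors gathered while a user performs
    prompted gestures (0 = rest, 1 = pinch)."""
    labels = rng.integers(0, 2, n_samples)
    features = rng.normal(loc=labels[:, None] * 1.0 + user_offset,
                          scale=0.5, size=(n_samples, 4))
    return features, labels

# Calibrate a decoder for one specific wearer.
X, y = calibration_session(user_offset=0.7)
decoder = LogisticRegression().fit(X, y)

# Applying the same decoder to a different wearer (different offset) fails,
# which is why each new user typically needs their own calibration pass.
X_other, y_other = calibration_session(user_offset=-0.6)
print("same-user accuracy:  ", decoder.score(X, y))
print("other-user accuracy: ", decoder.score(X_other, y_other))
```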