One of the oddest but coolest aspects of the recent CBS show Star Trek: Picard is the hand gestures. Cristóbal Rios, the captain of a starship that Picard hires for his mission, can direct the ship just by moving his hand in the air. Now researchers at MIT have come out with a crude version of this system, one that uses sensors to control drones and other robots.
Called Conduct-a-Bot, it relies on a series of arm sensors that measure both movement and muscle tension. Together, the sensors recognize eight distinct commands a drone pilot can make by moving their arm and tensing different muscles, such as the biceps and triceps. For instance, the setup can distinguish flexing the wrist up, down, left, or right. It can also pick up a clenched fist or a stiffened upper arm.
These movements are translated into instructions that steer the drone in different ways. The gestures are universal: the system has been trained to recognize them on anyone, so a new user can simply don the sensors and start flying, with no need to calibrate the system to their particular movements first.
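To make the idea concrete, the translation step amounts to a lookup from a recognized gesture to a drone instruction. The sketch below is purely illustrative: the gesture names, command names, and the idea of a fixed table are my assumptions, not MIT's published design.

```python
# Illustrative sketch of a gesture-to-command mapping, in the spirit of
# Conduct-a-Bot. All names here are hypothetical, not MIT's actual API.

GESTURE_COMMANDS = {
    "wrist_up": "ascend",          # flex wrist up
    "wrist_down": "descend",       # flex wrist down
    "wrist_left": "turn_left",     # flex wrist left
    "wrist_right": "turn_right",   # flex wrist right
    "fist_clench": "stop",         # clenched fist
    "stiff_upper_arm": "hover",    # tensed biceps/triceps
}

def command_for(gesture: str) -> str:
    """Map a recognized gesture to a drone command, hovering by default."""
    # Defaulting to "hover" on an unrecognized gesture is a safe fallback
    # for a classifier that is right only ~85% of the time.
    return GESTURE_COMMANDS.get(gesture, "hover")

print(command_for("wrist_up"))      # ascend
print(command_for("arm_wiggle"))    # hover (unknown gesture)
```

In a real system the left-hand side would come from a classifier running on the muscle and motion sensor streams, not from clean string labels, but the mapping stage itself can be this simple.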
But we’re a long way from piloting a starship, or even, really, a drone. MIT published a video of someone using the system to navigate a Parrot Bebop 2 drone through a series of hoops. It can be done, but only with a lot of herky-jerky movements and pauses in between. It’s certainly far from what someone can do with today’s dual-stick controllers.
And accuracy has a ways to go too. MIT reports that the system correctly recognized 85% of roughly 1,500 gestures in testing. That’s impressive for a first version, but far from adequate to avoid crashes.
But controllers full of sticks, levers, and buttons can be intimidating for a lot of people. If the MIT system can be made much more sensitive and accurate over time, it might provide a new, intuitive way for more people to interact with drones and other robots.