Look, No Hands! MIT Controls Robotic Aircraft Using Motion Control
If the Kinect has taught us anything, it's that gesture control is good for more than gaming. While gesture-based control systems still have a lot of growing up to do, researchers over at MIT have found a way to control drone aircraft with hand gestures.
PhD student Yale Song and his team devised a system that recognizes 24 different--and extremely precise--gestures and postures, based on the signals aircraft carrier deck crews use to direct planes before takeoff and after landing; the idea is that robotic planes could understand those same commands. The camera works a bit like the Kinect: it tracks different body and hand movements and "combines" them to recognize a command.
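The article doesn't go into the details of Song's recognition algorithm, but the basic "combine body and hand cues, then decide on a command" idea can be sketched in a few lines. The gesture names, the lookup-table classifier, and the majority-vote smoothing below are all illustrative assumptions, not the actual MIT system (which would use a trained statistical model over real camera data):

```python
from collections import Counter, deque

def classify_frame(body_pose, hand_shape):
    """Toy per-frame classifier: maps a (body_pose, hand_shape) pair to a
    gesture label. A real system would use a trained model over camera
    features; this lookup table is purely illustrative."""
    table = {
        ("arms_up", "open_palm"): "all_clear",
        ("arms_out", "open_palm"): "slow_down",
        ("arms_down", "fist"): "brakes_on",
    }
    return table.get((body_pose, hand_shape))

def recognize(frames, window=5):
    """Combine noisy per-frame labels by majority vote over a sliding
    window, emitting a command only once a label clearly dominates."""
    recent = deque(maxlen=window)
    commands = []
    for body_pose, hand_shape in frames:
        label = classify_frame(body_pose, hand_shape)
        if label:
            recent.append(label)
        if len(recent) == window:
            top, count = Counter(recent).most_common(1)[0]
            if count > window // 2:
                commands.append(top)
                recent.clear()  # wait for the next distinct gesture
    return commands
```

Smoothing over a window of frames is one simple way a system like this could avoid firing a command on a single misread frame, which matters when the per-frame classifier is only right some of the time.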
The researchers reasoned that hand gestures are among the most natural ways humans communicate with each other--so why not use them to communicate with technology, too?
Needless to say, we're not talking about big passenger planes here--at least not yet--and the technology itself is still in its infancy. As of right now, the system accurately recognizes gestures only 76 percent of the time, but Song hopes to improve that accuracy--and to make the system capable of learning new gestures and giving feedback (for instance, telling you whether or not it understood a command).
This could be pretty useful when planes are on the ground, but ground personnel would have to be really careful not to pause and scratch an itch or something...
Check out the video below to learn more about Song's system and the gestures involved.
Like this? You might also enjoy...
- Cambridge Sets Lasers to Vaporize, Removes Toner From Paper
- 'Real Steel' Comes to Life With This Motion-Controlled Boxing Robot
- Microsoft Develops a Faster Touchscreen With Only One Millisecond of Lag