Gesture controls coming soon to your phone or tablet
BARCELONA—Gesture control technology has already found a place in TVs, in gaming systems like Microsoft Kinect, and in at least one free-standing interface from Leap Motion, which lets you control your PC with hand movements.
The next frontier is mobile devices, and several companies I talked to here are pitching their own solutions to device makers.
Most of these companies have no products yet, only prototypes. But device makers are very interested in letting us control their hardware with hand gestures that mimic the touch gestures we already use to direct the operating systems and apps on our devices.
I saw gesture technologies here in Barcelona that use either ultrasound, infrared light, or a camera to detect the gestures of the user. Let's start with ultrasound.
Recognizing gestures with sound
The Norwegian company Elliptic Labs showed off its ultrasound-based gesture tech at MWC.
The prototype tablet it demonstrated for me had a series of small holes around the edges of the screen that emit ultrasound bursts.
These bursts bounce off the user's hands and return to receiving sensors on the device. This action tells the device what gestures the user is making in front of, or off to the sides of, the screen.
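To give a rough sense of the idea, here's a minimal sketch of echo-ranging: distance comes from the round-trip time of a burst, and a swipe can be inferred from how those distances change at receivers on opposite edges of the screen. The numbers and the two-receiver layout are illustrative assumptions, not details of Elliptic Labs' system.

```python
# Illustrative sketch of ultrasound echo-ranging for gesture detection.
# Values and receiver layout are assumptions, not Elliptic Labs' design.

SPEED_OF_SOUND = 343.0  # meters per second, in air at roughly 20 degrees C

def echo_distance(round_trip_seconds):
    """Distance to a reflecting hand, from a burst's round-trip time."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

def infer_swipe(samples):
    """Classify a swipe from per-frame hand distances (left, right)
    measured at two receivers on opposite edges of the screen."""
    left_trend = samples[-1][0] - samples[0][0]
    right_trend = samples[-1][1] - samples[0][1]
    if left_trend < 0 < right_trend:
        return "swipe-left"   # hand approaching the left receiver
    if right_trend < 0 < left_trend:
        return "swipe-right"  # hand approaching the right receiver
    return "none"

# A hand sweeping from the right edge toward the left edge:
frames = [(0.30, 0.10), (0.20, 0.20), (0.10, 0.30)]
print(infer_swipe(frames))  # swipe-left
```

A real implementation would work with raw echo waveforms and far noisier timing, but the geometry is the same: closer echoes return sooner, and the trend across receivers reveals the direction of motion.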
The company hopes to license its “Windows 8 Gesture Suite” to Windows-based tablet makers. The Suite lets the tablet recognize a series of gestures that are based on the touch gestures already understood by the OS, like the oft-used side-swipe movements.
Gesture for the camera
The Israeli company PointGrab has a somewhat simpler way of enabling gestures.
The company's technology is just a piece of software with algorithms that use the phone or tablet's existing camera to detect hand motions, as shown in the image above.
CEO Haim Perski points out that this approach is easier for manufacturers to adopt than other gesture technologies, because there's no additional hardware that a device maker has to design into the device.
Perski demonstrated the tech by using it to control a camera app that his company developed. The app, called CamMe, lets you control the camera on your phone or tablet hands-free by making a hand-closing gesture in front of the camera. The app then waits a few seconds and takes the shot.
Motion and light
The Swedish company Neonode takes yet another approach to the problem.
Neonode's idea is to let the user control the action on the screen by gesturing on the table around a smartphone or on the borders of the screen on a tablet.
The company showed me a smartphone outfitted with a plastic sleeve with tiny holes around the sides that emit infrared light. The pulses of light bounce off the user's fingers, letting the device detect motion.
The app in the demo was a set of drums on the screen that you could play by tapping on the tabletop near the corresponding drums and cymbals on the screen. It's easy to see how the technology could be applied to game control.
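Mechanically, each emitter/sensor pair along an edge covers a band of the surface, and an interrupted beam tells the software where the finger landed. Here's a minimal sketch of that mapping; the beam count and drum names are invented for illustration and don't reflect Neonode's actual frame layout.

```python
# Illustrative sketch of mapping an interrupted infrared beam to an
# on-screen drum. Beam layout and drum names are assumptions, not
# Neonode's actual design.

# Each emitter/sensor pair along one edge covers a horizontal band.
BEAM_ZONES = ["hi-hat", "snare", "kick", "cymbal"]

def struck_drum(blocked_beams):
    """Return the drum whose beam the finger interrupted, if any."""
    for index, blocked in enumerate(blocked_beams):
        if blocked:
            return BEAM_ZONES[index]
    return None  # no beam interrupted, no tap

# Finger lands in the band covered by the third sensor pair:
print(struck_drum([False, False, True, False]))  # kick
```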
Neonode is also working with some large automakers on infrared-based gesture control for drivers.
The company showed me a steering-wheel prototype with rows of infrared light sensors over which the driver could move his fingers to control things like audio volume, heating and air conditioning, headlight brightness, and even a telephone dialer. All of these controls show up on a screen built into the windshield.
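A control like the volume slider can be sketched the same way: the finger's position along the row of infrared sensors maps to a level. The sensor count and volume range below are illustrative assumptions, not details of the prototype.

```python
# Illustrative sketch of a steering-wheel IR slider: the finger's position
# along a row of sensors sets the audio volume. Sensor count and volume
# range are assumptions for illustration only.

NUM_SENSORS = 8
MAX_VOLUME = 100

def volume_from_sensors(readings):
    """Map the index of the covered sensor to a 0-100 volume level."""
    try:
        covered = readings.index(True)
    except ValueError:
        return None  # no finger over the sensor row
    return round(covered * MAX_VOLUME / (NUM_SENSORS - 1))

# Finger resting over the sixth sensor in the row:
print(volume_from_sensors([False] * 5 + [True] + [False] * 2))  # 71
```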
If operating the stereo and answering phone calls in the car can be done without taking one's hands off the steering wheel, it could truly make driving in today's connected cars a lot safer.
When will gesture control arrive?
It's hard to say exactly how mature and stable these gesture-control technologies will be in real-life products. But the demos I saw here seemed to work well, and the benefit to the user is clear.
I'd wager that gesture control will show up in tablets first. Gestures are a natural fit for tablets: we like to watch video and play games on their larger screens, and both look a lot better when gesture controls let us keep our greasy mitts, and the fingerprints they leave, off the glass.
By Mobile World Congress 2014, a handful of phones and tablets will very likely have shipped with some type of gesture control. Ultrabooks and other PCs with built-in gesture control should arrive quickly, too.
Gesture control will take a little longer to show up in cars because automakers tend to move more slowly in building in new tech. But in-car gesture control seems like a winner of an idea. Watch for it on show floors around 2015 or 2016.