Google Open Project beams Android apps to any Net-connected display
Forget Miracast and WiDi: Google wants to transform any web-connected screen into a public display for your smartphone or tablet. The Google Research blog recently published information about Open Project (as in “projection”) screen-sharing technology. Open Project relies on your Android device’s camera and a little server-side magic to bring your smartphone apps to the big screen.
Open Project also supports multi-user touch interaction if you're using a touch-enabled display. Just imagine: Instead of trying to figure out level 90 of Candy Crush Saga alone on your Galaxy S4, you could let all your friends try to pass the level on a 26-inch display.
That's the hope anyway. From the looks of it, Open Project is not as robust as Apple’s AirPlay mirroring, but remember that Open Project is in its infancy and is still only a research project at this point. That also means there’s no guarantee Google will continue developing Open Project into an actual Android feature.
Nevertheless, the concept is interesting, so let’s take a look.
Open Project starts with your phone reading a QR code displayed on a Web page. Once the code is scanned, the smartphone notifies the server, and the QR code on the display is replaced by a checkerboard pattern inside a square outline—the target area for the virtual projection.
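Google hasn't published the Open Project protocol, but the pairing handshake described above could be sketched roughly like this. The server class, session-token scheme, and method names here are all assumptions for illustration:

```python
import uuid

class PairingServer:
    """Toy stand-in for the Open Project server: it hands each display
    page a one-time session token (which the page would render as a QR
    code) and waits for a phone to claim it. Hypothetical API."""

    def __init__(self):
        self.sessions = {}  # token -> paired phone id (None until claimed)

    def new_display_session(self):
        # The display's Web page would encode this token in its QR code.
        token = uuid.uuid4().hex
        self.sessions[token] = None
        return token

    def claim(self, token, phone_id):
        # Called by the phone after it scans the QR code; a token can
        # only be claimed once, so stale or reused codes are rejected.
        if token not in self.sessions or self.sessions[token] is not None:
            return False
        self.sessions[token] = phone_id
        return True

server = PairingServer()
token = server.new_display_session()     # display shows this as a QR code
paired = server.claim(token, "phone-1")  # phone scans the code and claims it
print(paired)  # → True
```

Once the claim succeeds, the server knows which phone feeds which display, which is presumably when the QR code gives way to the checkerboard.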
Because your phone’s camera tracks the phone’s own movement, the checkerboard moves around the screen as you wave the phone in front of the display. You can also shrink or expand the target display area by touching your phone’s screen.
Once you have the target display at the right size and location on the larger screen, you can fire up the app you want to project. If the display has a touch component, you can also interact with the app on the larger screen, letting you flip through your photos, zoom in on a map, or play games. Open Project also accepts mouse input from the larger display.
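For that touch interaction to work, a tap on the big screen has to be translated back into the phone app's own coordinates. A minimal sketch of that mapping, assuming the projected area is a simple axis-aligned rectangle (the real system, which tracks a waved phone, is surely more involved):

```python
def display_to_phone(x, y, target, phone_size):
    """Map a touch at (x, y) on the big display back into the phone's
    own screen coordinates. `target` is the projected rectangle on the
    display as (left, top, width, height); `phone_size` is (width,
    height) in phone pixels. Names are illustrative, not Google's."""
    left, top, width, height = target
    pw, ph = phone_size
    # Normalize the touch within the projected area, then scale it
    # to the phone's pixel grid.
    nx = (x - left) / width
    ny = (y - top) / height
    return round(nx * pw), round(ny * ph)

# A touch dead center in a 400x700 projected area lands dead center
# on a 1080x1920 phone screen.
print(display_to_phone(300, 450, (100, 100, 400, 700), (1080, 1920)))
# → (540, 960)
```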
On the back end, your phone sends all of its display data to an Open Project server, which relays it to the target display. For touch or mouse input, the Web page on the display sends the interaction to the server, which forwards it to your phone; the phone then pushes the updated display back through the server to the screen.
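That round trip amounts to two one-way channels through the relay server: input events flowing from the display page to the phone, and rendered frames flowing back. A minimal sketch with in-process queues standing in for the network (the channel names and message shapes are assumptions, since the protocol is unpublished):

```python
from queue import Queue

# Toy relay server: one channel per direction.
events_to_phone = Queue()    # display page -> server -> phone
frames_to_display = Queue()  # phone -> server -> display page

def display_page_touch(event):
    # The display's Web page forwards the interaction to the server...
    events_to_phone.put(event)

def phone_step():
    # ...the phone applies the event to the running app and pushes an
    # updated frame back for the display to render.
    event = events_to_phone.get()
    frames_to_display.put({"frame": f"redrawn after {event['type']}"})

display_page_touch({"type": "tap", "x": 300, "y": 450})
phone_step()
print(frames_to_display.get())  # → {'frame': 'redrawn after tap'}
```

Every interaction costs a full display-to-phone-to-display loop, which is why the latency question in the next paragraph matters.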
That sounds complicated, but in practice the round trip happens in near real time.
Whether Open Project is fast enough for heavy-duty gaming or other data-intensive applications remains to be seen. Regardless, it’s a neat idea for turning any connected screen into an Android display without any extra hardware.
The developers of the project also say the technology is very scalable, since most of the data crunching happens on your mobile device rather than on the Open Project servers. All the technology needs is an app with the Open Project code library integrated, a device with a rear-facing camera, and a Web-connected display.
For comprehensive coverage of the Android ecosystem, visit Greenbot.com.