
Within arm's reach: New SDK from OnTheGo Platforms promises to make gesture recognition accessible to Google Glass apps

If you’ve ever mucked around with Google Glass, Siri, or, heck, any voice-command-driven interface, you know that communicating with our intelligent object friends can sometimes be an exercise in futility. But the next time one of those moments of fist-shaking Glass frustration strikes, you may find that you can actually put those gestures to good use, thanks to Ari, the new Augmented Reality Interface SDK from Portland’s OnTheGo Platforms.

Voice and touch control work well on smartphones, but Smart Glasses require a new, intuitive interface. OTG’s Ari™ SDK uses a single outward-facing camera to track a user’s hand motions and gestures, enabling the user to interact naturally with the content displayed in front of them. Ari™ empowers developers to move beyond the limitations of voice control and a small touchpad on the side of the glasses.
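
To make that concrete, here’s a minimal Java sketch of what a camera-driven gesture SDK might look like from the app developer’s side. To be clear, this is purely illustrative: OTG hasn’t published Ari’s actual API here, so every class, gesture name, and callback below is invented. The shape, though, follows the description above: the camera pipeline recognizes hand gestures, and the SDK dispatches each one to whatever action the app has registered.

// Hypothetical sketch only; not Ari's real API. Illustrates the general
// shape of a camera-based gesture SDK: hand gestures recognized from a
// single outward-facing camera are dispatched to app-registered actions.

import java.util.HashMap;
import java.util.Map;

public class GestureDemo {

    // Gestures a recognizer might report (names invented for illustration).
    enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, OPEN_PALM, FIST }

    // App-side callback invoked when a gesture is recognized.
    interface GestureListener {
        void onGesture(Gesture gesture);
    }

    // Minimal dispatcher: maps each gesture to the listener bound to it.
    static class GestureRecognizer {
        private final Map<Gesture, GestureListener> listeners = new HashMap<>();

        void register(Gesture gesture, GestureListener listener) {
            listeners.put(gesture, listener);
        }

        // In a real SDK this would be driven by per-frame hand tracking
        // from the camera; here we simulate a recognition event directly.
        void simulateRecognition(Gesture gesture) {
            GestureListener listener = listeners.get(gesture);
            if (listener != null) {
                listener.onGesture(gesture);
            }
        }
    }

    public static void main(String[] args) {
        GestureRecognizer recognizer = new GestureRecognizer();

        // Tie gestures to app actions, e.g. paging through displayed content.
        recognizer.register(Gesture.SWIPE_LEFT, g -> System.out.println("Previous card"));
        recognizer.register(Gesture.SWIPE_RIGHT, g -> System.out.println("Next card"));

        // Pretend the camera pipeline just recognized a right swipe.
        recognizer.simulateRecognition(Gesture.SWIPE_RIGHT); // prints "Next card"
    }
}

That mapping-table design is also what makes the CNET exchange quoted below possible: if the recognizer can detect a gesture, the developer can bind it to any action at all.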

Ari™ is built on top of the Android OS. Currently, all of the Smart Glasses entering the market run the Android OS and have one built-in camera; therefore, OTG can remain device-agnostic and distribute Ari™ across all of the major smart glass devices for the foreseeable future (Google Glass, Vuzix, Recon, Epson, Samsung). The result? A universal Smart Glasses user experience regardless of hardware.

How’s it work? Take a look.

Pretty cool, hunh? Others think so too.

Portland Business Journal: A nice gesture: Portland’s OnTheGo Platforms launches smartglasses software

Since its inception, the company has been working closely with smartglasses manufacturer Vuzix. … Vuzix CEO Paul Travers described Ari as “a leading solution in the space.”

TechCocktail Portland: OnTheGo Platforms Wants a Universal Smart Glass User Experience

The OTG team recognized that while voice and touch controls work well on smartphones, smart glasses technology needs a different take with a different interface. Part of the inspiration behind Ari, then, is to give developers the extra excitement they need to create amazing apps for the smart glasses realm.

Developers, enterprises, and smart glass OEMs can integrate Ari directly into their app, product, or hardware. To do so, Ari uses a single, outward-facing camera to track the user’s hand motions and gestures, enabling them to interact naturally with their displayed content.

CNET: At last! Hand gestures can control Google Glass

The beauty of this idea, according to its makers, is that any hand gesture can be tied to any action the developer might choose.

Which led me to the question: “ANY gesture?” I asked OnTheGo CEO Ryan Fink whether he might envisage, one day, Glass-wearers offering a middle digit to log on to Facebook. Or perhaps a gang sign to take a photo.

He saw where I might be going with this and offered a measured but tantalizing response: “We do have the ability to create any gesture, but we’re only releasing a small gesture set at this time. There will be more gestures added in the near future.”

For more information on the release, read the company’s blog post on Ari. For more information on the company, visit OnTheGo Platforms.