A means of controlling augmented reality (AR) smartglasses in a more natural and effective way than any other solution on the market today.
Smartglasses are bringing heads-up information to workers everywhere.
This is making them more productive, reducing error rates in the field, and ushering in the era of Industry 4.0. However, merely displaying information to users is not enough: they must have a way to control it, interact with it, and navigate from one step to the next. Drop the ball here, and you are left with a high-priced tech toy in search of a use case.
Each major leap in computing form factor has been accompanied by a corresponding breakthrough in interaction method. Just as the mouse ushered in the era of personal computers and the touchscreen the era of smartphones, gestures are needed to unlock the potential of smartglasses.
The concept is simple for the user: you just reach out, touch, grab, and swipe on the digital objects you see. The behind-the-scenes technology that makes this possible is anything but.
Dedicated algorithms and deep machine learning are needed not only to recognize a user's hands, but also to understand the motion, intent, and action the user is performing through a specific gesture.
Without gestures, smartglass users are left with small buttons or tiny touchpads on the side of the glasses for clicking and scrolling. These controls are difficult to find and hard to use, especially if the user is wearing gloves or must keep their hands sterile during a medical procedure.
This disconnect between the richness of the content being displayed and such archaic interaction results in a massive experience gap. Fortunately, gesture recognition software and computer vision are available to bridge that gap and deliver the experience users envisioned when they first heard of smartglasses.
So what is gesture recognition?
It is as simple and as straightforward as it sounds.
Just as you might gesture in the air to emphasize a point when you’re talking, with gesture recognition software, you can motion in the air to control your smartglasses. Without gesture capabilities, you may be limited to using small button or touchpad controls, but with gestures, you can interact directly with the content displayed in the air in front of you as you would with a touchscreen device like a tablet or smartphone.
Gesture recognition software typically uses cameras or depth sensors already built into your glasses to read and interpret what your hands are doing. Depending on the smartglasses and on the quality of the gesture recognition software, users can employ natural hand and finger gestures, head movements, and voice controls to interact with their data via their smartglasses (remember the Tom Cruise movie, Minority Report?).
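At its simplest, interpreting "what your hands are doing" means turning a stream of tracked fingertip positions into a named gesture. The sketch below is a minimal, illustrative example of that idea (it is not Atheer's actual software, and the function name and thresholds are assumptions): it classifies a fingertip path as a directional swipe based on its net travel.

```python
# Minimal sketch (hypothetical, not Atheer's SDK): classifying a swipe
# from a sequence of tracked fingertip positions. In a real pipeline,
# these (x, y) points would come from camera or depth-sensor hand
# tracking, one point per frame.

def classify_swipe(points, min_distance=0.15):
    """Classify a fingertip path as 'swipe-left', 'swipe-right',
    'swipe-up', 'swipe-down', or None if there is no clear swipe.

    points: list of (x, y) tuples in normalized [0, 1] screen coords.
    min_distance: minimum net travel required to count as a swipe.
    """
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]  # net horizontal travel
    dy = points[-1][1] - points[0][1]  # net vertical travel
    if abs(dx) >= abs(dy):             # predominantly horizontal motion
        if abs(dx) < min_distance:
            return None                # too small: likely jitter, not intent
        return "swipe-right" if dx > 0 else "swipe-left"
    if abs(dy) < min_distance:
        return None
    return "swipe-down" if dy > 0 else "swipe-up"

# A rightward drag across a third of the view reads as a swipe:
path = [(0.2, 0.5), (0.3, 0.5), (0.45, 0.51), (0.55, 0.5)]
print(classify_swipe(path))  # swipe-right
```

Production systems are far more sophisticated, of course, using machine-learned hand models rather than simple geometric thresholds, but the shape of the problem is the same: raw tracking data in, user intent out.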
Touchscreen control, but in the AiR™.
With Atheer’s AiR Gestures™ software, available only on select smartglasses equipped with a depth sensor, such as Atheer’s AiR Glasses™, you have the same precise control as you do now on your tablet or smartphone. You can pinch, swipe, double tap, long tap, long hold, and more as you interact with the content displayed in the air in front of you.
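Conceptually, this kind of gesture set works like touchscreen event handling: each recognized gesture name is routed to an application action. The toy dispatcher below illustrates that pattern (the gesture names and handlers are illustrative assumptions, not Atheer's actual API):

```python
# Hypothetical sketch: routing recognized gesture names to UI actions,
# the way a touchscreen maps touch events to handlers. Gesture names
# and handler behavior here are illustrative only.

HANDLERS = {}

def on_gesture(name):
    """Decorator registering a handler for a named gesture."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@on_gesture("pinch")
def zoom(scale=1.0):
    return f"zoom x{scale}"

@on_gesture("double-tap")
def open_item():
    return "open"

def dispatch(name, **kwargs):
    """Invoke the handler for a recognized gesture, if any."""
    handler = HANDLERS.get(name)
    return handler(**kwargs) if handler else None

print(dispatch("pinch", scale=2.0))  # zoom x2.0
print(dispatch("double-tap"))        # open
```

The payoff of this design is familiarity: applications written against touch-style events need little rework to accept in-air gestures instead.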
As you can see from the video below, the learning curve for these controls is minimal, as the gestures are familiar to anyone using today’s mobile technology. Our AiR™ technology delivers a wider peripheral visual space and no limit to the number or size of screens used to overlay computer-driven data onto the real world. [Insert demonstration video]
AR is driving this next generation of smartglasses computing.
Device control can and should be whatever comes naturally. What comes very naturally is interacting with the actual content. Atheer’s AiR Gestures™ delivers that natural interaction.
With sub-millimeter fingertip-tracking accuracy at speeds of up to 250 frames per second, Atheer has created a natural, flowing experience for its users. We have created a world in which you are no longer chained to a keyboard or physical screen, one in which you have virtual touchscreens that scale almost limitlessly in size and number.
This takes the value of your smartphone and multiplies it many times over. That is the richness that AR smartglasses with gestures add to people’s lives.
All of Atheer’s AiR™ hardware and software technology starts and ends with our users’ experience. What we’re offering people through our AiR Gestures™ software is the experience of interacting with their data in a natural way on a new computing platform.