Tuesday, September 3, 2013

How Gesture Control And Wearable Tech Will Revolutionize Our Digital Lives

by ToyotaVoice, forbes.com, July 17th, 2013

By Ryan Galloway

Pause your favorite show by snapping your fingers. Adjust the temperature in your home with a flick of the wrist. Scroll through your big presentation by waving your hand. Gesture control is the latest step in the evolution of human-computer interaction, replacing traditional interface methods like keyboards and touchscreens. It’s not science fiction. It’s not even speculative. It’s real, it’s available, and it can free you from your desk chair.

Gesture control set hearts racing at this year’s SXSW festival. The annual who’s who of digital innovators, investors, and artists flooded Twitter with buzz about several gesture control devices, including Leap Motion. The Leap Motion device packs multiple cameras and LEDs into a package smaller than a deck of cards and is, according to the company itself, 200 times more accurate than comparable technologies. The sensor detects the movements of your hands, and the connected computer responds accordingly.

But camera-based systems like Leap Motion and Microsoft’s Kinect aren’t the only players on the gesture control stage. Thalmic Labs’ MYO, another SXSW darling, is wearable technology that uses the electrical activity in your muscles to control computers, smartphones, and other digital devices.

Thalmic Labs CEO and cofounder Stephen Lake sees distinct advantages for wearables like MYO. “Camera-based systems can be great for gaming and entertainment applications, but they’re limited. They put you in a box. Step outside of that box and they stop working. Wearable technology frees the user up to move around, and that ability to move opens up a lot of possibilities for gesture control that weren’t there before.”

Lake sees the maneuverability of wearable tech as the key to taking gesture control beyond the desktop. “There’s been a lot of interest in MYO from hospitals and surgeons. Once a surgeon is scrubbed in to a clean environment like an operating room, they can’t interact with technology in the usual ways. If there’s a touchscreen present, they have to tell a nurse which button to press. Gesture control technology can free that nurse to do something more important.”

Gesture control systems can also improve the quality of life for a variety of people, and can unlock new opportunities for those living with disabilities. There are, according to Lake, clear implications for stroke victims and other patients in neuromuscular rehabilitation programs. There’s also the possibility that products like MYO might have “sign language to speech” capability in the very near future.

Lake believes we’re witnessing a fundamental change in the nature of computing. “In five years, the devices we wear or carry won’t be little glass plates that we pull out of our pockets. They’ll be smarter and more aware of context. They’ll be able to provide relevant information based on your location and the people around you.”

“That’s the future of computing,” he said, “and gesture control will be an integral part of it.”

 
