Friday, May 31, 2013

When Will Smartglasses and Other Wearable Computers Hit the Mainstream?

Scientific American

Google has stoked our collective imagination via relentless promotion of its Google Glass wearable computer in recent months. Thanks to a campaign of Web videos, trade show appearances and blog posts, the search giant has positioned its smartglasses as a hands-free augmented reality gadget that will allow us to share our personal experiences in real time, whether we are skydiving, skiing or handling snakes.

Not surprisingly, a number of competitors have emerged, promising many of the same capabilities. Before augmented-reality eyewear can move into the mainstream, however, Google and its ilk must address some fundamental shortcomings in smartglass technology, including bulky designs, high prices and a dearth of software that would enable them to be more than head-mounted camera phones.

Google should aim for a headset with see-through lenses that allows the wearer to look straight ahead rather than at a small screen off to the side. So says Justin Rattner, a man who, along with his colleagues at chipmaker Intel, spends a lot of time engineering the future of computing technology. Rattner, an Intel Senior Fellow, serves as director of Intel Labs and as the company’s chief technology officer.

Scientific American spoke with him about why smartglasses are getting so much attention, how they should be improved and why Star Trek’s tricorders were a misguided interpretation of the future.

[An edited transcript of the interview follows.]

Wearable computers have been around for decades, including ring scanners that factory workers wear on their fingers to read barcodes and even prototype head-mounted units that enable augmented reality. Why is the technology getting so much attention now?
The sensor technology, the communications technology and the computer technology have all reached a point where, in some sense, for the first time the potential for high-volume consumer wearables is real. That’s what’s new. Today you can put essentially everything that’s in a smartphone into a set of eyeglasses, although they would be a bit heavy. That potentially becomes an interesting platform for communications.

Why do you say “potentially”?
We think there is a grand challenge when it comes to eyewear. No one’s been able to demonstrate a high-performance see-through display. This side-view display that you see in Google Glass and in the Oakley Airwave snow goggles is, in some sense, a recognition of the fact that no one has solved the transparent display problem, even though there are any number of people working on it. [Such a high-performance see-through display would have] an optical engine for the left and right eyes that would project images into the lenses. The display would be constructed in such a way that it keeps the virtual images in front of you, regardless of where you turn your head or your gaze. It would be a perfect overlay, and you would see right through the projected images. So if you’re doing augmented reality and walking in New York, you will see, “Empire State Building” or “Statue of Liberty” hovering over those physical objects.

There’ve been a handful of companies, such as Lumus, that have had these kinds of technologies for a long time, but typically they’ve either been strictly in development or sold for research in augmented reality. Generally speaking, see-through displays have been too bulky, too heavy and too dim to bring to market. They’re fine indoors, but you walk outside and the virtual image is completely washed out.
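
To make the "perfect overlay" idea concrete, here is a minimal Python sketch of world-locked labeling: a label anchored to a real-world point is reprojected into display pixels every frame from the wearer's current head pose, so it keeps hovering over the same physical object as the head turns. The coordinate conventions, camera model and numbers are illustrative assumptions, not any vendor's actual pipeline.

```python
# A minimal sketch of the "perfect overlay" Rattner describes: a label anchored
# to a real-world point is reprojected every frame from the wearer's current
# head pose, so it stays over the same physical object as the head turns.
# Coordinate conventions and numbers are illustrative assumptions only.
import numpy as np

def head_rotation(yaw_deg, pitch_deg):
    """Head orientation as a head-to-world rotation (x east, y north, z up)."""
    y, p = np.radians(yaw_deg), np.radians(pitch_deg)
    rz = np.array([[np.cos(y), -np.sin(y), 0.0],
                   [np.sin(y),  np.cos(y), 0.0],
                   [0.0,        0.0,       1.0]])
    rx = np.array([[1.0, 0.0,        0.0],
                   [0.0, np.cos(p), -np.sin(p)],
                   [0.0, np.sin(p),  np.cos(p)]])
    return rz @ rx

def project_label(world_point, head_position, rotation,
                  focal_px=1000.0, width=1280, height=720):
    """Pixel coordinates where a world-anchored label should be drawn,
    or None if the point is behind the wearer."""
    p = rotation.T @ (world_point - head_position)   # world -> head coordinates
    if p[1] <= 0.0:                                  # y is "forward"; behind us
        return None
    u = width / 2 + focal_px * p[0] / p[1]
    v = height / 2 - focal_px * p[2] / p[1]
    return round(u, 1), round(v, 1)

# A hypothetical landmark 2 km north of the wearer, with its label ~380 m up.
landmark = np.array([0.0, 2000.0, 380.0])
wearer = np.array([0.0, 0.0, 1.7])

# Recomputing the projection as the head turns keeps the label glued to the
# landmark rather than to the display.
for yaw in (0, 10, 20):
    print(yaw, project_label(landmark, wearer, head_rotation(yaw, 0.0)))
```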

Lumus’s see-through, wearable displays resemble a larger-than-normal pair of sunglasses and project augmented-reality images onto the sides of the lenses. As the images travel to the center of the lenses, they are reflected into the eyes, giving wearers the impression of seeing the images on a large screen in front of them. Why can’t these see-through augmented-reality glasses be made more like a regular pair of eyeglasses or sunglasses?
It really requires a very high level of optical engineering to do it right. You have what Lumus has done. In addition to projection systems, others were going to take compact lasers and project the image directly onto the retina. The people working on these technologies have mostly been small, underfunded start-ups or people who are interested only in the optics and not in actually building a complete [smartglasses] product.

What must Google and other companies building these head-mounted, wearable computers do for their devices to become mainstream?
They need to offer an interface that works the way people naturally move and interact. The use of side screens, like what you’re seeing with Google Glass, is basically the admission that they don’t know how to provide a truly augmented Terminator-like view—where the person wearing the headset can see annotated objects or streaming data while seeing through to the physical world. That’s where everybody wants to be as soon as possible.
The way it’s designed, Google Glass is not at all interesting for gaming or for watching movies. But if you could actually project a high-quality image or game or video in your normal field of view and have it respond instantly to changes in your head position, then telepresence becomes really amazing because to your brain you’re essentially in that space.
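
As a back-of-the-envelope illustration of why instant response to head motion matters, the toy sketch below (an assumption of mine, not any shipping headset's renderer) applies a rotation-only correction: a frame rendered for a slightly stale head pose is shifted by the rotation that has happened since, so the virtual screen appears fixed in the world rather than lagging behind the wearer's head.

```python
# A toy rotation-only correction (an assumption, not any shipping headset's
# renderer): a frame rendered for a slightly stale head pose is shifted by the
# rotation that has happened since, so the virtual screen stays put in the
# world instead of dragging along with the head.
import numpy as np

def reprojection_shift_px(yaw_at_render_deg, yaw_at_display_deg, focal_px=1000.0):
    """Horizontal pixel shift that re-anchors an already-rendered frame to the
    latest head yaw."""
    delta = np.radians(yaw_at_display_deg - yaw_at_render_deg)
    return -focal_px * np.tan(delta)   # head turns right -> image slides left

# A brisk head turn of ~100 degrees per second moves ~2 degrees in the ~20 ms
# between rendering and display; uncorrected, the virtual screen would visibly
# lag by tens of pixels.
print(round(reprojection_shift_px(0.0, 2.0), 1))   # about -34.9 px
```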





Location: Georgetown, TX, United States
