With each step forward in the digital revolution, "The Matrix" becomes less like fiction and more like reality. That's in part because hardware engineers and software developers continue to refine their augmented reality technologies, making the line between real and virtual life ever blurrier, no matter how much Keanu Reeves squints in an effort to differentiate the two. Augmented reality (AR), it seems, may soon be the true reality for us all.
Augmented reality is the blending of interactive digital elements – like dazzling visual overlays, buzzy haptic feedback, or other sensory projections – into our real-world environments. Pokemon Go, the 2016 mobile hit, is a classic example: it allowed users to view the world around them through their smartphone cameras while projecting game items, including onscreen icons, scores, and the ever-elusive Pokemon creatures themselves, as overlays that made those items seem as if they were right in players' real-life neighborhoods. The game's design was so immersive that it sent millions of kids and adults alike walking (and absentmindedly stumbling) through their real-world backyards in search of virtual prizes.
Google Sky Map is another well-known AR app. It overlays information about constellations, planets and more as you point your smartphone or tablet camera toward the heavens. Wikitude looks up information about a landmark or object when you simply point your smartphone's camera at it. The IKEA Place app overlays a new couch in your living space before you buy it, so you can make sure it fits.
But AR is more than just smartphone fun. It's a technology that finds uses in more serious matters, from business to warfare to medicine.
The U.S. Army, for example, uses AR tools to create digitally enhanced training missions for soldiers. The concept has become so prevalent that the Army has given one program an official name: Synthetic Training Environment, or STE. Wearable AR glasses and headsets may well help futuristic armies process data overload at incredible speeds, helping commanders make better battlefield decisions on the fly. There are fascinating business benefits, too. The Gatwick passenger app, for example, uses AR to help travelers navigate the chaos of a packed airport.
The possibilities of AR tech are limitless. The only uncertainty is how smoothly, and quickly, developers will integrate these capabilities into devices that we'll use on a daily basis.
Augmenting Our World
The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. Sounds pretty simple. Besides, haven't television networks been doing that with graphics for decades? Augmented reality, though, is more advanced than any technology you've seen in television broadcasts, although some newer TV effects come close, such as RACEf/x and the superimposed first-down line on televised U.S. football games, both created by Sportvision. These systems display graphics for only one point of view; next-generation augmented-reality systems will display graphics for each viewer's perspective.
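At its core, that superimposing step is a per-pixel blend: wherever the virtual graphic is opaque, it replaces the camera pixel; wherever it's transparent, the real world shows through. Here's a minimal sketch of that compositing operation in Python, using tiny lists of grayscale values as stand-in "frames" (real AR systems do this per pixel on a GPU, and the frames and mask below are invented for illustration):

```python
def composite(frame, overlay, alpha):
    """Blend overlay onto frame: out = alpha*overlay + (1-alpha)*frame."""
    return [
        [
            round(a * o + (1 - a) * f)
            for f, o, a in zip(frame_row, overlay_row, alpha_row)
        ]
        for frame_row, overlay_row, alpha_row in zip(frame, overlay, alpha)
    ]

camera_frame = [[100, 100], [100, 100]]   # what the camera sees
graphic      = [[255, 255], [255, 255]]   # the virtual element
mask         = [[1.0, 0.0], [0.0, 1.0]]   # where the graphic is drawn

print(composite(camera_frame, graphic, mask))
# Opaque pixels (alpha = 1.0) show the graphic; transparent ones
# (alpha = 0.0) show the real world.
```

Running this blend on every frame, every fraction of a second, is what makes the overlay appear "live" rather than like a static sticker.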
Some of the most exciting augmented-reality work began taking place in research labs at universities around the world. In February 2009, technophiles at the TED conference were all atwitter when Pattie Maes and Pranav Mistry presented a groundbreaking augmented-reality system, which they developed as part of MIT Media Lab's Fluid Interfaces Group. They called it SixthSense, and although the project has stalled, it remains a good illustration of the basic components found in many augmented reality systems:
Camera
Small projector
Smartphone
Mirror
These components were strung together in a lanyard-like apparatus that the user wore around his neck. The user also wore four colored caps on the fingers, and these caps were used to manipulate the images that the projector emitted.
SixthSense was remarkable because it used simple, off-the-shelf components that cost around $350. It was also notable because the projector essentially turned any surface into an interactive screen. The device worked by using the camera and mirror to examine the surrounding world, feeding that image to the phone (which processed the image, gathered GPS coordinates and pulled data from the Internet), and then projecting information onto the surface in front of the user, whether a wrist, a wall, or even a person. Because the user was wearing the camera on his chest, SixthSense augmented whatever he looked at; for example, if he picked up a can of soup in a grocery store, SixthSense found and projected onto the soup information about its ingredients, price, nutritional value — even customer reviews.
By using his capped fingers — Pattie Maes said even fingers with different colors of nail polish would work — a user could perform actions on the projected information, which were then picked up by the camera and processed by the phone. If he wanted to know more about that can of soup than was projected on it, he could use his fingers to interact with the projected image and learn about, say, competing brands. Sadly, the SixthSense project went into a years-long hiatus and will probably never reach markets. But there are many other products stepping into the AR fray.
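The camera-phone-projector loop described above can be sketched as a simple three-step pipeline. This is a hedged illustration, not MIT's actual implementation: the product database, the `recognize` function and the label names are all invented stand-ins for the real computer vision and Internet lookup SixthSense performed.

```python
PRODUCT_DB = {  # hypothetical data the phone would fetch from the Internet
    "soup_can": {"price": "$1.89", "calories": 120, "rating": 4.2},
}

def recognize(camera_frame):
    """Stand-in for vision: the real system matched visual features."""
    return camera_frame.get("label")

def project(surface, info):
    """Stand-in for the projector: returns the lines it would display."""
    return [f"{surface}: {key} = {value}" for key, value in info.items()]

def sixth_sense_step(camera_frame):
    label = recognize(camera_frame)    # 1. camera + mirror observe the world
    info = PRODUCT_DB.get(label, {})   # 2. phone pulls data about the object
    return project(label, info)        # 3. projector overlays the info on it

for line in sixth_sense_step({"label": "soup_can"}):
    print(line)
```

The point of the sketch is the flow, not the parts: sense, look up, overlay — the same loop that runs, far more elaborately, in every modern AR headset.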