Augmented Reality is an emerging technology that will have a profound impact on how we engage with the physical and virtual worlds. I’ve written this post to give educators some background on the trend. Here is a quick visual example of what the technology can do: in this short video you will see virtual dominoes placed on a real object using a touch screen. Then the user “touches” the physical space and the virtual dominoes fall down.
Defining Augmented Reality
I define Augmented Reality, or “AR,” as a set of technologies that integrate the physical world with digital information to create an enhanced, unified experience through a seamless user interface. Consider apps that combine the smartphone viewfinder, GPS, and camera to augment our perception of the physical world. Examples include Blackboard’s Explorer for iPhone app and Wikitude.
There is more to this than just hacking the smartphone viewfinder. To better explain the technology, I’ve grouped it into three dimensions: the X, Y, and Z axes.
The X Axis: From 5 Senses to 5000
Consider the impact of layering real-time computer sensory data on top of our own biological senses. We can give users enhanced perception, continuous biological monitoring, and improved motion and location services. As mentioned above, AR apps like Wikitude and Blackboard Explorer combine a smartphone’s viewfinder and GPS to enhance the visual experience. The form factor of looking through the smartphone viewfinder is still a bit awkward; this fall (2013), expect devices like Google’s Project Glass to provide an integrated “heads-up display” for early adopters. The number of sensors being integrated into our phones and apps is enormous. My Nikes talk to my phone while I run. I’ve got a NeuroSky headset that can give me a real-time EEG readout while I meditate. Apps are helping us sleep better by monitoring how restless we are in bed. Near Field Communication chips and QR codes are creating cheap infrastructure for hyperlinks between the real world and the virtual.
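To make the viewfinder-plus-GPS idea concrete, here is a minimal sketch of the core geometry an AR browser has to solve: compute the compass bearing from the user’s location to a point of interest, then map the difference between that bearing and the phone’s heading to a horizontal position in the viewfinder. The function names and the simple pinhole-style screen mapping are my own illustration, not how Wikitude or Blackboard Explorer actually implement it.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def overlay_x(poi_bearing, heading, fov=60, screen_w=640):
    """Horizontal pixel position for a point-of-interest label in the viewfinder,
    given the phone's compass heading and the camera's field of view in degrees.
    Returns None when the POI is outside the field of view."""
    offset = (poi_bearing - heading + 180) % 360 - 180  # signed angle, -180..180
    if abs(offset) > fov / 2:
        return None
    return int((offset / fov + 0.5) * screen_w)
```

A label for a landmark due east of the user (bearing 90) lands in the center of the screen when the phone faces east, and disappears when the phone faces north.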
The Y Axis: Motion Capture Based Interfaces
Gestures are already taking over from the mouse and keyboard on tablets and smartphones. Gestures captured by cameras and depth sensors, using gadgets like the Xbox Kinect and the Leap Motion Controller, are the next wave. There are already really cool classroom activities and lessons built around the Kinect. Leap Motion announced at the Consumer Electronics Show in January that a number of manufacturers will build its device into high-end laptops and gaming PCs later this year. This technology is important because it simplifies the user experience and makes it easy for developers to merge physical and virtual space. As the domino example above shows, this synthesis creates interesting possibilities for simulation and interaction.
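Under the hood, devices like the Kinect and Leap Motion report a stream of tracked positions each frame, and it is the application’s job to turn those traces into gestures. As a simplified sketch (my own toy classifier, not any vendor’s API), here is how a hand-position trace might be classified as a left or right swipe:

```python
def classify_swipe(positions, min_dist=0.3):
    """Classify a hand-position trace as 'left', 'right', or None.
    positions: list of (x, y) samples in normalized [0, 1] coordinates,
    as a Kinect- or Leap-style tracker might report them frame by frame."""
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    # Require mostly-horizontal motion of sufficient length.
    if abs(dx) < min_dist or abs(dx) < 2 * abs(dy):
        return None
    return "right" if dx > 0 else "left"
```

Real gesture recognizers are more robust (they smooth noise and consider velocity), but the principle is the same: map raw tracking data to a small vocabulary of user intents.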
The Z Axis: 3D Projection Maps
Capturing motion and augmenting perception are increasingly combined with digital projectors to create 3D projection maps. This tech is already appearing at concerts and stage performances. It lets a digital projector generate geometric shapes aligned precisely to real objects on a stage, and multiple projectors can be linked to create some awesome displays.
Look at this performance by Dandypunk called “The Alchemy of Light” for a really stunning example of an interactive performance using 3D projection maps.
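The core trick in projection mapping is warping flat content so it lands exactly on a measured physical surface. Production tools typically use a perspective homography for this; as a simplified illustration (a bilinear warp rather than a true homography, with names of my own invention), here is how a point drawn in a unit square can be mapped onto the four measured corners of a real surface:

```python
def warp_point(u, v, quad):
    """Map a point (u, v) in the unit square onto a quadrilateral,
    e.g. the four corners of a physical surface as seen by the projector.
    quad: corners [(x0, y0) top-left, (x1, y1) top-right,
                   (x2, y2) bottom-right, (x3, y3) bottom-left]."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = quad
    # Interpolate along the top and bottom edges, then between them.
    top = (x0 + u * (x1 - x0), y0 + u * (y1 - y0))
    bottom = (x3 + u * (x2 - x3), y3 + u * (y2 - y3))
    return (top[0] + v * (bottom[0] - top[0]),
            top[1] + v * (bottom[1] - top[1]))
```

Applying this warp to every pixel (or to the corners of a texture) is what makes projected imagery appear “painted” onto an angled box or wall rather than smeared across it.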
Educators should be excited about the new kinds of immersive learning simulations that will appear as this technology goes mainstream. I encourage DIY/maker-minded educators to look for ways to get students building projects that leverage these devices and systems. Most of these technologies have robust online communities and open-source tools to help you get started.