Note: Start with the basics. Check out our primer on augmented reality here.
What if augmented reality experiences on your phone could interact with objects in physical space? What if you could do more in AR than just look at 3D models as they “augment” your surroundings? And what if interacting with those 3D models could unlock an experience in real life, or IRL?
With a few AR apps under our belt – see our Instagallery portal, secret menu, and Washington Capitals demo – we wanted to push the limits. We love the potential of mobile AR, but hate that it keeps faces glued to screens.
We set out with the goal of making an AR experience that was mobile-based, but also allowed for interaction with the user’s surroundings. Thus, AR-IRL was born.
Built using Apple’s ARKit, a Particle Photon microcontroller, and the blood, sweat, and tears of our hardware team, AR-IRL is a functional prototype of a locked box, opened only with the virtual keypad that appears in the AR app (…unless you cut the power to the electromagnetic lock. Side note: we don’t recommend storing anything valuable in this box).
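Under the hood, the AR app reaches the Photon through the Particle Device Cloud: tapping the virtual keypad maps to a cloud function call on the device. Here is a minimal sketch of what that call looks like from the phone side, written in Python for illustration. The device ID, access token, keypad code, and the `unlock` function name are all placeholders for this sketch, not the values from our actual firmware.

```python
# Sketch: building the Particle Device Cloud request that asks the
# Photon to run its "unlock" cloud function with the keypad code.
# DEVICE_ID, ACCESS_TOKEN, the code, and the function name "unlock"
# are illustrative placeholders, not values from our firmware.

PARTICLE_API = "https://api.particle.io/v1"

def unlock_request(device_id: str, access_token: str, code: str):
    """Return the (url, form payload) for a POST to the Particle
    cloud, which forwards the keypad code to the device's handler."""
    url = f"{PARTICLE_API}/devices/{device_id}/unlock"
    payload = {"access_token": access_token, "arg": code}
    return url, payload

url, payload = unlock_request("0123456789abcdef", "my-token", "4218")
# POSTing `payload` to `url` would invoke the device's unlock handler,
# which in our prototype cuts power to the electromagnetic lock.
```

On the firmware side, the matching piece is a Particle cloud function registered in `setup()`, whose handler checks the code and drives the lock's relay pin.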
This prototype demonstrates that mobile AR can be extended to become more immersive and involve the user’s surroundings. A simple interaction can (in this case, quite literally) unlock a whole new experience in the real world.
An important detail: the interaction runs in both directions. Interacting with the AR keypad triggered an effect in real life (the box unlocking), and interacting with the physical device triggered an effect in AR (closing the box made the keypad reappear).
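For the device-to-phone direction, the Photon can publish an event when its lid sensor detects the box closing, and the app can watch the Particle event stream (delivered as server-sent events) for it. Below is a sketch, in Python for illustration, of parsing one such event off the stream. The `box-closed` event name is our placeholder for whatever the firmware would actually publish.

```python
import json

# Sketch: parsing one server-sent event from the Particle event
# stream. The event name "box-closed" is an illustrative placeholder
# for whatever the firmware publishes when the lid shuts.

def parse_sse_event(raw: str):
    """Turn a raw SSE block (an 'event:' line plus a 'data:' line)
    into (event_name, data_dict)."""
    name, data = None, None
    for line in raw.splitlines():
        if line.startswith("event:"):
            name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data = json.loads(line[len("data:"):].strip())
    return name, data

raw = 'event: box-closed\ndata: {"data":"1","published_at":"2018-01-01T12:00:00Z"}'
name, data = parse_sse_event(raw)
# When name == "box-closed", the app would re-show the AR keypad.
```

The nice property of this split is symmetry: the cloud function carries phone-to-device intent, and the published event carries device-to-phone state, so neither side needs to poll the other.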
Additionally, this prototype is only one iteration. We locked and unlocked a box using an electromagnet, but the possibilities are endless. We can control lights, motors, haptics, projections, and so much more. For example, here is an early proof of concept in which a real LED ring turns on and off with the press of a virtual button.
Looking ahead, we’re excited about how we can push this idea forward. With the new updates to ARKit and ARCore coming soon, we want to turn this into a multi-user experience. We also want to remove the dependence on the QR code by integrating machine learning techniques for image and object detection. That way, the entry point into the experience, currently a QR code, can feel more natural.
Using this technique, we can extend immersive AR to anyone with a smartphone. While there are exciting new sensors, head mounted displays, and other VR equipment in development, their widespread adoption just isn’t there yet. However, millions upon millions of people have a smartphone. By creating experiences like AR-IRL, we can begin to bring immersive, interactive, and memorable mobile AR to the masses (and to our clients’ customers).
Interested in more detail on how we made this? Take a look at the code and electronics on Github. Do you have an interesting idea for how we can use this technology? Let’s discuss! Email me at firstname.lastname@example.org.