The announcement of iOS 12 brought significant enhancements to Apple’s AR initiative. The September 12th event delivered new iPhones but no major surprises (you weren’t expecting AR Glasses this year, were you?), yet everything points to augmented reality being critically important in Cupertino.
With ARKit 2 announced back in June at WWDC and iOS 12 coming later this month, the real game changer is in the technology. Apple’s new A12 Bionic chip will run machine learning tasks up to nine times faster, enabling real-time machine learning and deeply immersive AR experiences. Remarkably, it takes the record as the industry’s first seven-nanometer smartphone processor.
Yes, we hear it all the time – it’s always how we use the technology, not the device itself. But the impact on us is predicated on the tech being there. We’re not yet at augmented reality’s first-iPhone moment. But we’re getting seriously close.
New Apple AR developments
The most exciting Apple AR developments are in the shared experiences and in object recognition and persistence. These are just the foundations of the ways AR will impact society. You can see immediate applications in education, work, and online shopping, but don’t let your imagination get stuck on the obvious.
As Apple described ARKit 2 back at WWDC this summer,
Shared experiences with ARKit 2 make AR even more engaging on iPhone and iPad, allowing multiple users to play a game or collaborate on projects like home renovations. Developers can also add a spectator mode, giving friends and family the best views of AR gameplay from a different iOS device.
Games first, learning will follow. Years ago, chalkboards gave way to smart boards. Not many years from now, we’ll be strategizing how to toss out our smart displays for multiuser AR experiences in the classroom (though hopefully, we can stop using that latter term).
We’re only at the early-stage demos now, but they’re interesting nonetheless. Apple showed off the arcade game Galaga, redesigned with ARKit as a multiplayer augmented reality game. Four people can compete with each other using only their iPhones (it’s unclear if it requires Wi-Fi).
Here’s the short on-stage demo of Galaga on September 12:
Okay, it’s just a game. But when you have that kind of precision in object movement and human interaction, a whole new world of possibilities opens up.
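For developers, shared experiences like the Galaga demo rest on ARKit 2’s new ARWorldMap: one device serializes its map of the surrounding space and sends it to peers, who relocalize into the same coordinate system. A minimal sketch, assuming a MultipeerConnectivity session is already connected (the function names here are illustrative, not Apple’s):

```swift
import ARKit
import MultipeerConnectivity

// Host side: capture the current world map and broadcast it to connected peers.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Guest side: rebuild the received map and run a session anchored in it,
// so both devices share one coordinate space for placing AR content.
func joinSharedSession(_ session: ARSession, with data: Data) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Once both sessions share a map, any ARAnchor added on one device can be transmitted and resolved at the same physical spot on the others – the precision that makes multiplayer AR feel like a shared world.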
AR “Quick Look”
More fascinating is the Apple AR “Quick Look” feature. Here’s the description from UpLoadVR.
One of the most interesting aspects of the update is a new feature called “Quick Look” . . . [which allows] a 3D object to be pulled out of a web page and placed in the real world.
While developers can use this feature too, its integration into the Safari Web browser means developers may be able to take advantage of AR on iPhone without needing to release an app on Apple’s App Store.
That could be a breakthrough feature for AR in the learning environment – sharing resources without having to create an app. Of course, it would have to be cross-browser compatible. And the objects need to go far beyond what you’ll find in Ikea or other furnishing stores.
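On the native side, developers can present the same Quick Look viewer from within an app by handing QuickLook’s QLPreviewController a .usdz model. A minimal sketch (the "chair.usdz" asset name is a placeholder for a model bundled with the app):

```swift
import UIKit
import QuickLook

class ModelViewController: UIViewController, QLPreviewControllerDataSource {
    // Placeholder asset: any .usdz model bundled with the app.
    let modelURL = Bundle.main.url(forResource: "chair", withExtension: "usdz")!

    // Present the system AR Quick Look viewer for the model.
    func presentModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        return modelURL as QLPreviewItem
    }
}
```

The system viewer handles everything from there – object mode, AR placement on detected surfaces, and scaling – with no ARKit code required, which is exactly what makes Quick Look such a low barrier to entry.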
But this is just the beginning. Today, pull an AR object out of a webpage. Tomorrow, we’ll pull an entire environment out of a page. And step into it with our colleagues and friends.
And that’s when it gets truly interesting.
Emory Craig is a writer, speaker, and VR consultant with extensive experience in art, new media, and higher education. He speaks at global conferences on innovation, education, and ethical technology in the future. He has published widely and worked with the US Agency for International Development, the United Nations, and the Organization for Economic Co-operation and Development (OECD). Living at the intersection of learning, games, and immersive storytelling, he is fascinated by AI-based avatars, digital twins, and the ethical implications of blurring the boundaries between the real and the virtual.