Augmented Reality November 23, 2017

Apple's Roadmap for AR (part 2)

Kenny Deriemaeker

AR/VR Competence Lead

AR memory loss 

In part 1 of this blog post, we took a look at mobile AR on our smartphones today, and at Apple's vision for the future of AR: a perfectly intuitive, immersive headset. There's still quite a gap to fill before we're there. A memory gap, to be more specific.

In 2017, Apple released ARKit, its framework for creating mobile AR apps for iPhone and iPad. You might have tried apps built with it, such as IKEA Place and MeasureKit. Both apps reveal one glaring hole in ARKit: its lack of long-term memory.

ARKit can map your surroundings and keep virtual elements fixed in place within them, but closing or even just backgrounding the app causes its map of the real world to be lost, so you have to start over each time. If it could remember and recognise exactly where you are across sessions, a lot of new use cases would become possible: not only would your virtual furniture still be where you left it the day before, you could also share it with others!
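
To make that concrete, here is a minimal sketch of an ARKit session in Swift (class and property names are my own; the API calls are standard ARKit). As of ARKit's 2017 release, there is simply no call to save or restore the world map it builds:

```swift
import UIKit
import ARKit

// Minimal ARKit setup (iOS 11). Note that nothing here can be persisted:
// the world map ARKit builds lives only in memory for this session.
class ARMemoryDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Every run starts mapping the environment from scratch.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pausing (e.g. when the app goes to the background) throws away the map;
        // there is no API to save it and pick up where you left off.
        sceneView.session.pause()
    }
}
```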


Realtime, multi-user AR

Realtime, multi-user AR is unexplored territory, but promises very compelling new use cases for all kinds of apps and games.

If both your phone and your friend’s phone have a map of a certain environment, you can use it to localise each other within it and you could both track and manipulate the same content in AR. You could both drag virtual furniture around in real time, for instance. Or you could play a multiplayer game together, each with a different viewpoint. The list of possibilities goes on. 
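What would sharing look like in practice? A rough, hypothetical sketch: if both devices could express positions in a common, persistent coordinate space, they would only need to exchange small messages like the one below. All names here are made up; the missing ingredient is precisely that shared map.

```swift
import Foundation

// Hypothetical payload two phones could exchange once they share a map:
// each virtual object is identified and positioned in a common coordinate
// space that both devices can recognise.
struct SharedARObject: Codable {
    let objectID: UUID       // e.g. one specific virtual couch
    let modelName: String    // which 3D asset to render
    let transform: [Float]   // 4x4 world transform, flattened column-major
}

// Broadcasting a move to the other participants; the transport
// (Multipeer Connectivity, WebSockets, ...) is left to the app.
func broadcast(_ object: SharedARObject, send: (Data) -> Void) throws {
    let data = try JSONEncoder().encode(object)
    send(data)
}
```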

Unfortunately, realtime multi-user AR is also a very hard technical problem to solve.


The point cloud

Google came up with their solution a few years ago in Project Tango, their early work with mobile AR. It was years ahead of everyone else, but relied on bulky and expensive hardware that never became mainstream. Tango was discontinued, and its core systems — those that don’t require specialised sensor hardware — were rolled into ARCore.

Tango used a technique Google calls Area Learning. To estimate its movement in physical space, your device tracks features in the environment and builds a virtual point cloud. You can visualise this as hundreds of virtual dots on the walls, floors and objects in your room: not quite detailed enough to create a 3D model, but enough to use as a rough map.
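
You can actually peek at ARKit's own (small, session-only) point cloud today: every ARFrame exposes the feature points it is currently tracking. A short sketch:

```swift
import ARKit

// Inspect the sparse point cloud ARKit is tracking right now.
// These points exist only in memory and only for the current session.
func inspectPointCloud(of session: ARSession) {
    guard let frame = session.currentFrame,
          let pointCloud = frame.rawFeaturePoints else { return }

    print("Tracking \(pointCloud.points.count) feature points")
    for point in pointCloud.points.prefix(5) {
        // Each point is a position in world space, in metres.
        print(String(format: "x: %.2f  y: %.2f  z: %.2f", point.x, point.y, point.z))
    }
}
```

If you just want to see the dots on screen, ARSCNView can also render them directly by adding ARSCNDebugOptions.showFeaturePoints to its debugOptions.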

ARKit keeps a small point cloud in memory, but Project Tango thinks bigger: it records and stores large point clouds in a visual database of an area, so it can remember what that place looks like and quickly recognise it later. With Tango, you can put a virtual couch down in your living room and find it in the exact same place a week later, or walk into a supermarket and get precise indoor navigation to your favourite shampoo.
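
ARKit has no counterpart to Tango's stored area descriptions, but a naive sketch of the idea is easy to imagine: snapshot those feature points and write them to disk. The hard part, and the part ARKit currently gives you no help with, is recognising that place and relocalising against the stored data later. The function below is purely illustrative.

```swift
import ARKit

// Hypothetical, naive "area snapshot": dump the current feature points to disk.
// Tango could relocalise against stored data like this; ARKit (as of 2017)
// offers no way to load such a snapshot back and recognise the place.
func saveAreaSnapshot(from frame: ARFrame, to url: URL) throws {
    guard let cloud = frame.rawFeaturePoints else { return }
    let flattened = cloud.points.flatMap { [$0.x, $0.y, $0.z] }
    let data = flattened.withUnsafeBufferPointer { Data(buffer: $0) }
    try data.write(to: url)
}
```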


Google's Visual Positioning System

Mapping and remembering places in AR doesn't necessarily require special hardware, but it does require smart algorithms, fast processing and a lot of (cloud) storage. Google is working on their Visual Positioning System: a map of the entire world, indoors and outdoors, built from a network of point clouds, comparable to Google Maps. Apple's long-term plans undoubtedly involve something similar, though maybe not quite as ambitious at first.

Once this “AR cloud” is in place to map and recognise places in the world, our apps and platforms can start associating AR content with very precise real-world locations. Augmented reality experiences can then be persistent and shared with others in realtime, linking the physical world with the digital world like never before. 
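
What might an entry in such an AR cloud look like? A purely hypothetical data model (all field names are mine), combining a coarse geographic position, used to find nearby maps, with a precise pose inside one of those point-cloud maps:

```swift
import Foundation
import CoreLocation

// Hypothetical record an "AR cloud" backend might store: a piece of content
// pinned to a rough geographic location plus a precise pose within the
// point-cloud map of that place.
struct CloudAnchoredContent: Codable {
    let contentID: UUID
    let assetURL: URL                 // the 3D model or scene to display
    let latitude: CLLocationDegrees   // coarse position, for finding nearby maps
    let longitude: CLLocationDegrees
    let mapID: String                 // which stored point-cloud map this belongs to
    let localTransform: [Float]       // precise 4x4 pose inside that map
}
```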

Add machine learning, object and face recognition and the Internet of Things to the mix and AR’s capabilities suddenly expand dramatically, all without needing to upgrade your phone.
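
Some of that is already within reach on today's hardware. For example, ARKit's camera frames can be fed straight into Vision and a Core ML image classifier; a small sketch follows, where the model is assumed to be any Core ML classifier you have bundled with your app:

```swift
import ARKit
import Vision

// Classify what the AR camera currently sees using a Core ML model.
// The VNCoreMLModel is assumed to be created elsewhere from any bundled classifier.
func classifyCurrentFrame(in session: ARSession, using model: VNCoreMLModel) {
    guard let frame = session.currentFrame else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Saw \(best.identifier) (confidence \(best.confidence))")
    }

    // ARKit hands us each camera frame as a CVPixelBuffer we can pass to Vision.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}
```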


A whole new AR world

Working out the algorithms and putting the infrastructure in place to realise this vision will take time and effort, and some tech companies will be better-equipped to do it than others. 

But where Apple undoubtedly excels is end-to-end product design, vertical integration and user experience — skills that will be instrumental in taking augmented reality beyond our smartphone and tablet screens.

Apple's immersive and intelligent AR headset is still at least a few years away. Will this wearable be the revolution that replaces the iPhone? It might just be. But one thing's for sure: our future reality will be augmented.



AR Discovery Sprint

In just a few weeks' time, we pinpoint the most beneficial AR opportunity for your business, build a prototype and test its viability. By the end, you'll be ready to move forward with AR. 

Discover VR