Unlocking AR collaboration by creating consistent virtual environments
Most of today's AR experiences are limited to a single participant. Making AR social and collaborative is challenging due to inconsistent tracking of physical spaces. For example, if one participant places a virtual vase on a physical table, that vase may appear to float in midair to other participants.
While there are approaches that attempt to compare individual tracking information to reach a consensus, the process can be slow and error-prone. With AR Together, we use the sensors on modern mobile devices combined with sensor-fusion algorithms to predict each device's perspective relative to the others. This relative location information lets multiple devices reach consensus on the tracking of a physical space faster and more precisely, resulting in a more seamless multiplayer collaborative AR experience.
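To make the consensus idea concrete, here is a minimal sketch of one standard sensor-fusion step: combining several devices' noisy estimates of a shared anchor's position using inverse-covariance (information-form) weighting, so more certain devices contribute more. The function and variable names are illustrative assumptions for this sketch, not AR Together's actual API, and a production system would fuse full 6-DoF poses rather than positions.

```python
import numpy as np

def fuse_estimates(estimates):
    """Fuse per-device position estimates of a shared anchor.

    Each estimate is a (mean, covariance) pair in a common reference
    frame. Inverse-variance weighting is a standard sensor-fusion
    step; names here are illustrative, not a real AR Together API.
    """
    info = np.zeros((3, 3))   # accumulated information matrix
    info_vec = np.zeros(3)    # accumulated information vector
    for mean, cov in estimates:
        w = np.linalg.inv(cov)       # information = inverse covariance
        info += w
        info_vec += w @ mean
    fused_cov = np.linalg.inv(info)  # fused uncertainty shrinks
    return fused_cov @ info_vec, fused_cov

# Two devices observe the same virtual object in a shared frame:
a = (np.array([1.00, 0.02, 0.50]), np.eye(3) * 0.04)  # less certain
b = (np.array([1.10, 0.00, 0.48]), np.eye(3) * 0.01)  # more certain
fused_pos, fused_cov = fuse_estimates([a, b])
```

The fused position lands closer to the more confident device's estimate, and the fused covariance is smaller than either input's, which is what lets a group of devices converge on a consistent placement faster than any single device tracking alone.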