We are excited to announce that we now support the Meta 2 Augmented Reality headset.

In addition to the core analytics suite we offer for all VR / AR devices, we have created a new feature built on the spatial sensors available on the Meta 2. This feature captures real-time spatial context about the AR session, enabling our customers to get a full report on how their digital experiences are being used in relation to the real world.
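As a rough sketch of the idea, a single spatial-context sample could be modeled as a timestamped batch of colored points. All names below are hypothetical and the real SDK payload may differ:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical shape of one spatial-context sample; the shipping
# SDK payload may look different.
@dataclass
class SpatialPoint:
    position: Tuple[float, float, float]  # meters, headset-relative
    color: Tuple[int, int, int]           # RGB, 0-255

@dataclass
class SpatialSnapshot:
    timestamp: float             # seconds since session start
    points: List[SpatialPoint]   # one sensor sweep worth of points

def record_snapshot(session: List[SpatialSnapshot],
                    snapshot: SpatialSnapshot) -> None:
    """Append one sweep of spatial data to the session log."""
    session.append(snapshot)
```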

We expect the initial users of this feature to be in the training and simulation space, where operators can map training sessions to the real-world environments that their users and employees are seeing.

Here is an image of the feature:

As you can see from this image, we have ingested the live, colored spatial data and rebuilt the user's environment inside our SceneExplorer session viewer. It is immediately clear that the digital car we spawned into this user's field of view is actually blocked by a wall in their physical space, giving instant feedback to a developer or operator.
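The occlusion feedback shown here can be approximated with a simple line-of-sight test against the captured points: if enough points fall near the segment between the viewer and a spawned object, the object is likely blocked. A minimal sketch, not our shipping implementation, with illustrative thresholds:

```python
import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def _point_to_segment_distance(p: Vec3, a: Vec3, b: Vec3) -> float:
    """Distance from point p to the line segment a-b."""
    ab = tuple(b[i] - a[i] for i in range(3))
    ap = tuple(p[i] - a[i] for i in range(3))
    ab_len2 = sum(c * c for c in ab)
    if ab_len2 == 0.0:
        return math.dist(p, a)
    t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / ab_len2))
    closest = tuple(a[i] + t * ab[i] for i in range(3))
    return math.dist(p, closest)

def is_occluded(viewer: Vec3, obj: Vec3, cloud: List[Vec3],
                radius: float = 0.05, min_hits: int = 20) -> bool:
    """True if enough captured points sit between the viewer and the object."""
    hits = sum(1 for p in cloud
               if _point_to_segment_distance(p, viewer, obj) < radius)
    return hits >= min_hits
```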

Additionally, we have sorted spatial data into two buckets, *temporary* and *permanent*. This allows us to distinguish momentary changes in a user's spatial context, such as hand gestures or other people walking by, from permanent spatial fixtures such as a table or wall.
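One simple way to implement this split is by persistence: voxelize the point cloud and treat regions that show up consistently across many snapshots as permanent, and short-lived regions as temporary. A hedged sketch, where the voxel size and persistence threshold are illustrative rather than our production values:

```python
from collections import Counter
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

def _voxel_key(p: Vec3, size: float) -> Tuple[int, int, int]:
    """Snap a point to the integer grid cell that contains it."""
    return tuple(int(c // size) for c in p)

def classify_voxels(snapshots: List[List[Vec3]],
                    voxel_size: float = 0.1,
                    permanence_ratio: float = 0.8) -> Dict[Tuple[int, int, int], str]:
    """Label each occupied voxel 'permanent' or 'temporary'.

    A voxel seen in at least `permanence_ratio` of snapshots is
    assumed to be a fixture (wall, table); anything else is treated
    as transient (hands, passers-by).
    """
    seen_in = Counter()
    for snap in snapshots:
        for key in {_voxel_key(p, voxel_size) for p in snap}:
            seen_in[key] += 1
    threshold = permanence_ratio * len(snapshots)
    return {key: ("permanent" if count >= threshold else "temporary")
            for key, count in seen_in.items()}
```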

This spatial data grows denser as the session goes on, and we will eventually start building a 3D mesh from it to give even clearer context to augmented and mixed reality sessions. Here is an example of a mesh we have created from a user session:
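We have not committed to a particular reconstruction method yet, but as an illustration, off-the-shelf libraries such as Open3D can already turn a dense point cloud into a triangle mesh. A sketch using Poisson surface reconstruction, with illustrative parameters:

```python
import numpy as np
import open3d as o3d  # pip install open3d

def mesh_from_points(points: np.ndarray) -> o3d.geometry.TriangleMesh:
    """Build a triangle mesh from an (N, 3) array of captured points."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points)
    # Poisson reconstruction needs oriented normals on the cloud.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)
    return mesh
```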

In addition to building a mesh, in the future we could transfer the color data from the point cloud onto the mesh as vertex colors, thereby creating a fully lifelike recreation of a user's spatial context.
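A straightforward way to do this is a nearest-neighbor lookup: each mesh vertex takes the color of the closest point in the cloud. A minimal sketch, again not our production pipeline, using SciPy:

```python
import numpy as np
from scipy.spatial import cKDTree

def transfer_vertex_colors(vertices: np.ndarray,
                           points: np.ndarray,
                           point_colors: np.ndarray) -> np.ndarray:
    """Give each mesh vertex the color of its nearest captured point.

    vertices:     (V, 3) mesh vertex positions
    points:       (N, 3) point cloud positions
    point_colors: (N, 3) RGB colors in [0, 1], aligned with `points`
    """
    _dists, nearest = cKDTree(points).query(vertices)
    return point_colors[nearest]  # (V, 3) per-vertex colors
```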

We imagine this will be incredibly useful for our existing enterprise customers who use our technology for training, simulations, or retail testing. As we roll out more features, we expect a user's spatial context to become a vital layer of information in determining the efficacy of an AR or MR session.

If you would like to learn more about our tooling, or would like to request a demo, please reach out via our contact page.
