Introducing Cognitive3D Analytics for Android XR: Spatial Insight for Immersive Experiences Built on Android
We’re excited to announce the first generally available release of the Cognitive3D analytics SDK for Android XR, bringing powerful spatial analytics to the next generation of Android-based XR devices.
As Android XR opens the door to new headsets and form factors, teams need better ways to understand how people actually use spatial applications. This release makes it possible to measure attention, interaction, and behaviour inside native Android XR experiences, using the same analytics principles Cognitive3D is known for across XR platforms.
What can you do with the Android XR SDK?
At a high level, the Android XR SDK lets you understand user behaviour in 3D space, not just traditional button clicks or screen taps.
Here’s what that looks like in practice.
Capturing Meaningful Behaviour in Android XR
At the heart of the Android XR SDK are the flexible ways we capture what’s happening inside immersive experiences: custom session properties, custom events, custom sensors, and dynamic objects.
Together, they give teams a complete picture of user behaviour in 3D space.
Custom Session & Participant Properties: Understanding Who & What
Custom session and participant properties provide persistent context for every XR session.
Participant properties describe who the user is, such as their role, experience level, or organization.
Session properties describe what kind of experience they’re in, such as the app mode, workspace, or collaboration setup.
Together, these properties create a foundation that applies across the entire session, so every interaction, trend, and object engagement can be viewed in the right context.
This makes it possible to answer questions like:
- Do different user roles behave differently?
- How does collaboration change engagement?
- Which environments or modes lead to better outcomes?
By separating who the user is from what experience they’re in, teams gain clarity without duplicating data across every interaction.
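To make the separation concrete, here is a minimal sketch of the idea: participant properties carry who the user is, session properties carry what experience they're in, and every later interaction inherits that context instead of duplicating it. The class and method names here are illustrative assumptions for this post, not the actual Cognitive3D API.

```java
// Hypothetical model of participant vs. session properties.
// Names are illustrative assumptions, not the real SDK's API.
import java.util.Map;

public class SessionContextSketch {
    record ParticipantProperties(String role, String experienceLevel, String organization) {}
    record SessionProperties(String appMode, String workspace, boolean collaborative) {}

    record AnalyticsSession(ParticipantProperties participant, SessionProperties session) {
        // Every interaction inherits this context without duplicating it per event.
        Map<String, String> contextFor(String eventName) {
            return Map.of(
                "event", eventName,
                "role", participant.role(),        // who the user is
                "appMode", session.appMode()       // what experience they're in
            );
        }
    }

    public static void main(String[] args) {
        var s = new AnalyticsSession(
            new ParticipantProperties("trainee", "novice", "AcmeCorp"),
            new SessionProperties("training", "assembly-line", true));
        System.out.println(s.contextFor("tool_used"));
    }
}
```

Because the two property sets are distinct, "do trainees behave differently in collaborative mode?" becomes a straightforward cross-filter rather than a per-event lookup.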
Custom Events: Tracking Important Moments
Custom events are used to capture the actions and milestones that matter most to your product: key moments inside an Android XR experience.
These might include:
- Using a specific tool or feature
- Interacting with a 3D object
- Completing a task or workflow
- Sharing content with others
- Encountering an error or friction point
What makes these events powerful is that they can include additional context, such as what was interacted with, how it was used, or the state of the experience at that moment. This allows teams to move beyond simple usage counts and understand how and why interactions happen.
In short, custom events help answer questions like:
- Which features are actually being used?
- Where do users struggle or drop off?
- What actions lead to successful outcomes?
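The idea of an event enriched with context can be sketched as follows. This is a simplified stand-in to illustrate the shape of the data, not the SDK's actual event API; the names and the builder-style `with` method are assumptions.

```java
// Hypothetical sketch of a custom event carrying contextual properties
// alongside its name and 3D position. Illustrative only.
import java.util.HashMap;
import java.util.Map;

public class CustomEventSketch {
    static class CustomEvent {
        final String name;
        final float[] position;                  // where in 3D space it happened
        final Map<String, Object> properties = new HashMap<>();

        CustomEvent(String name, float x, float y, float z) {
            this.name = name;
            this.position = new float[] { x, y, z };
        }

        CustomEvent with(String key, Object value) {
            properties.put(key, value);          // extra context: what, how, state
            return this;
        }
    }

    public static void main(String[] args) {
        // A "task completed" milestone, enriched with how it was completed.
        CustomEvent event = new CustomEvent("task_completed", 0.4f, 1.2f, -2.0f)
            .with("tool", "measuring_tape")
            .with("durationSeconds", 42)
            .with("errors", 0);
        System.out.println(event.name + " " + event.properties);
    }
}
```

The attached properties are what turn a raw count ("task_completed fired 500 times") into an answer ("tasks completed with this tool take half as long").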
Custom Sensors: Understanding Trends Over Time
Not all insights come from single moments. Some behaviours only become clear when you look at how things change over time.
Custom sensors are designed for tracking these ongoing signals, such as:
- How complex a workspace becomes during a session
- How many objects or participants are present over time
- How active or engaged a session is from start to finish
- Environmental or performance-related metrics
By capturing these trends, teams can see patterns that aren't visible from events alone: for example, whether increasing complexity leads to disengagement, or how long users remain productive in immersive environments.
Custom sensors turn raw activity into timelines of behaviour, helping teams understand the full arc of an XR session.
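Conceptually, a custom sensor is a named time series: timestamped values sampled over a session rather than isolated moments. The sketch below models that idea; the class and aggregation methods are assumptions for illustration, not the SDK's sensor API.

```java
// Hypothetical sketch: a custom sensor as a named time series.
// The point is that values form a timeline, not isolated events.
import java.util.ArrayList;
import java.util.List;

public class CustomSensorSketch {
    static class CustomSensor {
        final String name;
        final List<double[]> samples = new ArrayList<>(); // {timestampSeconds, value}

        CustomSensor(String name) { this.name = name; }

        void record(double timestampSeconds, double value) {
            samples.add(new double[] { timestampSeconds, value });
        }

        double peak() {
            return samples.stream().mapToDouble(s -> s[1]).max().orElse(0.0);
        }

        double average() {
            return samples.stream().mapToDouble(s -> s[1]).average().orElse(0.0);
        }
    }

    public static void main(String[] args) {
        CustomSensor objectCount = new CustomSensor("objects_in_workspace");
        objectCount.record(0.0, 3.0);   // session start: simple scene
        objectCount.record(30.0, 8.0);  // mid-session: workspace grows
        objectCount.record(60.0, 5.0);  // later: some objects removed
        System.out.printf("peak=%.1f avg=%.2f%n", objectCount.peak(), objectCount.average());
    }
}
```

Any of the signals above, workspace complexity, participant count, engagement level, fits this same timestamp-plus-value shape.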
Dynamic Objects: Analyzing Interaction with 3D Content
Dynamic objects are used to track interactive 3D content inside an Android XR experience: the models, tools, or assets that users move around, inspect, and engage with.
This makes it possible to understand:
- Which objects attract the most attention
- How users move around and interact with content
- Which 3D assets are central to the experience and which are ignored
- How object placement or scale affects engagement
Dynamic object tracking is especially valuable in use cases like training, design review, data visualization, and collaborative workspaces, where 3D content is the core of the experience.
By analyzing object-level interaction, teams gain insights that simply aren’t possible with traditional analytics.
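One way to picture object-level analysis: aggregate attention samples per tracked object to rank which 3D assets draw the most engagement. The sketch below assumes a simple gaze-sample shape; both the record layout and the function names are hypothetical, chosen only to illustrate the aggregation.

```java
// Hypothetical sketch: aggregating per-object engagement from gaze samples
// to see which dynamic objects attract the most attention.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DynamicObjectSketch {
    record GazeSample(String objectId, double durationSeconds) {}

    // Sum attention time per object id.
    static Map<String, Double> engagementByObject(List<GazeSample> samples) {
        return samples.stream().collect(Collectors.groupingBy(
            GazeSample::objectId,
            Collectors.summingDouble(GazeSample::durationSeconds)));
    }

    public static void main(String[] args) {
        List<GazeSample> samples = List.of(
            new GazeSample("engine_model", 12.5),
            new GazeSample("toolbox", 3.0),
            new GazeSample("engine_model", 7.5));
        // Rank objects by total attention received.
        engagementByObject(samples).entrySet().stream()
            .sorted(Map.Entry.<String, Double>comparingByValue().reversed())
            .forEach(e -> System.out.println(e.getKey() + ": " + e.getValue() + "s"));
    }
}
```

The same per-object roll-up answers the placement and scale questions above: re-run it after moving or resizing an asset and compare the totals.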
A Complete Picture of Spatial Behaviour
When used together:
- Custom participant and session properties provide context about who the user is and what experience they are in
- Custom events capture what happens during an experience
- Custom sensors show how behaviour changes over time
- Dynamic objects reveal how users engage with 3D content
This combination gives teams a deeper, more accurate understanding of how people use Android XR applications and enables better design decisions, more effective features, and improved immersive experiences.
Support Collaborative and Spatial Workflows
The SDK is well-suited for multi-user and shared spatial experiences, such as collaborative workspaces.
Teams can analyze:
- How users collaborate in shared spaces
- Which tools are used most during collaboration
- How group dynamics affect engagement and outcomes
This helps product teams refine both UX and collaboration features based on real usage.
Designed for Android XR Developers
This release is built specifically for native Android XR workflows, giving developers a way to add analytics without disrupting performance or immersion.
It also fits naturally into Android XR pipelines, supporting modern spatial apps as the Android XR ecosystem continues to grow.
Why This Matters
As XR experiences become more complex, traditional analytics fall short. The Android XR SDK gives teams a way to:
- Validate design decisions with real user behaviour
- Improve usability and comfort in spatial apps
- Measure the success of immersive features
- Make data-informed decisions as Android XR evolves
This first release marks an important step in bringing true spatial analytics to Android XR, and we're excited to see what developers build with it.