Jetliner XR

Taking Flight with Android XR: Building a Jetliner Demo
I spent the last few days exploring Android XR, Google's latest platform for extended reality applications. After working through the documentation and experimenting with the SDK, I built a demo application featuring an A320neo jetliner in a holding pattern. The project served as a practical way to understand the spatial computing capabilities now available to Android developers. In the video below, you’ll see a typical interaction on an XR device: launching the demo app, navigating from Home Space to Full Space mode, and looking around the 3D environment.
Android XR: Spatial Computing on Android 🌌
Android XR represents Google's renewed commitment to extended reality for the masses, developed in collaboration with Samsung and Qualcomm. The platform extends the familiar Android development environment into three-dimensional space, allowing developers to build applications that blend digital content with physical environments.
The platform provides three core capabilities that shaped my approach to the jetliner demo:
Developers can spatialize their applications, positioning UI elements in 3D space rather than constraining them to a traditional 2D screen.
SceneCore enables loading and manipulating 3D models with straightforward APIs.
ARCore integration allows applications to understand and interact with the physical environment around the user.
Samsung's Project Moohan (a.k.a. Galaxy XR?) launches tomorrow, October 21st, at the "Worlds Wide Open" event. This marks the first commercial headset running Android XR, giving developers actual hardware to target beyond the currently available emulator environment.
Prerequisites 🧰
I built the demo on Android Studio Narwhal 4 Feature Drop | 2025.1.4. The emulator was an Android 14 (“UpsideDownCake”) arm64-v8a google-xr headset image, API level 34, with a 12GB disk — see these instructions on how to set it up. These are the versions of the XR dependencies used:
androidx-compose = { group = "androidx.xr.compose", name = "compose", version = "1.0.0-alpha07" }
androidx-runtime = { group = "androidx.xr.runtime", name = "runtime", version = "1.0.0-alpha06" }
androidx-scenecore = { group = "androidx.xr.scenecore", name = "scenecore", version = "1.0.0-alpha07" }
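Assuming the aliases above live in the project's version catalog (gradle/libs.versions.toml), wiring them into a module is one line each. A sketch of the module build script (the module name is an assumption):

```kotlin
// app/build.gradle.kts — pull in the XR artifacts declared in the version catalog
dependencies {
    implementation(libs.androidx.compose)   // Jetpack Compose for XR (Subspace, Volume, SpatialPanel, Orbiter)
    implementation(libs.androidx.runtime)   // XR runtime (Session, math types like Pose)
    implementation(libs.androidx.scenecore) // SceneCore (glTF model loading, entities)
}
```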
Jetpack Compose for XR: Extending Familiar Patterns 🎨
What made Android XR approachable was how it leveraged existing Jetpack Compose knowledge. I have been writing Compose UI for a couple of years now, and the XR SDK introduces spatial extensions that felt natural to adopt. The learning curve focused on spatial thinking rather than on a completely new framework. Here are a few spatial composables and their use in the demo app:
Subspace: This composable creates a partition of 3D space for spatial layouts. Subspaces are only rendered when spatialization is enabled, i.e., in Full Space mode. This capability allowed me to structure my application for both traditional and spatial environments without maintaining separate codebases or modules. For example, here’s my main layout:
@Composable
fun MainScreen() {
    The3DContent() // Contains a Subspace, thus only rendered in Full Space mode
    The2DContent() // Rendered in Home Space mode
}

@Composable
fun The3DContent() {
    Subspace {
        Jetliner()
    }
}
Volume: The Volume composable integrates SceneCore entities directly into the Compose hierarchy. This allows me to add and position the 3D jetliner model. The Volume acts as a container for 3D content while maintaining the declarative patterns of Compose.
@Composable
fun Jetliner() {
    ...
    Volume(modifier = modifier) { parent ->
        jetlinerEntity.parent = parent
    }
}
SpatialPanel: SpatialPanels are used to display 2D content in 3D space. The XR codelabs have a good example of using these panels.
Orbiter: This composable attaches UI elements to SpatialPanels, creating companion interfaces that follow panels through space. The orbiter is the ideal choice for contextual UI that needs to remain accessible regardless of panel position.
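A minimal sketch combining the two, based on the patterns in the XR codelabs. The panel size, offset, and content here are illustrative assumptions, and the exact parameter shapes may shift between alpha releases:

```kotlin
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.EdgeOffset
import androidx.xr.compose.spatial.Orbiter
import androidx.xr.compose.spatial.OrbiterEdge
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun PanelWithOrbiter() {
    Subspace {
        // A 2D surface positioned in 3D space; users can grab and reposition it
        SpatialPanel(
            modifier = SubspaceModifier.width(1024.dp).height(640.dp).movable().resizable()
        ) {
            Text("2D content floating in 3D space")
            // The orbiter hugs the panel's bottom edge and follows it through space
            Orbiter(position = OrbiterEdge.Bottom, offset = EdgeOffset.inner(24.dp)) {
                Button(onClick = { /* contextual action */ }) { Text("Action") }
            }
        }
    }
}
```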
There are a few other Spatial-prefixed composables available in the SDK for creating immersive experiences, e.g. SpatialRow, SpatialCurvedRow, SpatialColumn, SpatialBox, SpatialDialog, SpatialPopup, and SpatialExternalSurface.
You can find the complete Jetliner XR project at github.com/charlesmuchene/jetliner-xr, demonstrating how some of these composables work together in a functional application.
The Jetliner’s Holding Pattern Implementation ✈️
The core of the application lives in ui/composables/Jetliner.kt. I loaded an A320neo 3D model — sourced from "A320neo" (https://skfb.ly/oKTUt) by pranav27 on Sketchfab under Creative Commons Attribution. Loading models in Android XR uses SceneCore APIs that support glTF resources. After loading, I positioned the model within a Volume. To demo spatial capabilities, the model needed to move continuously, mimicking a holding pattern: the elliptical flight path that pilots execute while awaiting landing clearance — I watch Mentour Pilot 👨✈️.
@Composable
fun Jetliner() {
    val modelEntity = loadModelEntity()
    AnimateModelEntity(modelEntity)
    Volume(modifier = modifier) { ... }
}
I implemented the movement animation using Jetpack Compose’s animation APIs and SceneCore's entity transform system. The animation updates the Jetliner’s position and rotation on each frame, calculating new coordinates along an elliptical path.

I control the two ellipse radii (a, b), the speed of travel (set by the animation duration), and the banking angle of the aircraft. The animation loop runs continuously, providing smooth movement visible from any viewing angle. After that, it’s a matter of fine-tuning these parameters to create an almost realistic holding pattern. Users wearing an XR device can walk around the virtual environment and marvel at the Jetliner’s motion.
@Composable
private fun AnimateModelEntity(modelEntity: GltfModelEntity?) {
    val angle = animatedAngle()
    val pose = calculatePose(angle)
    modelEntity?.setPose(pose)
}
@Composable
private fun animatedAngle(): Float {
    // Returns an angle between 0 and 360, looping every 12s with linear easing
    val transition = rememberInfiniteTransition(label = "holding-pattern")
    return transition.animateFloat(
        initialValue = 0f,
        targetValue = 360f,
        animationSpec = infiniteRepeatable(tween(durationMillis = 12_000, easing = LinearEasing)),
        label = "angle",
    ).value
}
private fun calculatePose(angle: Float): Pose {
    // Convert to radians
    val radians = angle * (PI.toFloat() / 180f)

    // Calculate elliptical position using parametric equations
    val x = radiusX * cos(radians)
    val z = radiusZ * sin(radians)

    // Add vertical oscillation and offset
    val hover = sin(radians * 6) * 0.2f
    val position = Vector3(x, 6f + hover, z - 15f)

    // Calculate forward direction (tangent to the ellipse)
    val forward = Vector3(-radiusX * sin(radians), 0f, radiusZ * cos(radians)).toNormalized()

    // Apply banking angle (varies 15–25° through the turn)
    val bankAngle = 20f + 5f * cos(radians * 2)
    val up = Quaternion.fromAxisAngle(forward, bankAngle) * Vector3.Up

    // Combine into an orientation (Vector3, Quaternion, Pose come from androidx.xr.runtime.math)
    return Pose(position, Quaternion.fromLookTowards(forward, up))
}
Building for Spatial Computing 🏗️
Android XR lowers the barrier for developers to explore spatial computing. The platform provides first-class tools: Jetpack Compose for XR, Material Design for XR, and SceneCore, all built on languages and patterns already familiar to the Android community. This approach differs significantly from traditional XR development, which typically required game engine expertise.
My jetliner demo was a straightforward implementation, but it validated the development experience. The time from concept to working 3D application was compressed compared to what I expected. Official documentation provided a good starting point, and the Compose-based APIs felt intuitive.
Tomorrow's Project Moohan launch transitions Android XR from emulator-based development to physical hardware. This shift matters — testing spatial experiences on actual headsets reveals usability considerations that emulators cannot fully capture. For example, I would like to walk around the environment, experience raycasting, scale models, and move panels with my hands. The availability of commercial hardware should accelerate both developer experimentation and practical application development.
For developers considering Android XR, the existing Android skillset translates directly. The spatial aspects add a new dimension, literally 😉, but the underlying development patterns remain consistent. Starting with small experiments, like positioning a simple 3D object or creating a floating panel, provides hands-on learning without overwhelming complexity.
Gotchas 🫨
Controlling your perspective in the 3D environment on the emulator takes some time to get used to. There are only four controls (interaction, view direction, movement, forward/backward), but I constantly forgot to switch between them to get the one I wanted. To be fair, each control has a handy keyboard shortcut, which makes switching much faster once learned.
There’s a non-critical but spammy log from the XR SDK printed on every render frame. You can filter it out of logcat with a pattern like:
-message: "Attempt to remove non-JNI local reference"
2025-10-19 10:48:04.708 com.charlesmuchene.xr W Attempt to remove non-JNI local reference
My Jetliner animation was not as smooth as I expected. I used the code snippet below to measure the animation FPS; my logs showed anything between 18 and 27 fps. It could be due to the emulated environment I was running the project on or the size of my Jetliner model. On dedicated XR hardware, I expect smoother animations for reasonably sized models.
val fpsState = remember { mutableStateOf("Calculating FPS...") }
val frameCount = remember { mutableIntStateOf(0) }
val lastTime = remember { mutableLongStateOf(System.currentTimeMillis()) }

// Log the FPS every second
LaunchedEffect(Unit) {
    while (true) {
        kotlinx.coroutines.delay(1000) // Wait for 1 second
        val currentTime = System.currentTimeMillis()
        val elapsed = (currentTime - lastTime.longValue) / 1000f
        val fps = frameCount.intValue / elapsed
        fpsState.value = String.format(Locale.getDefault(), "FPS: %.1f", fps)
        Log.d("Animation-FPS", fpsState.value)
        // Reset for the next second
        lastTime.longValue = currentTime
        frameCount.intValue = 0
    }
}
// TODO: Increment frame count on each calculation of the pose
Conclusion 🚀
The platform is ready. The tools are in alpha. The first XR consumer device launches tomorrow. For those interested in spatial computing, this represents an accessible entry point built on established technologies. What will you build?
Happy coding! 😎
Resources: Android XR documentation | Jetliner XR project on GitHub
