What is Remote Compose, and how can you leverage it to build server-driven UI?
Building dynamic user interfaces has long been a fundamental challenge in Android development. The traditional approach requires recompiling and redeploying the entire application whenever the UI needs to change, a process that creates significant friction for A/B testing, feature flags, and real-time content updates. Consider a scenario where your marketing team wants to test a new checkout button design: in the traditional model, this simple change requires developer time, code review, QA testing, app store submission, and weeks of waiting for user adoption. Remote Compose (also called Compose Remote) emerges as a powerful solution to this problem, enabling developers to create, transmit, and render Jetpack Compose UI layouts at runtime without any recompilation.
In this article, you'll explore what Compose Remote is, understand its core architecture, and discover the benefits it brings to dynamic screen design with Jetpack Compose. This isn't a tutorial on using the library; it's an exploration of the paradigm shift it represents for Android UI development.
Understanding the core abstraction: What makes Compose Remote special
At its heart, Compose Remote is a framework that enables remote rendering of Compose UI components. What distinguishes it from traditional UI approaches is its adherence to two fundamental principles: declarative document serialization and platform-independent rendering.
Declarative document serialization
Declarative document serialization means you can capture any Jetpack Compose layout into a compact, serialized format. Think of it like taking a "screenshot" of your UI, except instead of pixels, you're capturing the actual drawing instructions. This captured document contains everything needed to recreate the UI: shapes, colors, text, images, animations, and even interactive touch regions.
// On the server or creation side
val document = captureRemoteDocument(
    context = context,
    creationDisplayInfo = displayInfo,
    profile = profile
) {
    // Standard Compose UI - looks exactly like regular Compose code
    Column(modifier = RemoteModifier.fillMaxSize()) {
        Text("Dynamic Content")
        Button(onClick = { /* action */ }) {
            Text("Click Me")
        }
    }
}
// Result: A ByteArray that can be sent over the network
The beauty of this approach is that the creation side writes standard Compose code. There's no new DSL to learn, no JSON schema to maintain, no template language to master. If you can write it in Compose, you can capture it with Compose Remote.
Platform-independent rendering
Platform-independent rendering means the captured document can be transmitted over the network and rendered on any Android device without needing the original Compose code. The client device doesn't need your composable functions, your view models, or your business logic; it just needs the document bytes and a player.
// On the client or player side
RemoteDocumentPlayer(
    document = remoteDocument.document,
    documentWidth = windowInfo.containerSize.width,
    documentHeight = windowInfo.containerSize.height,
    onAction = { actionId, value ->
        // Handle user interactions
    }
)
These properties aren't just conveniences; they're architectural constraints that enable true decoupling of UI definition from deployment. The document format captures not just static layouts but also state, animations, and interactions, making it a complete representation of the UI experience.
Comparing approaches: Why not JSON or WebViews?
Before diving deeper, it's worth understanding why Compose Remote takes this approach rather than alternatives:
JSON-based server-driven UI (the approach popularized by companies such as Airbnb and Shopify) requires defining a schema that maps to native components. This works well for structured content but struggles with:
- Complex animations and transitions
- Custom drawing and graphics
- Rich text with inline styling
- Gradients, shadows, and visual effects
WebViews offer full flexibility but introduce:
- Performance overhead (separate rendering process)
- Inconsistent look and feel (web styling vs native)
- Memory pressure (each WebView is expensive)
- Touch handling complexity (gesture conflicts)
Compose Remote takes a third path: capturing the actual drawing operations that Compose would execute. This means any UI you can build in Compose, including custom Canvas drawing, complex animations, and Material Design components, can be captured and replayed remotely with native performance.
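To make the "custom drawing is captured too" point concrete, here is a minimal, hypothetical sketch of capturing a hand-drawn progress ring. It reuses the captureRemoteDocument entry point shown earlier; the assumption that the capture layer accepts the standard Canvas composable, and the RemoteModifier.size modifier, are illustrative rather than confirmed API details.

```kotlin
import androidx.compose.foundation.Canvas
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.graphics.drawscope.Stroke

// Hypothetical sketch: custom Canvas drawing inside a capture block.
// Nothing here maps to a predefined "component"; the draw calls are
// recorded directly as operations such as DRAW_CIRCLE and DRAW_ARC.
val document = captureRemoteDocument(
    context = context,
    creationDisplayInfo = displayInfo,
    profile = profile
) {
    Canvas(modifier = RemoteModifier.size(120.dp)) {
        // Track ring
        drawCircle(color = Color.LightGray, style = Stroke(width = 12f))
        // Progress arc: 270 of 360 degrees
        drawArc(
            color = Color.Blue,
            startAngle = -90f,
            sweepAngle = 270f,
            useCenter = false,
            style = Stroke(width = 12f)
        )
    }
}
```

Because the capture happens at the draw-call level, the client replays the same arcs and circles with no knowledge that they form a "progress ring".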
The document-based architecture: Creation and playback
Compose Remote's architecture is built around a clear separation between two phases: document creation and document playback. Understanding this separation is key to understanding the framework's power.
Document creation: Capturing UI as data
The creation phase transforms Compose UI code into a serialized document. This happens through a sophisticated capture mechanism that intercepts drawing operations at the Canvas level, the lowest level of Android's rendering pipeline.
@Composable Content
↓
RemoteComposeCreationState (Tracks state and modifiers)
↓
CaptureComposeView (Virtual Display - no actual screen needed)
↓
RecordingCanvas (Intercepts every draw call)
↓
Operations (93+ operation types covering all drawing primitives)
↓
RemoteComposeBuffer (Efficient binary serialization)
↓
ByteArray (Network-ready, typically 10-100KB for complex UIs)
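Since the end product of the pipeline is an ordinary ByteArray, it can be cached like any other network payload, which is a common pattern for offline fallback in server-driven UI. A minimal sketch (file name and `.rc` extension are illustrative, not a library convention):

```kotlin
import java.io.File

// Hedged sketch: persist a captured document so the last-known UI can
// be replayed when the network is unavailable.
fun cacheDocument(dir: File, name: String, bytes: ByteArray): File =
    File(dir, "$name.rc").apply { writeBytes(bytes) }

fun loadCachedDocument(dir: File, name: String): ByteArray? =
    File(dir, "$name.rc").takeIf { it.exists() }?.readBytes()
```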
The creation side provides a complete Compose integration layer. You write standard @Composable functions, and the framework captures everything: layout hierarchies, modifiers, text styles, images, animations, and even touch handlers.
What makes this special is that the captured document is self-contained. It includes all the information needed to recreate the UI:
- Visual elements: Shapes, colors, gradients, shadows
- Text: Strings, fonts, sizes, styling
- Images: Embedded bitmaps or URLs for lazy loading
- Layout: Sizes, positions, padding, alignment
- Interactions: Touch areas, click handlers, named actions
- State: Variables that can be updated at runtime
- Animations: Time-based expressions for motion
The receiver doesn't need access to your codebase, just the document bytes. This is fundamentally different from other server-driven UI approaches where the client needs to understand a schema or have pre-built components.
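Because the client only needs bytes, any HTTP stack can deliver a document. The following sketch uses OkHttp purely for illustration; the endpoint URL and the idea of fetching documents by screen name are assumptions, not part of the library.

```kotlin
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
import okhttp3.OkHttpClient
import okhttp3.Request

// Hypothetical transport sketch: fetch a serialized document as raw
// bytes; there is no schema to parse and no component registry to sync.
suspend fun fetchDocument(client: OkHttpClient, url: String): ByteArray =
    withContext(Dispatchers.IO) {
        val request = Request.Builder().url(url).build()
        client.newCall(request).execute().use { response ->
            require(response.isSuccessful) { "HTTP ${response.code}" }
            response.body!!.bytes()
        }
    }
```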
Document playback: Rendering without compilation
The playback phase takes a serialized document and renders it to the screen. The player iterates through operations, executing each one against a Canvas. It's conceptually similar to how a video player decodes frames, except instead of pixels, we're decoding drawing instructions.
Compose Remote provides two rendering backends to fit different architectural needs:
Compose-based player (recommended for modern apps):
@Composable
fun DynamicScreen(document: CoreDocument) {
    RemoteDocumentPlayer(
        document = document,
        documentWidth = screenWidth,
        documentHeight = screenHeight,
        modifier = Modifier.fillMaxSize(),
        onNamedAction = { name, value, stateUpdater ->
            // Handle named actions from the document
            when (name) {
                "addToCart" -> cartManager.addItem(value)
                "navigate" -> navController.navigate(value)
                "trackEvent" -> analytics.logEvent(value)
            }
        },
        bitmapLoader = rememberBitmapLoader() // For lazy image loading
    )
}
The Compose-based player integrates naturally with your existing Compose UI. It's a composable function that you can place anywhere in your composition hierarchy, apply modifiers to, and animate like any other composable.
View-based player (for compatibility with existing View hierarchies):
class LegacyActivity : AppCompatActivity() {
    private lateinit var player: RemoteComposePlayer

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        player = RemoteComposePlayer(this)
        setContentView(player)
        // Load document from network
        lifecycleScope.launch {
            val bytes = api.fetchDocument("home-screen")
            player.setDocument(bytes)
        }
        player.onNamedAction { name, value, stateUpdater ->
            // Handle actions
        }
    }
}
Both players provide identical rendering fidelity; the choice depends on your app's architecture. If you're fully Compose, use the composable player. If you're migrating from Views or embedding in a View hierarchy, use the View-based player.
The operation model: A comprehensive drawing vocabulary
The power of Compose Remote lies in its comprehensive operation model. The framework defines 93+ distinct operations that cover every aspect of UI rendering. This isn't an arbitrary number; it's the complete vocabulary needed to express any Canvas drawing operation.
Why operations matter
Traditional server-driven UI sends high-level component descriptions: "render a button with text 'Submit'". The client must then interpret this and map it to a native component. This creates tight coupling between server and client—both must agree on what a "button" is and how it behaves.
Compose Remote operates at a lower level: instead of "render a button", it sends the actual drawing instructions: "draw a rounded rectangle at these coordinates with this color, then draw text 'Submit' at this position with this font". The client doesn't need to know what a "button" is; it just executes drawing operations.
This low-level approach has profound implications:
- No schema synchronization: Server and client don't need to agree on component definitions
- Full visual fidelity: Any visual effect possible in Compose is capturable
- Forward compatibility: New visual designs work on old clients (they're just different drawing operations)
- Custom components: Your custom composables work automatically, no registration needed
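The last point is worth illustrating. In the hypothetical sketch below, PriceTag is a custom composable that exists only on the creation side; because capture records its drawing output rather than its identity, the client can render it with no registration step. PriceTag itself and the RemoteModifier.padding call are illustrative assumptions.

```kotlin
// Hypothetical custom component, defined only where documents are created.
@Composable
fun PriceTag(label: String, price: String) {
    Column(modifier = RemoteModifier.padding(8.dp)) {
        Text(label)
        Text(price)
    }
}

// The capture sees only the rectangles and text PriceTag produces,
// so no client-side schema entry or component mapping is required.
val document = captureRemoteDocument(
    context = context,
    creationDisplayInfo = displayInfo,
    profile = profile
) {
    PriceTag(label = "Limited offer", price = "$9.99")
}
```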
Drawing operations
Drawing operations capture Canvas draw calls—the fundamental primitives of 2D graphics:
DRAW_RECT - Rectangles (buttons, cards, backgrounds)
DRAW_ROUND_RECT - Rounded rectangles (Material surfaces)
DRAW_CIRCLE - Circles (avatars, indicators)
DRAW_OVAL - Ovals and ellipses
DRAW_LINE - Lines and dividers
DRAW_PATH - Arbitrary shapes (icons, custom graphics)
DRAW_ARC - Arcs (progress indicators, pie charts)
DRAW_SECTOR - Pie sectors
DRAW_TEXT - Text rendering with full styling
DRAW_TEXT_ON_PATH - Text along curves
DRAW_BITMAP - Images
DRAW_TWEEN_PATH - Animated path morphing
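Conceptually, playback of these operations is a simple interpreter loop. The sketch below is a mental model only, using invented types rather than the library's real classes, to show how a decoded operation list maps onto Android Canvas calls:

```kotlin
// Conceptual model of playback (not Compose Remote's actual types):
// each decoded operation is executed against a Canvas in order.
sealed interface DrawOp {
    data class Rect(val l: Float, val t: Float, val r: Float, val b: Float) : DrawOp
    data class Text(val text: String, val x: Float, val y: Float) : DrawOp
}

fun replay(
    ops: List<DrawOp>,
    canvas: android.graphics.Canvas,
    paint: android.graphics.Paint
) {
    for (op in ops) {
        when (op) {
            is DrawOp.Rect -> canvas.drawRect(op.l, op.t, op.r, op.b, paint)
            is DrawOp.Text -> canvas.drawText(op.text, op.x, op.y, paint)
        }
    }
}
```

A real player additionally handles paint state, clipping, transforms, and the state and animation operations, but the core structure is this decode-and-execute loop.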