The Android bot is a beloved mascot for Android users and developers, with earlier versions of the bot builder proving hugely popular – so we decided that this year we'd rebuild the bot maker from the ground up, using the latest technology backed by Gemini. Today we're releasing a new open source app, Androidify, for learning how to build powerful AI-driven experiences on Android using the latest technologies such as Jetpack Compose, Gemini through Firebase, CameraX, and Navigation 3.
Here's an example of the app running on the device, showcasing converting a photo to an Android bot that represents my likeness:
Under the hood
The app combines a variety of different Google technologies, such as:
- Gemini API – through the Firebase AI Logic SDK, for accessing the underlying Imagen and Gemini models.
- Jetpack Compose – for building the UI with delightful animations and making the app adapt to different screen sizes.
- Navigation 3 – the latest navigation library for building up Navigation graphs with Compose.
- CameraX Compose and Media3 Compose – for building up a custom camera with custom UI controls (rear camera support, zoom support, tap-to-focus) and playing the promotional video.
This sample app is currently using a standard Imagen model, but we've been working on a fine-tuned model that's trained specifically on all of the pieces that make the Android bot cute and fun; we'll share that version later this year. In the meantime, don't be surprised if the sample app puts out some interesting looking examples!
How does the Androidify app work?
The app leverages our best practices for Architecture, Testing, and UI to showcase a real world, modern AI application on device.
Androidify app flow chart detailing how the app works with AI
AI in Androidify with Gemini and ML Kit
The Androidify app uses the Gemini models in a multitude of ways to enrich the app experience, all powered by the Firebase AI Logic SDK. The app uses Gemini 2.5 Flash and Imagen 3 under the hood:
- Image validation: We ensure that the captured image contains sufficient information, such as a clearly focused person, and assess it for safety. This feature uses the multi-modal capabilities of the Gemini API, by giving it a prompt and image at the same time:
val response = generativeModel.generateContent(
    content {
        text(prompt)
        image(image)
    },
)
- Text prompt validation: If the user opts for text input instead of an image, we use Gemini 2.5 Flash to ensure the text contains a sufficiently descriptive prompt to generate a bot.
- Image captioning: Once we're sure the image has enough information, we use Gemini 2.5 Flash to perform image captioning. We ask Gemini to be as descriptive as possible, focusing on the clothing and its colors.
- “Help me write” feature: Similar to an “I'm feeling lucky” type feature, “Help me write” uses Gemini 2.5 Flash to create a random description of the clothing and hairstyle of a bot.
- Image generation from the generated prompt: As the final step, Imagen generates the image, providing the prompt and the selected skin tone of the bot (a sketch of how these models might be instantiated follows this list).
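The snippets above assume a `generativeModel` (and an Imagen counterpart) already exist. As a rough orientation only, here is a minimal sketch of how such models can be created with the Firebase AI Logic SDK; the model name strings and the Imagen preview opt-in are assumptions based on the SDK's public documentation, not code taken from Androidify.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend
import com.google.firebase.ai.type.PublicPreviewAPI

// Gemini 2.5 Flash handles validation, captioning, and "Help me write".
val generativeModel = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel("gemini-2.5-flash") // model name assumed from Firebase docs

// Imagen produces the final bot image from the generated prompt.
@OptIn(PublicPreviewAPI::class) // Imagen support is in public preview at time of writing
val imagenModel = Firebase.ai(backend = GenerativeBackend.googleAI())
    .imagenModel("imagen-3.0-generate-002") // model name assumed from Firebase docs

// Image generation is a suspend call; the first returned image can be shown as a bitmap:
// val bot = imagenModel.generateImages(prompt).images.first().asBitmap()
```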
The app also uses ML Kit pose detection to detect a person in the viewfinder and enable the capture button when a person is detected, as well as adding fun indicators around the content to indicate detection.
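As a rough sketch of that gating logic (not Androidify's actual implementation), ML Kit's pose detector can be run on each camera frame and the capture button enabled only when at least one pose landmark is found:

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.pose.PoseDetection
import com.google.mlkit.vision.pose.defaults.PoseDetectorOptions

// STREAM_MODE is tuned for live camera feeds rather than single still images.
private val poseDetector = PoseDetection.getClient(
    PoseDetectorOptions.Builder()
        .setDetectorMode(PoseDetectorOptions.STREAM_MODE)
        .build(),
)

// Call this per frame; onPersonDetected(true) would enable the capture button.
fun detectPerson(frame: InputImage, onPersonDetected: (Boolean) -> Unit) {
    poseDetector.process(frame)
        .addOnSuccessListener { pose -> onPersonDetected(pose.allPoseLandmarks.isNotEmpty()) }
        .addOnFailureListener { onPersonDetected(false) }
}
```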
Find more detailed information about AI usage in Androidify.
Jetpack Compose
The user interface of Androidify is built using Jetpack Compose, the modern UI toolkit that simplifies and accelerates UI development on Android.
Delightful details with the UI
The app uses Material 3 Expressive, the latest alpha release that makes your apps more premium, desirable, and engaging. It provides delightful bits of UI out-of-the-box, like new shapes, componentry, and using the MotionScheme variables wherever a motion spec is needed.
MaterialShapes are used in various locations. These are a preset list of shapes that allow for easy morphing between each other, such as the cute cookie shape for the camera capture button (a usage sketch follows the figure below):
Camera button with a MaterialShapes.Cookie9Sided shape
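As a small illustrative sketch (not the app's actual button), the cookie shape can be applied to a capture button by converting the MaterialShapes polygon into a Compose Shape; the toShape() extension and the expressive opt-in are assumptions based on the material3 expressive alpha:

```kotlin
import androidx.compose.foundation.layout.size
import androidx.compose.material3.ExperimentalMaterial3ExpressiveApi
import androidx.compose.material3.FloatingActionButton
import androidx.compose.material3.MaterialShapes
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.toShape
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

@OptIn(ExperimentalMaterial3ExpressiveApi::class)
@Composable
fun CaptureButton(onClick: () -> Unit) { // hypothetical composable, not from Androidify
    FloatingActionButton(
        onClick = onClick,
        // The nine-sided cookie polygon, used directly as the button's shape.
        shape = MaterialShapes.Cookie9Sided.toShape(),
        containerColor = MaterialTheme.colorScheme.primary,
        modifier = Modifier.size(96.dp),
    ) {
        // Capture icon omitted for brevity.
    }
}
```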
Beyond using the standard Material components, Androidify also features custom composables and delightful transitions tailored to the specific needs of the app:
- There are plenty of shared element transitions across the app. For example, a morphing shape shared element transition is performed between the “take a photo” button and the camera surface.
- Custom enter transitions for the ResultsScreen with the use of marquee modifiers.
- Fun color splash animation as a transition between screens.
- Animating gradient buttons for the AI-powered actions.
To learn more about the unique details of the UI, read Androidify: Building delightful UIs with Compose.
Adapting to different devices
Androidify is designed to look great and function seamlessly across candy bar phones, foldables, and tablets. The general goal of developing adaptive apps is to avoid reimplementing the same app multiple times on each form factor, by extracting out reusable composables and leveraging APIs like WindowSizeClass to determine what kind of layout to display.
Various adaptive layouts in the app
For Androidify, we only needed to leverage the width window size class. Combining this with different layout mechanisms, we were able to reuse or extend the composables to cater to the multitude of different device sizes and capabilities.
- Responsive layouts: The CreationScreen demonstrates adaptive design. It uses helper functions like isAtLeastMedium() to detect window size categories and adjust its layout accordingly (a possible sketch of this helper appears below). On larger windows, the image/prompt area and color picker might sit side-by-side in a Row, while on smaller windows, the color picker is accessed via a ModalBottomSheet. This pattern, known as “supporting pane”, highlights the supporting dependencies between the main content and the color picker.
- Foldable support: The app actively checks for foldable device features. The camera screen uses WindowInfoTracker to get FoldingFeature information to adapt to different features by optimizing the layout for tabletop posture.
- Rear display: Support for devices with multiple displays is included via the RearCameraUseCase, allowing for the device camera preview to be shown on the external screen when the device is unfolded (so the main content is typically displayed on the inner screen).
Using window size classes, coupled with creating a custom @LargeScreensPreview annotation, helps achieve unique and useful UIs across the spectrum of device sizes and window sizes.
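One possible shape for the isAtLeastMedium() helper mentioned above is sketched here; the real Androidify helper may differ, and the window size class APIs used are the androidx.window / material3-adaptive ones:

```kotlin
import androidx.compose.material3.adaptive.currentWindowAdaptiveInfo
import androidx.compose.runtime.Composable
import androidx.window.core.layout.WindowWidthSizeClass

// True when the current window's width class is Medium or Expanded, letting callers
// switch between the side-by-side Row layout and the ModalBottomSheet variant.
@Composable
fun isAtLeastMedium(): Boolean {
    val widthClass = currentWindowAdaptiveInfo().windowSizeClass.windowWidthSizeClass
    return widthClass == WindowWidthSizeClass.MEDIUM ||
        widthClass == WindowWidthSizeClass.EXPANDED
}
```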
CameraX and Media3 Compose
To allow users to base their bots on photos, Androidify integrates CameraX, the Jetpack library that makes camera app development easier.
The app uses a custom CameraLayout composable that supports the layout of the typical composables a camera preview screen would include, for example zoom buttons, a capture button, and a flip camera button. This layout adapts to different device sizes and more advanced use cases, like the tabletop mode and rear-camera display. For the actual rendering of the camera preview, it uses the new CameraXViewfinder that is part of the camerax-compose artifact (a minimal viewfinder sketch follows the figure below).
CameraLayout composable that takes care of different device configurations, such as table top mode
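To show roughly how the viewfinder piece fits together (a minimal sketch under assumptions, not the CameraLayout code itself), the Preview use case delivers a SurfaceRequest that the new CameraXViewfinder composable renders directly in Compose:

```kotlin
import androidx.camera.compose.CameraXViewfinder
import androidx.camera.core.Preview
import androidx.camera.core.SurfaceRequest
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier

@Composable
fun SimpleViewfinder(modifier: Modifier = Modifier) { // hypothetical composable
    // The Preview use case hands us a SurfaceRequest once the camera is bound.
    var surfaceRequest by remember { mutableStateOf<SurfaceRequest?>(null) }

    val previewUseCase = remember {
        Preview.Builder().build().apply {
            setSurfaceProvider { request -> surfaceRequest = request }
        }
        // Binding this use case to a lifecycle via ProcessCameraProvider is omitted here.
    }

    surfaceRequest?.let { request ->
        CameraXViewfinder(surfaceRequest = request, modifier = modifier)
    }
}
```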
The app also integrates with Media3 APIs to load an instructional video that shows how to get the best bot from a prompt or image. Using the new media3-ui-compose artifact, we can easily add a VideoPlayer into the app:
@Composable
private fun VideoPlayer(modifier: Modifier = Modifier) {
    val context = LocalContext.current
    var player by remember { mutableStateOf<Player?>(null) }
    LifecycleStartEffect(Unit) {
        player = ExoPlayer.Builder(context).build().apply {
            setMediaItem(MediaItem.fromUri(Constants.PROMO_VIDEO))
            repeatMode = Player.REPEAT_MODE_ONE
            prepare()
        }
        onStopOrDispose {
            player?.release()
            player = null
        }
    }
    Box(
        modifier
            .background(MaterialTheme.colorScheme.surfaceContainerLowest),
    ) {
        player?.let { currentPlayer ->
            PlayerSurface(currentPlayer, surfaceType = SURFACE_TYPE_TEXTURE_VIEW)
        }
    }
}
Using the new onLayoutRectChanged modifier, we also listen for whether the composable is fully visible or not, and play or pause the video based on this information:
var videoFullyOnScreen by remember { mutableStateOf(false) }

LaunchedEffect(videoFullyOnScreen) {
    if (videoFullyOnScreen) currentPlayer.play() else currentPlayer.pause()
}

// We add this onto the player composable to determine if the video composable is visible,
// and mutate the videoFullyOnScreen variable, which then toggles the player state.
Modifier.onVisibilityChanged(
    containerWidth = LocalView.current.width,
    containerHeight = LocalView.current.height,
) { fullyVisible -> videoFullyOnScreen = fullyVisible }

// A simple version of visibility changed detection
fun Modifier.onVisibilityChanged(
    containerWidth: Int,
    containerHeight: Int,
    onChanged: (visible: Boolean) -> Unit,
) = this then Modifier.onLayoutRectChanged(100, 0) { layoutBounds ->
    onChanged(
        layoutBounds.boundsInRoot.top > 0 &&
            layoutBounds.boundsInRoot.bottom < containerHeight &&
            layoutBounds.boundsInRoot.left > 0 &&
            layoutBounds.boundsInRoot.right < containerWidth,
    )
}
Additionally, using rememberPlayPauseButtonState, we add a layer on top of the player to provide a play/pause button on the video itself:
val playPauseButtonState = rememberPlayPauseButtonState(currentPlayer)

OutlinedIconButton(
    onClick = playPauseButtonState::onClick,
    enabled = playPauseButtonState.isEnabled,
) {
    val icon =
        if (playPauseButtonState.showPlay) R.drawable.play else R.drawable.pause
    val contentDescription =
        if (playPauseButtonState.showPlay) R.string.play else R.string.pause
    Icon(
        painterResource(icon),
        stringResource(contentDescription),
    )
}
Check out the code for more details on how CameraX and Media3 were used in Androidify.
Navigation 3
Screen transitions are handled using the new Jetpack Navigation 3 library androidx.navigation3. The MainNavigation composable defines the different destinations (Home, Camera, Creation, About) and displays the content associated with each destination using NavDisplay. You get full control over your back stack, and navigating to and from destinations is as simple as adding and removing items from a list.
@Composable
fun MainNavigation() {
    val backStack = rememberMutableStateListOf(Home)
    NavDisplay(
        backStack = backStack,
        onBack = { backStack.removeLastOrNull() },
        entryProvider = entryProvider {
            entry<Home> { entry ->
                HomeScreen(
                    onAboutClicked = {
                        backStack.add(About)
                    },
                )
            }
            entry<Camera> {
                CameraPreviewScreen(
                    onImageCaptured = { uri ->
                        backStack.add(Create(uri.toString()))
                    },
                )
            }
            // etc.
        },
    )
}
Notably, Navigation 3 exposes a new composition local, LocalNavAnimatedContentScope, to easily integrate your shared element transitions without needing to keep track of the scope yourself. By default, Navigation 3 also integrates with predictive back, providing delightful back experiences when navigating between screens, as seen in the earlier shared element transition.
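As a hedged sketch of what that wiring can look like (the key and composable below are illustrative, not Androidify's), a composable rendered inside a NavDisplay entry can pass the composition local straight into Modifier.sharedElement:

```kotlin
import androidx.compose.animation.ExperimentalSharedTransitionApi
import androidx.compose.animation.SharedTransitionScope
import androidx.compose.foundation.Image
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.ImageBitmap
import androidx.navigation3.ui.LocalNavAnimatedContentScope // package assumed

@OptIn(ExperimentalSharedTransitionApi::class)
@Composable
fun SharedTransitionScope.BotImage(bitmap: ImageBitmap, modifier: Modifier = Modifier) {
    Image(
        bitmap = bitmap,
        contentDescription = null,
        modifier = modifier.sharedElement(
            rememberSharedContentState(key = "bot_image"), // hypothetical key
            // Navigation 3 supplies the animated content scope for the current entry.
            animatedVisibilityScope = LocalNavAnimatedContentScope.current,
        ),
    )
}
```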
Learn more about Jetpack Navigation 3, currently in alpha.
Learn more
By combining the declarative power of Jetpack Compose, the camera capabilities of CameraX, the intelligent features of Gemini, and thoughtful adaptive design, Androidify is a personalized avatar creation experience that feels right at home on any Android device. You can find the full code sample at github.com/android/androidify where you can see the app in action and be inspired to build your own AI-powered app experiences.
Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.