
Bump’s Journey into Delightful Experiences on Android with Jetpack Compose


This is a guest post from amo engineer Cyril Mottier on how they’ve leveraged Android to create magic in their Android app.

At amo, we’re redefining what it means to build social applications. Our mission is to create a new kind of social company, one that prioritizes high-quality, thoughtfully designed mobile experiences. One of our flagship applications, Bump, puts your friends on the map, whether you’re checking in on your crew or making moves to meet up.

Our app leverages multiplatform technologies for its foundation. At the core lies a shared Rust-based library that powers all of our iOS and Android apps. This library, managed by our backend engineers, is responsible for persistence and networking. The library exposes its APIs as Kotlin Flow. In addition to making everything reactive and realtime-enabled by default, it integrates effortlessly with Jetpack Compose, the technology we use to build our UI. This architecture ensures a consistent and high-performance experience across platforms. It also allows mobile engineers to spend more time on the user experience, where they can focus on crafting innovative and immersive user interactions.
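To give a feel for this setup (the names below are illustrative, not amo’s actual API), a screen only needs to collect one of these Flows for the UI to stay in sync:

import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import kotlinx.coroutines.flow.Flow

// Hypothetical shape of an API surface backed by the shared Rust core.
interface FriendsRepository {
    val friends: Flow<List<Friend>>
}

data class Friend(val id: String, val displayName: String)

@Composable
fun FriendCount(repository: FriendsRepository) {
    // Collecting the Flow as State recomposes the UI on every update,
    // which is what makes everything reactive and realtime by default.
    val friends by repository.friends.collectAsState(initial = emptyList())
    Text(text = "${friends.size} friends")
}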

In this post, we will explore how we leverage the Android SDK, Jetpack Compose, the Kotlin programming language, and Google Play Services to build unique, delightful experiences in Bump. Our goal is to break mental barriers and show what’s truly possible on Android, often in just a few lines of code. We want to inspire engineers to think beyond conventional UI paradigms and explore new ways to create magical moments for users. By the end of this article, you’ll have a deeper understanding of how to harness Android’s capabilities to build experiences that feel like magic.

At amo, we ship, learn, and iterate rapidly across our feature set. That means the design of some of the features highlighted in this article has already changed, or will do so in the coming weeks and months.

Great touch-based UX isn’t just about flashy visuals. It’s about delivering meaningful feedback through graphics, haptics, sounds, and more. Said differently, it’s about designing for all senses, not only for the eyes. We take this very seriously when designing applications and always take all of these potential dimensions into account.

One example is our in-app notification center. The notification center is a visual entry point, accessible from anywhere in the app, which shows all of your notifications from the entire amo suite of apps. It can be moved anywhere on the screen. Its style also changes regularly thanks to some in-house or external artists. But styling doesn’t stop at the visual level; we also style it at the audio level: when it’s dragged around, a short, repeating sound is played.

Turn the sound on for the full experience

To make it fun and joyful, we pushed this even further to let the user be a DJ. The volume, the speed, and the pitch of the audio change depending on where the user drags it. It’s a “be your own DJ” moment. The implementation of this experience can be split in two parts. The first part deals with the audio, and the second part handles the rendering of the entry point and its interactions (dragging, tapping, etc.).

Let’s first dive into the code handling the audio. It consists of a Composable requiring a URL pointing to the music, a flag indicating whether it should play or not (true only when dragging), and a two-dimensional offset: the X axis controls the volume and pitch, while the Y axis controls the playback speed.

@Composable
fun GalaxyGateAccessPointMusicPlayer(
    musicUrl: String,
    isActive: Boolean,
    offset: Offset,
) {
    val audioPlayer = rememberAudioPlayer(
        uri = Uri.parse(musicUrl),
    )
    LaunchedEffect(audioPlayer, isActive) {
        if (isActive) {
            audioPlayer.play(isLooped = true)
        } else {
            audioPlayer.pause()
        }
    }

    SideEffect {
        audioPlayer.setSpeedPitch(
            speed = 0.75f + offset.y * 0.5f,
            pitch = offset.x + 0.5f
        )
        audioPlayer.setVolume(
            (1.0f - ((offset.x - 0.5f) * 2f).coerceIn(0f, 1f)),
            (1.0f - ((0.5f - offset.x) * 2f).coerceIn(0f, 1f)),
        )
    }
}

@Composable
fun rememberAudioPlayer(
    uri: Uri,
): AudioPlayer {
    val context = LocalContext.current
    val lifecycle = LocalLifecycleOwner.current.lifecycle
    return remember(context, lifecycle, uri) {
        DefaultAudioPlayer(
            context = context,
            lifecycle = lifecycle,
            uri = uri,
        )
    }
}

DefaultAudioPlayer is just an in-house wrapper around the ExoPlayer provided by Jetpack Media3 that deals with initialization, lifecycle management, fading when starting/stopping music, etc. It exposes two methods, setSpeedPitch and setVolume, delegating to the underlying ExoPlayer.
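For context, here is a minimal sketch of what such a wrapper could look like, assuming the AudioPlayer interface implied by the calls above; the real player’s fading and per-channel volume handling are omitted, since Media3’s Player API only exposes a single volume.

import android.content.Context
import android.net.Uri
import androidx.lifecycle.DefaultLifecycleObserver
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.LifecycleOwner
import androidx.media3.common.MediaItem
import androidx.media3.common.PlaybackParameters
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer

interface AudioPlayer {
    fun play(isLooped: Boolean = false)
    fun pause()
    fun setSpeedPitch(speed: Float, pitch: Float)
    fun setVolume(left: Float, right: Float)
}

// Hypothetical implementation; amo's actual DefaultAudioPlayer is in-house.
class DefaultAudioPlayer(
    context: Context,
    lifecycle: Lifecycle,
    uri: Uri,
) : AudioPlayer {

    private val player = ExoPlayer.Builder(context).build().apply {
        setMediaItem(MediaItem.fromUri(uri))
        prepare()
    }

    init {
        // Release the underlying ExoPlayer when the host lifecycle is destroyed.
        lifecycle.addObserver(object : DefaultLifecycleObserver {
            override fun onDestroy(owner: LifecycleOwner) = player.release()
        })
    }

    override fun play(isLooped: Boolean) {
        player.repeatMode = if (isLooped) Player.REPEAT_MODE_ONE else Player.REPEAT_MODE_OFF
        player.play()
    }

    override fun pause() = player.pause()

    override fun setSpeedPitch(speed: Float, pitch: Float) {
        // Media3 exposes both knobs through a single PlaybackParameters object.
        player.playbackParameters = PlaybackParameters(speed, pitch)
    }

    override fun setVolume(left: Float, right: Float) {
        // Approximation: true per-channel gain would need a custom audio processor.
        player.volume = maxOf(left, right)
    }
}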

By combining gesture readings with audio pitch, speed, and volume, we added delight and surprise where users least expected it.

We named our application “Bump” as a nod to its core feature: people close to each other can “bump” their phones together. If they aren’t registered as friends on the app, bumping will automatically send a friend request. And if they are, a mesmerizing animation triggers. We also notify mutual friends that they “bumped” and that they should join.

This Bump feature is central to the app’s experience. It stands out in its interaction, functionality, and the unique value it provides. To express its significance, we wanted the feature to have a distinctive visual appeal. Here’s a glimpse of how it currently looks in the app:

There’s a lot going on in this video, but it can be summarized as 3 animations: the “wave” animation when the device detects a local bump/shake, the animation showing the two friends bumping, and finally a “ring pulsing” animation to finish. While the second animation is plain Compose, the two others are custom. Creating such custom effects involved venturing into what is often considered “unknown territory” in Android development: custom shaders. While daunting at first, it’s actually quite accessible and unlocks immense creative potential for truly unique experiences.

Simply put, shaders are highly parallelizable code segments. Each shader runs once per pixel per frame. This may sound intense, but this is precisely where GPUs excel. In Android 13, shaders became first-class citizens with AGSL shaders and RuntimeShader for Views and Compose.
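As an illustration of what that unlocks (a minimal sketch of ours, not a shader from Bump), on API 33+ a RuntimeShader can be attached to any composable through its graphics layer:

import android.graphics.RenderEffect
import android.graphics.RuntimeShader
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.asComposeRenderEffect
import androidx.compose.ui.graphics.graphicsLayer

// Requires API 33+. A deliberately trivial AGSL shader that runs once per
// pixel: it samples the content and fades it out toward the right edge.
private val FadeShader = RuntimeShader(
    """
    uniform shader content;
    uniform float2 resolution;

    half4 main(float2 fragCoord) {
        half4 color = content.eval(fragCoord);
        return color * (1.0 - 0.5 * fragCoord.x / resolution.x);
    }
    """.trimIndent()
)

fun Modifier.fadeEffect(): Modifier = graphicsLayer {
    FadeShader.setFloatUniform("resolution", size.width, size.height)
    renderEffect = RenderEffect
        .createRuntimeShaderEffect(FadeShader, "content")
        .asComposeRenderEffect()
}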

Since our app requires a minimum of API 30 (Android 11), we opted for a more traditional approach using a custom OpenGL renderer.

We extract a Bitmap of the view we want to apply the effect to, pass it to the OpenGL renderer, and run the shader. While this technique ensures backward compatibility, its main drawback is that it operates on a snapshot of the view hierarchy throughout the animation. As a result, any changes occurring in the view during the animation aren’t reflected on screen until the animation concludes. Keen observers might notice a slight glitch at the end of the animation when the screenshot is removed and normal rendering resumes.
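The capture step itself is plain Android drawing; a minimal sketch (sufficient for software-rendered content; SurfaceView or hardware bitmaps would need PixelCopy instead):

import android.graphics.Bitmap
import android.graphics.Canvas
import android.view.View

// Hypothetical helper: renders the target view into a Bitmap that the
// OpenGL renderer can then upload as a texture.
fun View.snapshot(): Bitmap {
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    draw(Canvas(bitmap))
    return bitmap
}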

In our apps, profile pictures are a bit different. Instead of static images, you record a live profile picture, a transparent, animated boomerang-like cutout. This approach feels more personal, as it literally brings your friends to life on-screen. You see them smiling or making faces, rather than just viewing curated or filtered photos from their camera roll.

From a product perspective, this feature involves two key phases: recording and rendering. Before diving into these specific areas, let’s discuss the format we use for data transport between mobile devices (Android & iOS) and the server. To optimize bandwidth and decoding time, we chose the H.265 HEVC format in an MP4 container and perform the face detection on device. Most modern devices have hardware decoders, making decoding extremely fast. Since cross-platform videos with transparency aren’t widely supported or optimized, we developed a custom in-house solution. Our videos contain two “planes”:

  • The original video on top
  • A mask video at the bottom

We haven’t yet optimized this process. Currently, we don’t pre-apply the mask to the top plane. Doing so could reduce the final encoded video size by replacing the original background with a plain color.

This format is fairly efficient. For instance, the video above is only 64KB. Once we aligned all mobile platforms on the format for our animated profile pictures, we began implementing it.

Recording a Live Profile Picture

The first step is capturing the video, which is handled by Jetpack CameraX. To provide users with visual feedback, we also use ML Kit Face Detection. Initially, we tried to map detected facial expressions (such as eyes closed or smiling) to a 3D model rendered with Filament. However, achieving real-time performance proved too challenging for the timeframe we had. We instead decided to detect the face contour and move a default avatar image on the screen accordingly.
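As an illustration of that final approach, ML Kit’s contour detection plugs straight into a CameraX ImageAnalysis use case; a sketch along these lines (the FaceContourAnalyzer name and onContour callback are ours, for illustration):

import androidx.camera.core.ExperimentalGetImage
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceContour
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Hypothetical analyzer: feeds CameraX frames to ML Kit and reports the
// detected face contour so the preview can move a placeholder avatar around.
class FaceContourAnalyzer(
    private val onContour: (List<android.graphics.PointF>) -> Unit,
) : ImageAnalysis.Analyzer {

    private val detector = FaceDetection.getClient(
        FaceDetectorOptions.Builder()
            .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
            .build()
    )

    @androidx.annotation.OptIn(ExperimentalGetImage::class)
    override fun analyze(imageProxy: ImageProxy) {
        val mediaImage = imageProxy.image ?: run { imageProxy.close(); return }
        val image = InputImage.fromMediaImage(mediaImage, imageProxy.imageInfo.rotationDegrees)
        detector.process(image)
            .addOnSuccessListener { faces ->
                faces.firstOrNull()
                    ?.getContour(FaceContour.FACE)
                    ?.let { onContour(it.points) }
            }
            // Always close the frame so CameraX can deliver the next one.
            .addOnCompleteListener { imageProxy.close() }
    }
}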

Once the recording is complete, Jetpack CameraX provides a video file containing the recorded sequence. This marks the beginning of the second step. The video is decoded frame by frame, and each frame is processed using ML Kit Selfie Segmentation. This API computes a segmentation mask from the input image (our frames) and produces an output mask of the same size. Next, a composite image is generated, with the original video frame on top and the mask frame at the bottom. These composite frames are then fed into an H.265 video encoder. Once all frames are processed, the video meets the specifications described earlier and is ready to be sent to our servers.
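A rough sketch of that per-frame processing, assuming decoded frames arrive as Bitmaps (the slow setPixel loop stands in for an optimized buffer copy, and the blocking Tasks.await must run on a worker thread):

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import com.google.android.gms.tasks.Tasks
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.segmentation.Segmentation
import com.google.mlkit.vision.segmentation.SegmentationMask
import com.google.mlkit.vision.segmentation.selfie.SelfieSegmenterOptions

private val segmenter = Segmentation.getClient(
    SelfieSegmenterOptions.Builder()
        .setDetectorMode(SelfieSegmenterOptions.SINGLE_IMAGE_MODE)
        .build()
)

// Hypothetical sketch: segment one decoded frame and stack it above its
// mask, producing the two-plane composite described earlier.
fun compositeFrame(frame: Bitmap): Bitmap {
    val mask: SegmentationMask =
        Tasks.await(segmenter.process(InputImage.fromBitmap(frame, 0)))

    // Convert confidences (one float per pixel) into a grayscale mask bitmap.
    val maskBitmap = Bitmap.createBitmap(mask.width, mask.height, Bitmap.Config.ARGB_8888)
    val buffer = mask.buffer
    for (y in 0 until mask.height) {
        for (x in 0 until mask.width) {
            val gray = (buffer.float * 255).toInt()
            maskBitmap.setPixel(x, y, Color.rgb(gray, gray, gray))
        }
    }

    // Original frame on top, mask at the bottom.
    val composite = Bitmap.createBitmap(frame.width, frame.height * 2, Bitmap.Config.ARGB_8888)
    Canvas(composite).apply {
        drawBitmap(frame, 0f, 0f, null)
        drawBitmap(maskBitmap, 0f, frame.height.toFloat(), null)
    }
    return composite
}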

While the process could be improved with better interframe selfie segmentation, the use of depth sensors, or more advanced AI techniques, it performs well and has been running successfully in production for over a year.

Rendering Your Friends on the Map

Playing back animated profile pictures presented another challenge. The main difficulty arose from what seemed like a simple product requirement: displaying 10+ real-time moving profile pictures simultaneously on the screen, animating in a back-and-forth loop (similar to boomerang videos). Video decoders, especially hardware ones, excel at decoding videos forward. However, they struggle with reverse playback. Moreover, decoding is computationally intensive. While decoding a single video is manageable, decoding 10+ videos in parallel is not. Our requirement was akin to wanting to watch 10+ movies simultaneously in your favorite streaming app, all in reverse. That is an uncommon and unique use case.

We overcame this challenge by trading computational needs for increased memory consumption. Instead of repeatedly decoding video, we opted to store all frames of the animation in memory. The video is a 30fps, 2.5-second video with a resolution of 256×320 pixels and transparency. This results in a memory consumption of roughly 24MB per video (75 frames × 256 × 320 pixels × 4 bytes ≈ 24.6MB). A queue-based system handling decoding requests sequentially can manage this efficiently. For each request, we:

  1. Decode the video frame by frame using the Jetpack Media3 Transformer APIs
  2. For each frame:
  • Apply the lower part of the video as a mask to the upper part (one possible implementation is sketched right after this list).
  • Append the generated Bitmap to the list of frames.
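One way to implement that masking step, hypothetically, is to route the mask’s luminance into the alpha channel with a ColorMatrix, then composite the video plane with SRC_IN:

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import android.graphics.PorterDuff
import android.graphics.PorterDuffXfermode

// Hypothetical sketch of step 2: split a decoded composite frame in two and
// use the bottom plane's luminance as the alpha channel of the top plane.
fun applyMask(composite: Bitmap): Bitmap {
    val width = composite.width
    val height = composite.height / 2
    val video = Bitmap.createBitmap(composite, 0, 0, width, height)
    val mask = Bitmap.createBitmap(composite, 0, height, width, height)

    val result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(result)

    // Turn the grayscale mask into pure alpha: A = R, RGB = 0.
    val toAlpha = Paint().apply {
        colorFilter = ColorMatrixColorFilter(
            ColorMatrix(floatArrayOf(
                0f, 0f, 0f, 0f, 0f,
                0f, 0f, 0f, 0f, 0f,
                0f, 0f, 0f, 0f, 0f,
                1f, 0f, 0f, 0f, 0f,
            ))
        )
    }
    canvas.drawBitmap(mask, 0f, 0f, toAlpha)

    // Keep video pixels only where the mask made the destination opaque.
    val srcIn = Paint().apply { xfermode = PorterDuffXfermode(PorterDuff.Mode.SRC_IN) }
    canvas.drawBitmap(video, 0f, 0f, srcIn)
    return result
}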

Upon completing this process, we obtain a List containing all the ordered, transformed (mask-applied) frames of the video. To animate the profile picture in a boomerang manner, we simply run a linear, infinite transition. This transition starts from the first frame, proceeds to the last frame, and then returns to the first frame, repeating this cycle indefinitely.

@Immutable
class MovingCutout(
    val duration: Int,
    val bitmaps: List<ImageBitmap>,
) : Cutout

@Composable
fun rememberMovingCutoutPainter(cutout: MovingCutout): Painter {
    val state = rememberUpdatedState(newValue = cutout)
    val infiniteTransition = rememberInfiniteTransition(label = "MovingCutoutTransition")
    val currentBitmap by infiniteTransition.animateValue(
        initialValue = cutout.bitmaps.first(),
        targetValue = cutout.bitmaps.last(),
        typeConverter = state.VectorConverter,
        animationSpec = infiniteRepeatable(
            animation = tween(cutout.duration, easing = LinearEasing),
            repeatMode = RepeatMode.Reverse
        ),
        label = "MovingCutoutFrame"
    )
    return remember(cutout) {
        // A custom BitmapPainter implementation to allow delegation when getting
        // 1. Intrinsic size
        // 2. Current Bitmap
        CallbackBitmapPainter(
            getIntrinsicSize = {
                with(cutout.bitmaps[0]) { Size(width.toFloat(), height.toFloat()) }
            },
            getImageBitmap = { currentBitmap }
        )
    }
}

private val State<MovingCutout>.VectorConverter: TwoWayConverter<ImageBitmap, AnimationVector1D>
    get() = TwoWayConverter(
        convertToVector = { AnimationVector1D(value.bitmaps.indexOf(it).toFloat()) },
        convertFromVector = { value.bitmaps[it.value.roundToInt()] }
    )
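Using the painter then looks like using any other Painter; a hypothetical call site:

import androidx.compose.foundation.Image
import androidx.compose.runtime.Composable

@Composable
fun FriendAvatar(cutout: MovingCutout) {
    // The infinite transition inside the painter drives the frame swaps;
    // Image simply redraws with the current frame.
    Image(
        painter = rememberMovingCutoutPainter(cutout),
        contentDescription = null,
    )
}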

As a map-based social app, Bump relies heavily on the Google Maps Android SDK. While the framework provides default interactions, we wanted to push the boundaries of what’s possible. Specifically, users want to zoom in and out quickly. Although Google Maps offers pinch-to-zoom and double-tap gestures, these have limitations. Pinch-to-zoom requires two fingers, and double-tap doesn’t cover the full zoom range.

For a better user experience, we’ve added our own gestures. One particularly useful feature is edge zoom, which allows rapid zooming in and out using a single finger. Simply swipe up or down from the left or right edge of the screen. Swiping down to the bottom zooms out completely, while swiping up to the top zooms in fully.

Like Google Maps gestures, there are no visual cues for this feature, but that’s acceptable for a power gesture. We provide visual and haptic feedback to help users remember it. Currently, this is achieved with a glue-like effect that follows the finger, as shown below:

Implementing this feature involves two tasks: detecting edge zoom gestures and rendering the visual effect. Thanks to Jetpack Compose’s versatility, this can be done in just a few lines of code. We use the draggable2D Modifier to detect drags, which triggers an onDragUpdate callback to update the Google Maps camera and triggers a recomposition by updating a point variable.

@Composable
fun EdgeZoomGestureDetector(
    side: EdgeZoomSide,
    onDragStarted: () -> Unit,
    onDragUpdate: (Float) -> Unit,
    onDragStopped: () -> Unit,
    modifier: Modifier = Modifier,
    curveSize: Dp = 160.dp,
) {
    var heightPx by remember { mutableIntStateOf(Int.MAX_VALUE) }
    var point by remember { mutableStateOf(Offset.Zero) }
    val draggableState = rememberDraggable2DState { delta ->
        point = when (side) {
            EdgeZoomSide.Start -> point + delta
            EdgeZoomSide.End -> point + Offset(-delta.x, delta.y)
        }
        onDragUpdate(delta.y / heightPx)
    }
    val curveSizePx = with(LocalDensity.current) { curveSize.toPx() }

    Box(
        modifier = modifier
            .fillMaxHeight()
            .onPlaced {
                heightPx = it.size.height
            }
            .draggable2D(
                state = draggableState,
                onDragStarted = {
                    point = it
                    onDragStarted()
                },
                onDragStopped = {
                    point = point.copy(x = 0f)
                    onDragStopped()
                },
            )
            .drawWithCache {
                val path = Path()
                onDrawBehind {
                    path.apply {
                        reset()
                        val x = point.x.coerceAtMost(curveSizePx / 2f)
                        val y = point.y
                        val top = y - (curveSizePx - x)
                        val bottom = y + (curveSizePx - x)
                        moveTo(0f, top)
                        cubicTo(
                            0f, top + (y - top) / 2f,
                            x, top + (y - top) / 2f,
                            x, y
                        )
                        cubicTo(
                            x, y + (bottom - y) / 2f,
                            0f, y + (bottom - y) / 2f,
                            0f, bottom,
                        )
                    }

                    scale(side.toXScale(), 1f) {
                        drawPath(path, Palette.black)
                    }
                }
            }
    )
}

enum class EdgeZoomSide(val alignment: Alignment) {
    Start(Alignment.CenterStart),
    End(Alignment.CenterEnd),
}

private fun EdgeZoomSide.toXScale(): Float = when (this) {
    EdgeZoomSide.Start -> 1f
    EdgeZoomSide.End -> -1f
}

The drawing part is handled by the drawBehind Modifier, which creates a Path consisting of two smooth cubic curves emulating a Gaussian curve. Before rendering, the path is flipped on the X axis based on the screen side.
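Here is a hypothetical wiring of the detector to the map camera, assuming the maps-compose library (the zoomSpan value is an arbitrary choice for how much zoom a full-height swipe should cover):

import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.foundation.layout.width
import androidx.compose.runtime.Composable
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import com.google.android.gms.maps.CameraUpdateFactory
import com.google.maps.android.compose.GoogleMap
import com.google.maps.android.compose.rememberCameraPositionState

@Composable
fun MapWithEdgeZoom(zoomSpan: Float = 18f) {
    val cameraPositionState = rememberCameraPositionState()
    Box(Modifier.fillMaxSize()) {
        GoogleMap(
            modifier = Modifier.matchParentSize(),
            cameraPositionState = cameraPositionState,
        )
        EdgeZoomGestureDetector(
            side = EdgeZoomSide.End,
            onDragStarted = {},
            // fraction is delta.y / heightPx: dragging down (positive) zooms out.
            onDragUpdate = { fraction ->
                cameraPositionState.move(CameraUpdateFactory.zoomBy(-fraction * zoomSpan))
            },
            onDragStopped = {},
            modifier = Modifier
                .align(Alignment.CenterEnd)
                .width(24.dp),
        )
    }
}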

This effect looks good, but it also feels static, directly following the finger without any animation. To improve this, we added a spring-based animation. By extracting the computation of x (representing the tip of the Gaussian curve) from drawBehind into an animatable state, we achieve a smoother visual effect:

val x by animateFloatAsState(
    targetValue = point.x.coerceAtMost(curveSizePx / 2f),
    label = "animated-curve-width",
)

This creates a visually appealing effect that feels natural. However, we wanted to engage other senses too, so we introduced haptic feedback to mimic the feel of a toothed wheel on an old safe. Using Kotlin Flow, LaunchedEffect, and snapshotFlow, this was implemented in just a few lines of code:

val haptic = LocalHapticFeedback.current
LaunchedEffect(heightPx, slotCount) {
    val slotHeight = heightPx / slotCount
    snapshotFlow { (point.y / slotHeight).toInt() }
        .drop(1) // Drop the initial "tick"
        .collect {
            haptic.performHapticFeedback(HapticFeedbackType.SegmentTick)
        }
}

Bump is filled with many other innovative features. We invite you to explore the product further to discover more of these gems. Overall, the entire Android ecosystem, including the platform, developer tools, Jetpack Compose, and Google Play Services, provided most of the necessary building blocks. It offered the flexibility needed to design and implement these unique interactions. Thanks to Android, creating a standout product is just a matter of passion, time, and quite a few lines of code!
