Three ways to share
Each mode solves a different problem — from simple SwiftUI rendering to full compositor capture to frame-by-frame video encoding.
Plain Sticker (PNG)
No background — just the nutrition sticker as a transparent PNG. Uses SwiftUI's ImageRenderer directly.

Image + Sticker (JPEG)
Glass sticker composited onto a photo. Captured from the live screen via drawHierarchy.

Video + Sticker (MP4)
Every frame decoded, composited with glass in an offscreen window, captured, and re-encoded.
The GlassBackground modifier
Every sticker variant wraps its content in a .glassBackground() modifier that adapts based on three environment flags — choosing between native liquid glass, semi-transparent fallbacks, or transparent mode for video compositing.
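A minimal sketch of what such a modifier could look like; the environment keys (stickerUseGlassEffect, stickerTransparentMode), the fallback opacities, and the convenience extension are assumptions for illustration, not the app's exact code:

```swift
import SwiftUI

// Assumed environment flags; the app's real keys may differ.
extension EnvironmentValues {
    @Entry var stickerUseGlassEffect: Bool = false
    @Entry var stickerTransparentMode: Bool = false
}

// Sketch of a GlassBackground-style modifier that picks a rendering
// mode from the environment, as described above.
struct GlassBackground<S: Shape>: ViewModifier {
    let shape: S
    @Environment(\.stickerUseGlassEffect) private var useGlass
    @Environment(\.stickerTransparentMode) private var transparent

    @ViewBuilder
    func body(content: Content) -> some View {
        if transparent {
            // Video export: clip only; the glass is composited elsewhere
            content.clipShape(shape)
        } else if useGlass {
            // iOS 26 native liquid glass
            content.glassEffect(.clear, in: shape)
        } else {
            // Fallback: semi-transparent background plus a border stroke
            content
                .background(Color.white.opacity(0.85), in: shape)
                .overlay(shape.stroke(Color.black.opacity(0.25), lineWidth: 1))
        }
    }
}

extension View {
    func glassBackground(in shape: some Shape = RoundedRectangle(cornerRadius: 24)) -> some View {
        modifier(GlassBackground(shape: shape))
    }
}
```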
- Transparent: content clipped to shape only via .clipShape(shape). No background. Used during video export.
- Native glass: iOS 26 liquid glass via .glassEffect(.clear, in: ...). Blur, refraction, and specular highlights are automatic.
- Layered fallback: black + white semi-transparent backgrounds with a gradient border stroke.
- Light fallback: semi-transparent white background, white(0.75–0.95) + stroke, with a thin black border.

Photo export — one-shot capture
When the user shares a sticker on a photo, we need to composite the glass effect at full resolution. The trick: capture the glass compositor from the actual screen, then blend it onto the high-res background.
Crop the background
Calculate the visible crop region in original pixel coordinates — accounting for display scale and user panning.
```swift
let pixelScale = imgPixelW / displayed.width
let visibleOriginX = (displayed.width - frameSize.width) / 2 - imagePanOffset.width
let cropRect = CGRect(
    x: visibleOriginX * pixelScale,
    y: visibleOriginY * pixelScale,
    width: frameSize.width * pixelScale,
    height: frameSize.height * pixelScale
)
```
Capture the glass from screen
ImageRenderer can't capture glass effects — they live in Apple's compositor. Instead, we capture the actual key window via drawHierarchy.
```swift
guard let window = windowScene.windows
    .first(where: { $0.isKeyWindow }) else { return }
let fullScreenImage = renderer.image { _ in
    window.drawHierarchy(
        in: window.bounds,
        afterScreenUpdates: true
    )
}
```
Composite at full resolution
Two images layered: full-res cropped background and screen capture with glass. The blur makes the upscale imperceptible.
```swift
let finalImage = renderer.image { _ in
    UIImage(cgImage: croppedBgCG).draw(in: fullRect)
    UIImage(cgImage: screenCropCG).draw(in: fullRect)
}
```
Export as JPEG
Compressed to JPEG at 95% quality, saved to a temp file, presented via the system share sheet.
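The export step can be sketched as follows; finalImage, the temp-file name, and the presenting view controller are illustrative assumptions:

```swift
import UIKit

// Sketch: compress, write to a temp file, hand off to the share sheet.
func shareAsJPEG(_ finalImage: UIImage, from presenter: UIViewController) throws {
    guard let data = finalImage.jpegData(compressionQuality: 0.95) else { return }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("sticker-share.jpg")
    try data.write(to: url, options: .atomic)

    // Present the system share sheet with the file URL
    let sheet = UIActivityViewController(activityItems: [url],
                                         applicationActivities: nil)
    presenter.present(sheet, animated: true)
}
```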
Video export — frame by frame
Video is where things get wild. We can't just overlay a static sticker — the glass effect must update every frame because the background changes. So we decode each frame, render it into an offscreen window with the glass sticker, capture the compositor output, and encode it to a new video.
Build the offscreen render window
A hidden UIWindow at level -1000. Inside: a UIImageView for video frames, and a UIHostingController with .glassEffect.
```swift
let window = UIWindow(windowScene: windowScene)
window.windowLevel = UIWindow.Level(rawValue: -1000)
window.isHidden = false // must be visible to the compositor

let stickerContent = variant.stickerView(for: meal)
    .environment(\.stickerUseGlassEffect, true)
let hostingVC = UIHostingController(rootView: stickerContent)
```
Decode video frames
Each frame is decoded via AVAssetReader as 32BGRA. Orientation is handled via CIImage.oriented(), then the frame is cropped to the 9:16 export region.
```swift
var frameCI = CIImage(cvPixelBuffer: pixelBuffer)
frameCI = frameCI.oriented(videoOrientation)
frameCI = frameCI
    .cropped(to: cropRect)
    .transformed(by: CGAffineTransform(
        translationX: -cropRect.origin.x,
        y: -cropRect.origin.y
    ))
```
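For context, a minimal sketch of the AVAssetReader setup that could produce the pixelBuffer used here; the function name and exact configuration are assumptions:

```swift
import AVFoundation

// Sketch: configure a track output that vends 32BGRA pixel buffers.
func makeFrameReader(for asset: AVURLAsset) async throws -> (AVAssetReader, AVAssetReaderTrackOutput)? {
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    reader.add(output)
    reader.startReading()
    return (reader, output)
}

// Usage: call output.copyNextSampleBuffer() in a loop, then
// CMSampleBufferGetImageBuffer(_:) yields the frame's pixel buffer.
```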
Two-pass glass capture
The heart of the video pipeline. For each frame, a two-pass drawHierarchy:
- Pass 1: force the compositor to process the new background
- Pass 2: capture the output, with the glass now reflecting the current frame
```swift
backgroundImageView.image = frameUIImage
renderWindow.rootViewController?.view.layoutIfNeeded()

// Pass 1: force compositor update
_ = renderer.image { _ in
    renderWindow.drawHierarchy(
        in: renderWindow.bounds,
        afterScreenUpdates: true)
}

// Pass 2: capture with glass reflecting current frame
let captured = renderer.image { _ in
    renderWindow.drawHierarchy(
        in: renderWindow.bounds,
        afterScreenUpdates: true)
}
```
Encode to HEVC
Captured bitmap → CVPixelBuffer via CIContext.render() → AVAssetWriter. HEVC-encoded at export frame size × screen scale.
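A sketch of that writer setup; makeWriter and exportSize are illustrative names, and exportSize is taken to already include the screen-scale multiplier:

```swift
import AVFoundation

// Sketch: HEVC writer plus a pixel-buffer adaptor for appending frames.
func makeWriter(to url: URL, exportSize: CGSize) throws
    -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: exportSize.width,
        AVVideoHeightKey: exportSize.height,
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
    writer.add(input)
    return (writer, input, adaptor)
}

// Per frame: render the captured bitmap into a buffer from the adaptor's
// pool via CIContext.render(_:to:), then append it with the frame's
// presentation time.
```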
Mux audio back in
Original audio re-attached via AVMutableComposition + AVAssetExportSession with passthrough — no re-encoding.
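A hedged sketch of that passthrough mux, with illustrative function and variable names:

```swift
import AVFoundation

// Sketch: copy the silent encoded video plus the original audio track
// into one composition, then export without re-encoding.
func muxAudio(video videoURL: URL, original originalURL: URL, to outputURL: URL) async throws {
    let composition = AVMutableComposition()
    let encoded = AVURLAsset(url: videoURL)
    let original = AVURLAsset(url: originalURL)
    let range = CMTimeRange(start: .zero, duration: try await encoded.load(.duration))

    if let src = try await encoded.loadTracks(withMediaType: .video).first,
       let dst = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dst.insertTimeRange(range, of: src, at: .zero)
    }
    if let src = try await original.loadTracks(withMediaType: .audio).first,
       let dst = composition.addMutableTrack(withMediaType: .audio,
                                             preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dst.insertTimeRange(range, of: src, at: .zero)
    }

    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetPassthrough) else { return }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    await session.export()
}
```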
Clean up and share
The offscreen window is deallocated, intermediate files are deleted, and the final MP4 is presented via the system share sheet. A progress bar is shown throughout the export.
Full pipeline diagrams

[Diagram: Image Export pipeline]

[Diagram: Video Export pipeline (the capture/encode loop repeats for every frame)]
Image vs Video
Image Export
- Single drawHierarchy call on the live key window
- Background cropped from the original full-res image
- Glass captured at screen resolution, composited over full-res
- Output: JPEG at 95% quality
- Near-instant — feels immediate
- No offscreen window needed
Video Export
- Two drawHierarchy calls per frame on the offscreen window
- Each frame decoded via AVAssetReader + CIImage
- Offscreen UIWindow with a real glass compositor
- Output: HEVC-encoded MP4 with muxed audio
- Progress bar — can take 10–30s
- Cancellable via Task with cleanup
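The cancellation pattern could look roughly like this; nextFrame, processAndEncode, and cleanUpExportResources are hypothetical placeholders for the real pipeline stages:

```swift
import CoreVideo

// Hypothetical placeholders standing in for the real pipeline stages
func nextFrame() -> CVPixelBuffer? { nil }
func processAndEncode(_ frame: CVPixelBuffer) throws {}
func cleanUpExportResources() {}

var exportTask: Task<Void, Error>?

func startExport() {
    exportTask = Task {
        // Cleanup runs whether the loop finishes, throws, or is cancelled
        defer { cleanUpExportResources() }
        while let frame = nextFrame() {
            try Task.checkCancellation() // throws CancellationError when cancelled
            try processAndEncode(frame)
        }
    }
}

func cancelExport() {
    exportTask?.cancel()
}
```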
Why drawHierarchy is everything
The entire sticker export system is built around one limitation: glass effects can't be captured through SwiftUI's ImageRenderer. They only exist in the compositor.
ImageRenderer
Renders the SwiftUI view tree directly — but glass effects are compositor-level. You get a flat, un-glassed render.
drawHierarchy
Captures what the CALayer tree actually looks like on screen — including backdrop filters, glass, and blending.
The tradeoff
drawHierarchy is slow — it forces a full compositor pass. For video, hundreds of calls add up, which is why there's a progress bar.
The UIWindow trick
For video, we can't use the live screen. So we create a hidden window at level -1000 — invisible but composited.
5 sticker designs
Each variant uses the same glass system but shows different levels of detail. The user swipes through them in a carousel (no background) or taps to cycle (with background).
The stack
SwiftUI Layer
- .glassEffect(.clear): native liquid glass
- GlassBackground: custom ViewModifier
- StickerPalette: glass-aware colors
- TextElementBackground: inner element materials
- Environment flags: opacity · glass · transparent

Capture + Export Layer

- drawHierarchy: compositor capture
- UIWindow (level -1000): offscreen render target
- AVAssetReader/Writer: frame decode/encode
- CIContext + CIImage: orient, crop, render
- AVMutableComposition: audio muxing
- UIHostingController: SwiftUI → UIKit bridge