Live Activities (and Bad Apple)

I recently got nerd sniped by this tweet from Zhuowei Zhang about playing the Bad Apple video in the Dynamic Island on the iPhone 14 Pro. His original implementation used a webpage and the media session API, and this worked, but the system plays an animation when the artwork changes, so the framerate was limited to 2 FPS. Not ideal for watching a video. So, I wanted to see how much closer to watchable I could get.

This post isn’t going to be a detailed guide or anything, just a collection of some mildly interesting things I learned.

Before I started this, I was already aware that Live Activity updates had a 4KB limit on the size of the dynamic state. So, the first thing I worked on was encoding the video into a smaller format. Because this particular video is black and white, I encoded it as 1 bit-per-pixel. Each frame is also complete, so the widget doesn’t need access to any previous frames to display the current one.

Since I wanted to display it in the Dynamic Island, the video is also scaled down to be very small, 60 × 45 pixels—one eighth the resolution of the original. This is a convenient size because each row of pixels can be encoded as a single UInt64. The entire frame comes out to 45 × 8 = 360 bytes, which is plenty small.[1]
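The packing scheme can be sketched in a few lines. This is a minimal illustration, assuming the decoded video frame arrives as a `[[Bool]]` bitmap; the function name is mine, not the app’s:

```swift
// Pack one 60 × 45 one-bit frame into 45 UInt64s (360 bytes total).
// `bitmap[row][col]` is true for a white pixel.
func encodeFrame(_ bitmap: [[Bool]]) -> [UInt64] {
    bitmap.map { row in
        var packed: UInt64 = 0
        for (col, pixel) in row.enumerated() where pixel {
            // Set the bit corresponding to this column.
            packed |= 1 << UInt64(col)
        }
        return packed
    }
}
```

45 rows × 8 bytes per `UInt64` gives the 360 bytes per frame mentioned above.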

The whole video is encoded when the app starts up, which takes about 8 seconds. That’s faster than real time (the video is 3m39s), so it could be done during playback, but doing it ahead of time is fast enough that I didn’t feel like putting in the work to optimize a shitpost.

The widget can then unpack the encoded frame into a bitmap that can be turned into an image.
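The corresponding unpacking step, again as a hedged sketch (the real widget then turns the resulting bitmap into an image, which is omitted here):

```swift
// Unpack 45 packed rows back into a Bool bitmap that the widget
// can render. Illustrative names, not the app's actual API.
func decodeFrame(_ rows: [UInt64], width: Int = 60) -> [[Bool]] {
    rows.map { packed in
        // Test each column's bit in the packed row.
        (0..<width).map { col in packed & (1 << UInt64(col)) != 0 }
    }
}
```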

Adding the Live Activity wasn’t difficult—the API is wonderfully straightforward—but, alas, updating it was not so.

While the app was in the foreground, ActivityKit would log a message whenever I asked it to update the activity. But, when the app went into the background, those messages were no longer logged—even though my code was still running and requesting updates. Interestingly, the app is considered to be in the foreground if you open Notification Center while in the app, and so the activity can be updated, which is how this demo came about:

I scratched my head over the issue of background updates for a while, and tried a couple of things to no avail, until I attached Console.app to my phone and filtered for “activity”. At that point, I saw a bunch of messages like this one from sessionkitd (the system daemon that manages Live Activities):

com.apple.activitykit sessionkitd xpc Process is playing background media and forbidden to update activity: 984

Apps playing background audio seem to be completely forbidden from updating Live Activities. The only reason for this I can imagine is to prevent apps from making their own now-playing activities rather than relying on the system one. Whatever the rationale, back to trying to find workarounds.

At this point, I downloaded the most recent iOS 16 IPSW for the iPhone 14 Pro. After extracting the dyld shared cache from the image (using this helpful tool) I started poking around to try and find the code responsible for deciding if an app is allowed to update its activities. Opening the dyld shared cache in Hopper and searching for “session” revealed several promising-sounding frameworks: SessionCore, SessionFoundation, and SessionKit.

SessionCore turned out to be the one containing that log message. Unfortunately, it’s written in Swift, and I’m not good enough at reverse engineering to decipher what it’s actually doing. I did, however, manage to find a couple of tidbits just by looking through the strings in the binary:

  1. An entitlement named com.apple.private.sessionkit.backgroundAudioUpdater

It wasn’t of much use to me, but if someone ever manages to jailbreak these devices, you could have some fun with it.

  2. A log message reading “Process looks like a navigation app and can update activity”

This looked more promising because the phrasing “looks like” suggests to me that it’s just using a heuristic, rather than determining what the app is for certain. I tried to trick it by adding entitlements for MapKit and location, tracking the user’s location while in the background, and trying various audio session categories that seemed more map-app like. But this effort was to no avail, sessionkitd still forbade me from updating the activity in the background.

At this point, I gave up on getting audio working and just settled for playing the video in the Dynamic Island. I had to scrap the previous implementation of detecting new frames because it used AVPlayer and is therefore incompatible with updating in the background. But, since I have an array of frames, playing it back myself is as simple as using a timer to emit a new one every 1/30th of a second.
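That playback loop can be sketched as follows, assuming `frames` is the array of encoded frames. The pure frame-index math is factored out; the actual ActivityKit update call is only indicated in a comment, since the surrounding scaffolding is hypothetical:

```swift
import Foundation

// Which 30 FPS frame should be showing after `elapsed` seconds of playback?
func frameIndex(elapsed: TimeInterval, frameRate: Double = 30, frameCount: Int) -> Int {
    min(max(Int(elapsed * frameRate), 0), frameCount - 1)
}

// Driving playback with a Timer (hypothetical scaffolding; the real
// update goes through ActivityKit's Activity.update, omitted here):
// let start = Date()
// Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { timer in
//     let index = frameIndex(elapsed: Date().timeIntervalSince(start), frameCount: frames.count)
//     // Task { await activity.update(using: .init(frame: frames[index])) }
//     if index == frames.count - 1 { timer.invalidate() }
// }
```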

With that, I was finally able to play back the video in the island:

You may notice that in both attempts the video appears somewhat blurry. This is because iOS animates any changes to the Live Activity view. As with all widgets, the SwiftUI view tree is serialized, then deserialized and displayed in a separate process, so you don’t have direct control over the animation. There is a new ContentTransition API that the Live Activity docs say should work, but unfortunately the identity transition, “which indicates that content changes shouldn’t animate”, has no effect on the activity.

Just having the video in the island was pretty satisfying to see. But, I still really wanted to see the video playing in the island with the audio.

Zhuowei suggested using the system Music app to play back the sound while my app controlled the video. This worked, though it means the app is no longer entirely self-contained. You can use the MPMusicPlayerController.systemMusicPlayer to control the system Music app, and playing back a particular track is simple enough:

let player = MPMusicPlayerController.systemMusicPlayer
// Find the track in the user's library by its title.
let query = MPMediaQuery.songs()
query.addFilterPredicate(MPMediaPropertyPredicate(value: "badapple", forProperty: MPMediaItemPropertyTitle))
player.setQueue(with: query)
do {
	try await player.prepareToPlay()
	player.play()
} catch {
	// if the song doesn't exist in the library, ignore the error
}

Annoyingly, the only way of getting the track into the Music app on my phone was by disabling iCloud Music Library and syncing it from my Mac. Why iCloud needs to be disabled to let you manually sync files, I do not know—especially seeing as the manually-synced tracks remain on the device even after iCloud is turned back on.

And this, it turns out, is impossible to record, because the Music app apparently mutes itself during screen recordings (whether using the built-in Control Center method or attaching to QuickTime on a Mac). So, you’ll just have to take my word for it.

This was a fun, if utterly pointless, endeavor. If you want to see the code or run it yourself, it’s available here.


  1. Zhuowei pointed out that by using a palette, you could even get a color frame of the same size into 2.8 kilobytes, without getting into any fancy encoding techniques. ↩︎
