So, about six months ago I decided I wanted to rewrite my perfectly working blog backend in Rust. Why? Because I was bored and wanted an excuse to use Rust more.
I recently got nerd-sniped by this tweet from Zhuowei Zhang about playing the Bad Apple video in the Dynamic Island on the iPhone 14 Pro. His original implementation used a webpage and the Media Session API, and this worked, but the system plays an animation every time the artwork changes, so the framerate was limited to about 2 FPS. Not ideal for watching a video. So, I wanted to see how much closer to watchable I could get.
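To make the mechanism concrete, here's a rough sketch of what the artwork-swapping approach looks like. This is not Zhang's actual code; the frame paths, sizes, and frame rate are made-up placeholders, and only the `navigator.mediaSession` / `MediaMetadata` usage is the real browser API.

```javascript
// Sketch of the Media Session trick: the video is pre-split into still
// frames, and each frame is pushed as "album artwork" on a timer. The
// Dynamic Island mirrors whatever artwork is current.
const FRAME_RATE = 2; // the system's artwork-change animation caps this

// Build a MediaMetadata-shaped payload for a given frame index.
// (frames/<i>.png is a hypothetical path, not from the original project.)
function artworkForFrame(i) {
  return {
    title: "Bad Apple!!",
    artwork: [{ src: `frames/${i}.png`, sizes: "512x512", type: "image/png" }],
  };
}

let frame = 0;
function tick(totalFrames) {
  // Guarded so the sketch doesn't throw outside a browser.
  if (typeof navigator !== "undefined" && "mediaSession" in navigator) {
    navigator.mediaSession.metadata = new MediaMetadata(artworkForFrame(frame));
  }
  frame = (frame + 1) % totalFrames;
}
```

Each `tick` would be driven by `setInterval(() => tick(totalFrames), 1000 / FRAME_RATE)` while audio plays, which is what keeps the media session (and thus the island) alive.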
This post isn’t going to be a detailed guide or anything, just a collection of some mildly interesting things I learned.
I’m very excited for the project I’ve been working on all year to finally be public. LiveView Native is a library that lets you build native apps backed by Phoenix LiveView. I’ve been developing the iOS client, which is built with SwiftUI.
Using LiveView Native lets you avoid duplicating business logic on the frontend and saves you from implementing dedicated APIs for native apps. The iOS client can be integrated into any existing app with a single SwiftUI view, so it’s easy to adopt one screen at a time.
We’ve also developed a simple chat app that was used by attendees of ElixirConf this year and serves as a complete example of a LiveView Native app.
I’m very excited to see what people build with it.
With iOS 16, Apple switched on TextKit 2 for UITextView. But if you access any of the TextKit 1 objects on the text view, it automatically falls back to a compatibility mode. All of the work I did to mimic Safari’s link context menu animation was, of course, using the TextKit 1 APIs, so it was blocking me from fully adopting TextKit 2. So, here’s how to update that code.
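For reference, here's a minimal sketch of the fallback behavior described above. The property names (`textLayoutManager`, `layoutManager`) are the real UIKit ones on iOS 16; the surrounding code is illustrative only.

```swift
import UIKit

let textView = UITextView()

// TextKit 2 path: on iOS 16 the view starts out with an NSTextLayoutManager.
// Checking this property does not trigger the compatibility fallback.
if textView.textLayoutManager != nil {
    // still running on TextKit 2
}

// TextKit 1 path: merely reading `layoutManager` (an NSLayoutManager) makes
// the view fall back to TextKit 1 compatibility mode, after which
// `textLayoutManager` becomes nil for that view.
let legacyLayoutManager = textView.layoutManager
```

The gotcha is that the fallback is one-way and triggered by a property read, so even diagnostic code that merely inspects `layoutManager` will opt the view out of TextKit 2.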