A few weeks ago, I read a Twitter thread by a game designer at Riot Games about the concept of “celebration” in game design: the techniques used to reinforce “good” behavior in video games (good not in any objective sense, but simply whatever the game designer desires). It was an interesting thread, but I moved on shortly after reading it and didn’t give it much more thought until last week, when I was reading an article about how social media is designed to be addictive. With the thread still floating around in the back of my mind, I realized that Twitter uses the exact same techniques as video games.
I’ve had an M1 Mac mini for about a month now, so I thought I’d write up my experiences with it. The configuration I got has 16GB of RAM and the base 256GB of storage. I went with the bare minimum storage because Apple charges an arm and a leg for extra space, and I intend this primarily to be a stopgap machine until there are higher-power laptops equipped with Apple Silicon. If I desperately need extra space down the line, I can buy a cheap external SSD (which will continue to be useful after this computer is gone). The 16GB of RAM, however, is necessary for any iOS development (Xcode and its various associated services can easily consume almost 8 gigs by themselves). So far, I’ve moved just about all of my non-work desktop computing over to it, and it’s been absolutely fantastic.
Armed with the ID3 decoder from my last post, we can extract most of the metadata from MP3 files. However, the one piece I still want for my music cataloging software is the track duration, which, for the vast majority of my files, is not included in the ID3 tag. Getting the duration of an audio file isn’t as straightforward as I had hoped. One of the easiest solutions would be to shell out to another piece of software, such as ffmpeg, which can handle a great many audio formats. But that would be boring, and I wanted to minimize the number of dependencies, which meant writing a rudimentary MP3 decoder myself. Luckily, I don’t need to actually play back the audio, so I can avoid a great deal of complexity: I only need to parse enough of the MP3 file to figure out how long it is.
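To give a flavor of what that entails, here’s a minimal sketch of parsing a single MPEG-1 Layer III frame header (the module and function names are hypothetical, not code from the actual project). Each frame starts with an 11-bit sync word; the bitrate and sample-rate indices then determine both the frame’s length in bytes (so we can skip to the next frame) and its duration, since every MPEG-1 Layer III frame carries 1152 samples:

```elixir
# Minimal sketch of MPEG-1 Layer III frame-header parsing.
# Names are hypothetical; lookup tables are from the MPEG-1 spec.
defmodule FrameHeader do
  # Bitrates in kbps, indexed by the 4-bit bitrate field (index 0 is "free",
  # index 15 is invalid). Sample rates indexed by the 2-bit field.
  @bitrates {0, 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320}
  @sample_rates {44_100, 48_000, 32_000}

  # An MPEG-1 Layer III header: 11 sync bits, version 0b11 (MPEG-1),
  # layer 0b01 (Layer III), CRC bit, bitrate index, sample-rate index,
  # and a padding bit. The rest of the header/frame is ignored here.
  def parse(<<0b11111111111::11, 0b11::2, 0b01::2, _crc::1,
              bitrate_idx::4, rate_idx::2, padding::1, _rest::bits>>)
      when bitrate_idx in 1..14 and rate_idx in 0..2 do
    bitrate = elem(@bitrates, bitrate_idx) * 1000
    sample_rate = elem(@sample_rates, rate_idx)

    %{
      # Every MPEG-1 Layer III frame holds 1152 samples.
      duration: 1152 / sample_rate,
      # Frame length in bytes, per the standard formula for Layer III.
      length: div(144 * bitrate, sample_rate) + padding
    }
  end

  def parse(_), do: :error
end
```

Summing `duration` over every frame (using `length` to hop from header to header) yields the total track time, at least for constant-bitrate files.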
On and off for the past year and a half or so, I’ve been working on a small side project to catalog and organize my music library, which is predominantly composed of MP3 files. There are existing pieces of software out there that will do this (such as Beets and Airsonic), but, as many a programmer will attest, sometimes it’s just more fun to build your own. The choice of language was easy. For a couple of years now, Elixir has been my favorite for any back-end web dev. I also had an inkling that its powerful pattern matching facilities could work on arbitrary binary data—perfect for parsing file formats.
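As a small illustration of why that’s appealing (this is a toy example, not code from the project): a big-endian, length-prefixed field can be destructured declaratively in a single match, with the size of one segment driven by the value of another:

```elixir
# Toy demonstration of Elixir's binary pattern matching: pull a
# length-prefixed payload out of a binary in one declarative match.
defmodule BinaryDemo do
  # len is read as a 16-bit big-endian integer, and that value then
  # determines how many bytes the payload segment consumes.
  def take_field(<<len::big-16, payload::binary-size(len), rest::binary>>) do
    {payload, rest}
  end
end

# BinaryDemo.take_field(<<0, 5, "hello", "tail">>)
# matches len = 5, so payload is "hello" and rest is "tail".
```

No manual offset arithmetic, no intermediate slicing; the shape of the data is the code.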
I knew that MP3 files had some embedded metadata only because looking at most tracks in Finder shows album artwork and information about the track. Some cursory googling led me to the ID3 spec.
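The entry point of that spec is a fixed 10-byte header at the start of the file, which fits Elixir’s binary matching nicely. A sketch (module name hypothetical, not the real decoder): the header is the ASCII string “ID3”, two version bytes, a flags byte, and a four-byte “synchsafe” tag size, where each byte contributes only 7 bits and the high bit is always zero:

```elixir
# Sketch of parsing the 10-byte ID3v2 header (names are hypothetical).
defmodule ID3Header do
  # "ID3", major/revision version bytes, flags, then a 4-byte synchsafe
  # size: the high bit of each byte is 0, leaving 7 useful bits apiece.
  def parse(<<"ID3", major, revision, flags,
              0::1, s1::7, 0::1, s2::7, 0::1, s3::7, 0::1, s4::7,
              _rest::binary>>) do
    # Recombine the four 7-bit groups into the 28-bit tag size.
    size = s1 * 2_097_152 + s2 * 16_384 + s3 * 128 + s4
    %{version: {major, revision}, flags: flags, tag_size: size}
  end

  def parse(_), do: :error
end
```

Everything after those 10 bytes, up to `tag_size`, is the tag body full of frames (title, artist, artwork, and so on).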
That’s right, it’s time for this month’s installment of the never-ending SwiftUI text view saga! The text view previously implemented is of course auto-expanding and has scrolling disabled. While this mostly works, it has a rather unfortunate UX problem. Let’s say the user is typing into the text view and reaches the bottom of the screen. As they continue to type, the text wraps onto the next line and the caret goes with it. But because they’re already at the bottom of the screen (or immediately above the keyboard), the caret, along with the text they’re currently typing, is no longer visible.