Context
Naturally, iCloud photo sharing is limited to iOS/macOS devices. For those without Apple hardware, you can create a web link to each shared album, which is what I've been using on my Android until now.
AirPhoto solves several issues with the existing web app:
- The album's images are heavily compressed and served at a lower resolution.
- Comments are not shown.
- Likes are not shown.
- Each album has its own web link, so you need a separate bookmark or home screen icon for every one.
- More often than not, the web app takes several minutes to render even a single image.
This clearly isn't a viable alternative, and Apple has no incentive to make it one. I've been working on my own solution in the meantime.
The Challenge
macOS caches iCloud data locally on the machine, with daemons periodically refreshing it from Apple's servers. That data is stored as relatively simple SQLite databases. Several tables, when joined, give complete information for each album, each photo in those albums, and all the comments/likes on those photos.
Easy enough!
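To give a sense of the query side, here is a minimal sketch of reading that cache with Go's database/sql. The database path and the table/column names below are placeholders for illustration, not Apple's actual schema.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/mattn/go-sqlite3" // cgo-based SQLite driver
)

func main() {
	// Placeholder path and schema: Apple's real cache location and
	// table/column names differ from these illustrative ones.
	db, err := sql.Open("sqlite3", "/path/to/cloud-sharing-cache.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Join album, asset, and comment tables into one view of the shared data.
	rows, err := db.Query(`
		SELECT al.name, a.filename, c.text
		FROM albums AS al
		JOIN assets AS a ON a.album_id = al.id
		LEFT JOIN comments AS c ON c.asset_id = a.id`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var album, file string
		var comment sql.NullString
		if err := rows.Scan(&album, &file, &comment); err != nil {
			log.Fatal(err)
		}
		fmt.Println(album, file, comment.String)
	}
}
```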
One hurdle is that some chunks of data are stored in the tables as binary plist blobs. AirPhoto needs to be able to decode plist data on the host machine to make sense of each entry, and there isn't much support for that outside of Swift/Obj-C.
It became clear that AirPhoto, like its spiritual cousin AirMessage, needed to be split into two parts:
- A server running on a macOS machine.
- A client app.
The Solution
I've broken the server component down into its logical constructs: albums, photos (assets), and comments. This follows the structure defined by Apple in the SQLite database and makes jumping between them easier.
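Roughly, that maps onto a handful of core types on the server. The fields below are an illustrative guess at the shape rather than AirPhoto's exact structs.

```go
package airphoto

import "time"

// Illustrative shapes only; the real structs differ in detail.
type Album struct {
	ID     string
	Name   string
	Assets []Asset
}

type Asset struct {
	ID       string
	Filename string
	Comments []Comment
}

// One way to model likes is as comments with a flag;
// the actual cache layout may differ.
type Comment struct {
	Author string
	Text   string
	IsLike bool
	Posted time.Time
}
```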
I put the messy but reusable plist parser into its own Go module. It wraps the plist conversion tool plutil and parses its output for the desired information.
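The idea is to shell out to plutil and read back a structured format. A minimal sketch of that approach (the helper name, flags, and output format here are my own, not necessarily the module's) might look like this:

```go
package plist

import (
	"bytes"
	"encoding/json"
	"fmt"
	"os/exec"
)

// DecodeBlob converts a binary plist blob into a generic map by piping it
// through macOS's plutil. Sketch only: the real module's flags and output
// format may differ, and plists containing raw <data> values would need
// xml1 output instead of JSON.
func DecodeBlob(blob []byte) (map[string]interface{}, error) {
	// "-" reads the plist from stdin; "-o -" writes the converted form to stdout.
	cmd := exec.Command("plutil", "-convert", "json", "-o", "-", "--", "-")
	cmd.Stdin = bytes.NewReader(blob)

	out, err := cmd.Output()
	if err != nil {
		return nil, fmt.Errorf("plutil: %w", err)
	}

	var decoded map[string]interface{}
	if err := json.Unmarshal(out, &decoded); err != nil {
		return nil, err
	}
	return decoded, nil
}
```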
On startup, the server does an initial walk-through of every asset in every album. Because each plist blob needs to be parsed individually, this startup often takes a while.
After starting, the server watches the SQLite file for changes and, when necessary, refreshes albums for new photos and assets for new comments/likes. I added a toggle for sending Firebase notifications (missing from Apple's web solution) when the server detects new activity.
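The watch loop could be sketched with the fsnotify package, as below; whether the real server uses fsnotify, kqueue directly, or simple polling is an implementation detail this post doesn't pin down.

```go
package server

import (
	"log"

	"github.com/fsnotify/fsnotify"
)

// watchDatabase calls refresh whenever the cache file is written to.
// Sketch only; with SQLite in WAL mode writes may hit a -wal sidecar,
// so watching the containing directory can be more reliable in practice.
func watchDatabase(path string, refresh func()) error {
	watcher, err := fsnotify.NewWatcher()
	if err != nil {
		return err
	}
	defer watcher.Close()

	if err := watcher.Add(path); err != nil {
		return err
	}

	for {
		select {
		case event, ok := <-watcher.Events:
			if !ok {
				return nil
			}
			if event.Op&fsnotify.Write != 0 {
				// Re-scan for new photos, comments, and likes; this is also
				// where the optional Firebase notification would be sent.
				refresh()
			}
		case err, ok := <-watcher.Errors:
			if !ok {
				return nil
			}
			log.Println("watch error:", err)
		}
	}
}
```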
Finally, the server listens on port 1459 and exposes a REST API for querying all of the gathered information.
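On the API side, plain net/http is enough. The endpoint path and payload below are illustrative placeholders rather than AirPhoto's actual routes.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()

	// Illustrative endpoint; the real API also needs routes for assets,
	// comments/likes, and the image data itself.
	mux.HandleFunc("/albums", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		// Static placeholder standing in for the server's in-memory state.
		json.NewEncoder(w).Encode([]string{"Family", "Vacation"})
	})

	log.Fatal(http.ListenAndServe(":1459", mux))
}
```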
The server does a fair bit of allocation, but memory usage remains low, and it can run in the background on a dual-core 2013 laptop.
Front-end Development
I’m continuing to dip my toes into React Native development with this client app, this time using the UI framework UI Kitten. This gave me quite a leg up in development time, as the example app and detailed documentation got AirPhoto up and running in no time.
The design is simple and barebones, but functional. I wanted a clean white experience with no branding. The album page is an infinite-scroll experience with lazy-loaded images to save bandwidth and other resources.
Next Steps
Parsing video files
Right now AirPhoto only parses images. (Update: video files are now supported!)
Converting to Redux for state management
As this app grows, a more robust tool for handling state and HTTP requests would be worthwhile.
Sending Comments and Likes
AirPhoto is unfortunately read-only at the moment. Any writes to the SQLite database are discarded by the daemon, and, unlike iMessage, Apple does not expose any Automator API for posting to or otherwise interacting with iCloud photos.
Further down the road it may be worthwhile to sniff outgoing packets to iCloud servers with Wireshark, but currently that's much more effort than walking over to my MacBook to respond.