Hello there! You might have landed here with the intent of just getting your mitts on the source code to give this a try yourself, and don’t really care for long, theatrical stories like this.
If you’re one of those people, feel free to hop over to the CBRumblr repo (for the frontend) and the CBRumblrAPI repo (for the backend) to get started. I like to think both are pretty well-documented, and there’s also a guide to get you started over here.
The Tea Building, Shoreditch, London. Where our story begins.
Our story begins, like most of our stories do, on the second floor of The Tea Building, at a little desk near the entrance we like to call Combo HQ.
Our little startup studio, Combo, has recently taken on a pretty hefty project that’s allowed us to spend some time exploring beacons. The trouble is, actually learning the ins and outs has been nothing short of challenging.
We’ve had a tonne of support from the excellent folks over at Wayfindr, but we wanted to get our hands dirty.
We wanted to push our limits.
We wanted to find out exactly what these Bluetooth bad boys were capable of.
I don’t even know who this is, but he must be a pretty big deal to have all that money spent on pyrotechnics for him
Remember WWE?
…If I’m being perfectly honest, I don’t.
Well, not particularly well, anyway. My “experience” comes entirely from playing the PlayStation games. Looking back, I probably spent most of my time making my own custom players and entrances.
With The Undertaker’s recent retirement dominating the headlines, the federation was on my mind.
With my fingers itching, I glanced across the studio floor. A beacon attached clumsily to the studio entrance door. Music blaring over the Sonos.
I was struck with inspiration.
It was time to revisit those times of old.
It was time to become the wrestler I’d always dreamed of becoming.
I grabbed a pen and paper and began scribbling frantically. After less than a minute, I emerged with a rough concept of how I envisioned things would work:
Huh. Seemed fairly simple.
Some seriously huge tunes in that queue.
I hit my first hurdle pretty quickly. I’d foolishly assumed that a Sonos system would have some kind of accessible API — but turns out I was sorely mistaken.
Nonetheless, I wouldn’t be deterred. I fired up Wireshark, and a little packet sniffing showed that all traffic sent from the Mac client was put together in enigmatic XML payloads.
Huh.
Thankfully, a quick search yielded an excellent package called sonos on GitHub, put together by the delightful Ben Evans (cheers, Ben).
With that, I cloned my ES6 boilerplate, fired up Sublime, and got to work.
A quick dive into the library files helped me break things down a little:
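In essence, the package lets you point a client at a speaker on your local network and drive it directly. Something along these lines (a rough sketch rather than the actual Rumblr code, assuming a recent promise-based release of node-sonos; the IP address and track URI are placeholders):

```js
const { Sonos } = require('sonos')

// Point the client at a speaker on the local network (placeholder IP)
const speaker = new Sonos('192.168.1.50')

// Naive first pass: stick the track on the queue and hit play
async function naivePlay (trackUri) {
  await speaker.queue(trackUri) // append the track to the current queue
  await speaker.play()          // start (or resume) playback
}

naivePlay('spotify:track:<some-track-id>')
  .catch(err => console.error('Sonos said no:', err))
```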
Well, that seems fairly straightforward…
…however, once I’d got my teeth into it, I kept encountering random 500 errors on the play method that seemed to stem from the Sonos response itself.
After trawling through endless GitHub issue after GitHub issue, emitting occasional frustrated grunts, I had an epiphany:
This was just a little experiment. Time was fairly short.
Get on with it.
Instead of cueing something up after whatever’s playing currently, I found I could add the track to the beginning of the Sonos queue (position 1), jump to that position, and then play from there.
I also found that I needed to perform an additional step first: selectQueue. If the queue isn’t active (for example, if the Sonos is playing a radio station), we need to switch to it before we can add our entrance music.
Oh, and for funsies, I threw in one more step: turning the volume up to 60% before the track starts playing. 🤙
A quick code revision later:
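Here’s roughly how that revision shakes out (again, a sketch rather than the exact code you’ll find in CBRumblrAPI, assuming a promise-based node-sonos and the same placeholder speaker as before):

```js
// Revised flow: make the queue active, sneak the track in at position 1,
// jump straight to it, crank the volume, then play.
async function playEntranceMusic (trackUri) {
  await speaker.selectQueue()      // switch the source to the queue if it isn't already
  await speaker.queue(trackUri, 1) // add the track at the front of the queue
  await speaker.selectTrack(1)     // jump to position 1
  await speaker.setVolume(60)      // for funsies: 60% volume
  await speaker.play()             // and away we go
}
```

Exposing that to the outside world then only needs a thin route on top. Something like this, assuming an Express-style server (the /entrance path and trackUri field are invented for the example, not necessarily what the repo exposes):

```js
const express = require('express')
const app = express()

app.use(express.json())

// Hypothetical endpoint the iPhone hits when it spots the door beacon
app.post('/entrance', async (req, res) => {
  try {
    await playEntranceMusic(req.body.trackUri)
    res.sendStatus(204)
  } catch (err) {
    res.status(500).send(err.message)
  }
})

app.listen(3000)
```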
I set up the routes, fired off a request, and bingo.
The Final Countdown victoriously blasted across the second floor of The Tea Building in the middle of (what was) a chilled-out Thursday afternoon.
With my hands raised in the air in victory, I glanced around at the small group of disgruntled-looking individuals now reaching for their earphones and plunging them into their ears.
I decided to keep future victories strictly internal.
For the client side of this project, I figured that an iOS app would be best suited.
Why?
Well, Swift’s our bread and butter, and we were keen to create a class that could be abstracted and used for future apps that use beacons.
First off, I picked up a starter pack of beacons. I chose to go with BlueCats, primarily because Axel over at Wayfindr had sung their praises so highly, but also because they had killer support. (To back this up, they actually had the devices delivered by hand to the studio and demo’d in person, which was bloody incredible.)
Fun fact: I found out after I’d bought these that I could have actually used another iPhone as a “beacon”. Curse my impulsive “Add To Basket” trigger finger.
With my beacons bought and my server endpoint at the ready, I reached for my bright red notebook once again and began scribbling a rough plan of action for the app.
From what I’d read about beacons, this felt fairly reasonable.
Turns out, the iBeacon API gives you a tonne of freedom in what you can do with beacons, but I had a nightmare navigating the documentation.
For me, the copy just came across as overly technical, and even getting a simple example up and running was fairly taxing.
With overwhelming documentation, not even my strategic beacon placement could save me now.
Nonetheless, I battled through.
I learned a lot, and don’t want to over-do the beacon talk on this post — so if you’re keen to hear about my learnings, I’ve stuck all of my thoughts in a big write-up on beacons and iOS right here.
Unfortunately, another roadblock.
This one came in the form of a realisation that triggering the server in the background might be trickier than I’d hoped.
mfw making this discovery
Whilst I could successfully fire off a URL request in the didDetermineState delegate method (which is fired when a beacon’s been discovered in the background), I only wanted to fire it when we were right next to the door. To measure the signal strength (as itemised in that big beacon breakdown I mentioned a minute ago), you need to be ranging beacons, not just monitoring for them.
Of course, to do that, the app needs to be in the foreground. As a result, I decided that the actual beacon scanning would happen in the foreground.
No biggie.
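If you’re curious what that foreground scanning boils down to, here’s a bare-bones Core Location sketch (not the Rumblr class itself; the UUID, class name and proximity check are placeholders, and the proper treatment lives in that beacon write-up):

```swift
import CoreLocation

final class DoorBeaconScanner: NSObject, CLLocationManagerDelegate {

    private let locationManager = CLLocationManager()

    // Placeholder UUID: swap in whatever your beacons are actually configured with
    private let doorRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "00000000-0000-0000-0000-000000000000")!,
        identifier: "studio-door"
    )

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
    }

    // Ranging (unlike monitoring) gives us proximity readings, but only while in the foreground
    func startScanning() {
        locationManager.startRangingBeacons(in: doorRegion)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Only trigger the entrance music when we're right next to the door
        guard let nearest = beacons.first, nearest.proximity == .immediate else { return }
        // ...fire the request off to the server here
    }
}
```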
With the flow feeling pretty technically sound, it was time to put some designs together. I threw together something in Sketch in about 5 minutes…
Whilst we’re here, if you’ve not listened to this RX Bandits tune, you absolutely owe it to yourself to give it a try.
…before coming to the conclusion that it looked pretty terrible.
Instead, I recruited the expert design skills of Curtis to put it together for me.
That gradient, tho.
Ah. Much better.
With designs in hand, I hacked the logic mentioned above together into abstract controllers, and got knee-deep in VFL and UIView.animate() to get the app lookin’ fresh.
That transition tho.
Beacons were sorted. Server was sorted; we were almost there.
The only piece of the puzzle I hadn’t solved was actually getting the song IDs that I’d push through to the Sonos.
Thankfully, Spotify’s Web API is an absolute breeze to use, and to keep things clean, I thought I’d do all the Spotify parsing and remodelling on the server side to trim the fat from the payload delivered to the device.
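By way of illustration, the server-side trimming might look something like this (a sketch under a few assumptions: node-fetch for the HTTP call, an access token already obtained via Spotify’s client-credentials flow, and a pared-down track shape of my own choosing):

```js
const fetch = require('node-fetch')

// Search Spotify and strip the response down to just what the app needs
async function searchTracks (query, accessToken) {
  const response = await fetch(
    `https://api.spotify.com/v1/search?type=track&limit=10&q=${encodeURIComponent(query)}`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  )
  const json = await response.json()

  // Trim the fat: the device only needs a handful of fields per track
  return json.tracks.items.map(track => ({
    id: track.id,
    title: track.name,
    artist: track.artists.map(artist => artist.name).join(', '),
    artwork: track.album.images.length ? track.album.images[0].url : null,
    uri: track.uri // the Spotify URI we eventually hand over to the Sonos
  }))
}
```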
I punched npm run prod into Terminal.
I hit Command+R on Xcode. My war-torn iPhone 6 sprang into life, launching the shiny-lookin’ Rumblr iOS app.
And our very first tester, the honourable Marco Martignone, had selected AC/DC’s “Back In Black” as his entrance music.
He disappeared through the second-floor doors as we lay in wait, our breath bated…and, well…
…let’s just say this little clip tells the story a lot better than I could.
Beacons are brilliant fun.
Yeah, they’re great for toys like this, but what really excites us is the potential. Beacons can (and should!) be used in incredible ways to benefit others (like the aforementioned Wayfindr).
Finally, putting this source code “out there” is a really big deal for us, too.
A lot of the learnings we’ve made wouldn’t even have been possible without the incredible help of some of the open-source repos we found on GitHub, so we’re really happy to add to that by open-sourcing the frontend and backend source code over on our GitHub.
With this, we’re hoping that by sharing the learnings from putting together this silly little project, we can help move the needle a little further towards what the future of beacons could be.
On top of that, we’re working on abstracting the learnings we’ve taken away from putting together this little product into a wrapper class to make navigating the tricky waters of beacon monitoring and ranging just a tad easier.
Open sourcing isn’t really something we’ve done at Combo before, but it’s exciting, and as long as it’s helping someone, somewhere, we’re keen to keep doing it, too.
We also like to think what we’ve put together might put a smile on someone’s face in some dreary office somewhere — so let us know if you’ve used this and have had some fun with it by dropping me an email on [email protected].