I Built a Botnet that Could Destroy Spotify with Fake Listens
Automated streaming is the next frontier of click fraud.
Did you know you can leave a muted Spotify playlist on repeat all night and generate roughly 72 cents for your favorite band? Or that you could previously leave a browser tab of Eternify open all day and net the band $2.30?
Better yet, did you know you can program a botnet on your old laptop to generate $30 a day in fake Spotify listens?
These gratuities may seem harmless or even deserved, but they foreshadow a major vulnerability in the current model of online music streaming. Just as publishers learned about click farming, streaming music services are learning about listen farming. And if automated listening continues unchallenged, music streaming may cease to provide any meaningful income for legitimate (even popular) musicians.
Peter Fillmore, a security consultant in Melbourne, was among the first to demonstrate that automated programs could generate massive royalties back in 2013 by having software-based "robots" listen to his own (comically horrible) music nonstop.
Though Fillmore made around $1,000 in royalties and topped the Australian charts of streaming service Rdio, he says his motivations were benign. "I was focused more on working out what mechanisms were there to prevent this type of fraud—and what the potential payouts would be," he told me in an email.
In the time since Fillmore publicized this exploit, music streaming companies have been tight-lipped about the possibility of musical click fraud. Bloggers, however, have noticed the elephant in the room. In the wake of stunts like Vulfpeck pocketing $20,000 by having fans listen to silent songs and Eternify turning streaming fraud into an app, some have entertained the possibility of what would happen if large-scale botnets turned this trickle of fake plays into a torrent.
I decided to prototype a robot with an endless appetite for music to see if Spotify could detect what it was doing.
Here is what I coded into life:
First, a remote server used browser automation to sign up for Spotify accounts with randomly generated names, ages, and email addresses. This gave me a limitless supply of accounts to stream songs, so as not to alert Spotify by having a handful of users with inhuman amounts of activity.
A central command server periodically sent out Spotify login credentials to cloud servers (or repurposed personal computers) running dozens of Spotify clients, all masked behind virtual private networks. Each "user" logged in, listened to a few hours of music, then logged out. Their playlists were random selections from various artists I like. Then, I deployed the botnet using a patchwork of free cloud instances and my own hardware.
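The account factory can be sketched in a few lines of Python. The actual browser-automation step is stubbed out here, and every name and email domain below is an invented placeholder rather than anything I actually used:

```python
import random
import string

# Placeholder name pool; a real run would draw from much larger lists
FIRST_NAMES = ["alex", "jordan", "sam", "casey", "riley"]

def random_account(rng):
    """Fabricate a plausible-looking signup: username, age, email."""
    name = rng.choice(FIRST_NAMES) + rng.choice(FIRST_NAMES)
    suffix = "".join(rng.choices(string.digits, k=4))
    return {
        "username": name + suffix,
        "age": rng.randint(18, 65),
        # Placeholder domain; real accounts need reachable inboxes
        "email": f"{name}{suffix}@example.com",
    }

def mint_accounts(n, seed=0):
    """Generate n throwaway account records for the command server to hand out."""
    rng = random.Random(seed)
    return [random_account(rng) for _ in range(n)]
```

The command server's job is then just bookkeeping: dole out one set of credentials per client, and retire accounts that accumulate suspicious amounts of activity.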
It was mesmerizing to watch the plays rack up. Unknown albums from minor celebrities I adore suddenly had tens of thousands of hits, where before they had virtually none. With minimal effort, I was generating $32.26 per day in royalties. Inevitably, my thoughts wandered to greed: how profitable would this music royalty factory be if I turned it on music I owned the rights to?
Data from my relatively small-scale operation suggested I could run 50 Spotify clients on a memory-optimized 15 GB cloud server from Amazon Web Services and fake listens at a cost of 0.003 to 0.012 cents per song. (The exact cost depends on how frequently the robotic listeners hit the "skip" button.) A royalty report I recently received from a musician colleague suggested that artists' take for ad-supported listeners was 0.08 cents per song (this number varies over time and between publishers), putting a conservative estimate for the rate of return of automated streaming at over 600 percent, assuming that one receives all the royalties for the music streamed.
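Those figures imply a wide arbitrage margin. A quick back-of-envelope check, using the per-stream estimates above (all in cents):

```python
# Cost to fake one stream, in cents, depending on how often the bots skip
cost_low, cost_high = 0.003, 0.012

# Approximate ad-supported royalty per stream, in cents
royalty = 0.08

# Rate of return if you own all the rights to the music being streamed
worst_case = royalty / cost_high  # roughly 6.7x, i.e. over 600 percent
best_case = royalty / cost_low    # roughly 26.7x
print(f"return: {worst_case:.1f}x to {best_case:.1f}x")
```

Even the worst case clears the "over 600 percent" figure quoted above.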
That kind of "magic internet money" puts Bitcoin mining to shame—and I don't need to explain the nonfinancial reasons why a musician might want a slice of the 18,000 to 144,000 (again, depending on song skipping) hits a single 15 GB cloud server could generate every day.
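The 18,000-to-144,000 range follows directly from how long each robotic "listener" stays on a song. Assuming 50 clients running around the clock, with a play counting once the 30-second minimum elapses:

```python
CLIENTS = 50
SECONDS_PER_DAY = 86_400

def plays_per_day(seconds_per_play):
    """Total daily plays across all clients at a given seconds-per-play rate."""
    return CLIENTS * SECONDS_PER_DAY // seconds_per_play

# Skip at the 30-second minimum: 144,000 plays per day
aggressive = plays_per_day(30)
# Let a roughly four-minute song play out: 18,000 plays per day
patient = plays_per_day(240)
print(aggressive, patient)
```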
Automated streaming is a lucrative heist involving robots emulating humans, but I did not encounter many Turing tests during my dry run. There wasn't even a CAPTCHA or email verification when creating accounts. The barriers to entry are clearly minimal.
A Spotify representative assured me that the company employs both computerized algorithms and human review to identify albums with questionable streaming activity, but declined to tell me how many albums have been removed for suspected fraud.
We do have one data point: Fillmore's album was taken down about six months after he began streaming songs once every thirty seconds (the minimum duration to accrue a royalty payment) from high-paying premium Spotify accounts. He suspects it was because of user complaints about the quality of his music.
One can imagine, however, that if streaming robots can approximate human listener behavior well enough, a sophisticated botnet operation could plausibly fool Spotify's spam algorithms.
As much as I love the idea of having an army of robots working feverishly to bring me riches, my conscience prevents me from doing it. To understand why, one needs to realize where the money comes from.
Here is Spotify's basic business model: The service takes the total revenue from ad sales, which totaled $117 million in 2014, and pockets 30 percent. The remaining 70 percent of the ad money is shared between rights holders, based on the number of plays they receive. For instance, if I held rights to 10 percent of the total free Spotify streams during 2014, I would end up with 7 percent of that $117 million pot (minus publishing fees).
By adding meaningless plays to the denominator of that sharing formula, automated streaming lowers the per-stream royalty rate for all other rights holders.
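The pro-rata split, and the dilution effect of fake plays, is easy to model. The stream counts below are hypothetical; only the $117 million ad pool and the 70 percent share come from the figures above:

```python
AD_REVENUE = 117_000_000            # 2014 ad revenue, in dollars
ARTIST_POOL = 0.70 * AD_REVENUE     # 70 percent is shared with rights holders

def payout(my_streams, total_streams):
    """A rights holder's share under a simple pro-rata split, in dollars."""
    return ARTIST_POOL * my_streams / total_streams

# Hypothetical: 1 million of my streams out of 1 billion total
honest = payout(1_000_000, 1_000_000_000)

# A botnet injects 100 million fake plays; my count is unchanged,
# but the denominator grows, so my payout shrinks
diluted = payout(1_000_000, 1_100_000_000)
print(honest, diluted)
```

The honest rights holder loses money without a single one of their own numbers changing; only the denominator moved.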
Does that mean a bot wrangler would be sticking it to the record business fat cats?
Major music labels are insulated from streaming fraud because they negotiate a much more complex compensation package with Spotify than the simple formula outlined earlier. Sony negotiated terms including multi-million dollar advances from Spotify, and "usage-based minimums" which guarantee fixed per-stream royalty rates even if bots drag the shared-model rates to new lows.
Instead, independent musicians and small labels that self-publish would likely bear the brunt of the damage from automated streaming because their royalty rates are the most flexible. Advertisers also suffer because they are paying for ad time that is falling on robot ears.
We can predict how small royalties may become by thinking of the situation as arbitrage. The royalty payout for playing a song currently exceeds the cost of required server time. If automated streaming continues unabated, independent artists who rely on ad-supported listeners will see their royalties shrink, possibly to the vanishing cost of server time (0.003 to 0.012 cents per stream).
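A toy model makes the arbitrage argument concrete. All the specifics here are assumptions for illustration: the 0.08-cent starting rate and 0.012-cent cost floor come from the estimates above, while the fixed ad pool and the 10-percent-per-wave growth in bot traffic are invented:

```python
POOL_CENTS = 8_190_000_000     # 70 percent of $117M, in cents (fixed ad pool)
COST_FLOOR = 0.012             # upper-bound cost to fake one stream, in cents

# Back out total streams from the observed 0.08-cent per-stream rate
streams = POOL_CENTS / 0.08
rate = POOL_CENTS / streams

# Bots pile in as long as the payout meaningfully exceeds the cost;
# each wave adds 10 percent more plays to the denominator
while rate > COST_FLOOR * 1.01:
    streams *= 1.1
    rate = POOL_CENTS / streams

print(f"equilibrium rate: {rate:.4f} cents per stream")
```

The per-stream rate converges on the cost of server time, at which point honest rights holders are earning spam-economy wages.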
At that point, automated streaming from the cloud will become unprofitable—unless spammers decide to infect swaths of computers with malware that quietly streams fake Spotify listens without the user noticing. This kind of malware-driven botnet is a cheap way to mimic a lot of listener activity, and could force the value of a Spotify listen down even further if deployed on a large scale. Real hackers might switch to using stolen premium accounts for even juicier payouts, and the same race to the bottom would occur at the premium tier.
If they want to save the profitability of streaming, both independent artists and advertisers should call on music streaming services to combat streaming fraud however possible. Spotify and other services could accomplish this by taking listener authenticity seriously, and perhaps by splitting revenues more fairly.
I have focused on Spotify out of familiarity, but the effectiveness of botnets in taking a cut from shared revenue pools is nearly universal. Traditional click farms make web pages look like they drive more traffic than they really do, and video streaming services already have the unpleasant task of wiping billions of suspect views from their ledgers.
But perhaps there is hope for music: I polled my musician friends on whether they would collaborate with me if I hypothetically attempted to unleash this monster into the streaming world for profit. They didn't seem too interested in such diabolical plots. With any luck, they are a representative sample.