Why Fraud Is The Boring Problem
Michael Smith used AI to create music, then used AI-driven bots to generate the “plays,” and took some of the smartest technology companies, including Spotify and Amazon, who should have known better, for about $8 million. He is going to jail for it. It is easy to dismiss this as a one-off fraud. It is anything but. It is an early warning of how AI will disrupt the systems that power our digital society: how culture gets discovered, how commerce gets directed, and how conversations get shaped.
At present, most of our digital society runs on what are, generically speaking, recommendation algorithms: Spotify’s Discover Weekly, YouTube’s suggestion engine, TikTok’s For You page, Amazon’s product feed, Facebook’s news feed. All of them run on signals of human behavior: stream counts, completion rates, saves, shares, playlist adds, clicks, purchases. These signals represent people making choices. Until recently, the only way to fake them was to hire a back-office army of humans. This is cultural legitimacy laundering: manufacturing the appearance that real people chose you.
Scale can now be had cheap. Almost free. Who is to say that, with AI, there won’t be bots that learn and adapt (agents, in polite company) designed to game the system? What if real artists with real music deploy this army to goose their stream counts? The algorithm then promotes them to real ears.
And what if it is not just real artists, but AI-generated music from models that get better every month? The algorithm promotes that too. Humans like it, save it, share it, and add it to their playlists. They are layering real signals on top of the fake ones. At what point does fraudulently obtained popularity become real popularity? There is no clean line.
Just as Uber was limos for everyone, this is payola for anyone, at $200 a month. Even cheaper, if open-source models have their way. You think musicians won’t do it? Look around. They are already buying bots from gray-market services to inflate their numbers. Smith may be the idiot going to jail, but what we are facing is a structural collapse of the discovery and taste-making apparatus.
In music, royalties are countable, so you can make a case for fraud. In other arenas, it will not be so simple. What gets bought on Amazon, what trends on Facebook, what becomes culturally popular: all of it runs on the same logic as Spotify. Signals of human behavior, gamed by machines. AI is making authenticity optional.
Whether it is Spotify directing culture, Amazon directing commerce, or Facebook directing conversations, I have yet to hear any of their leadership explain how they plan to redesign what they do for the coming onslaught of fake signals.
Smith used crude tools to steal $8 million. His fraud is the boring version of the problem. The interesting version hasn’t been prosecuted yet. It may never be.
Further Reading
- The IFPI’s Global Music Report 2026, published March 18, called streaming fraud “theft, plain and simple.” The price tag is $2 billion a year, according to music data firm Beatdapp, out of a $22 billion streaming economy.
- Apple Music flagged and demonetized 2 billion fraudulent streams in 2025 alone. Two billion is the number they caught. The number they missed is unknown.
- In January 2026, Deezer reported receiving over 60,000 fully AI-generated tracks daily, with 85% of streams on AI-generated music in 2025 classified as fraudulent, a 70% increase from the previous year.
- Businesses now openly sell streaming fraud as a service. WIPO’s breakdown of how streaming farms operate makes the gray-market description look generous.
- In 2024, automated bot traffic surpassed human traffic on the internet for the first time in a decade, with bots now accounting for 51% of all web traffic. Cloudflare blocked 13 trillion bad bot requests in 2025.
- All eight major social platforms tested in 2024 failed to detect advanced AI-created bots, and commercial anti-bot services were evaded more than half the time.