🎵 Apple Music: Closing the Personalization Gap

Smart Autoplay 2.0 – A Hypothetical PM Case Study


💬 A note before we dive in

This is a hypothetical case study, but it started with a real frustration. When I switched from Spotify to Apple Music, the gap was immediate. Spotify had spent years learning my taste. It knew that if I was deep in an Eminem session, the next song should hold that energy. It knew that a skip was a signal, not just a tap. It set the mood and held it.

Apple Music didn't do any of that. Songs would jump genres mid-session. The same tracks kept cycling back. Autoplay felt like shuffle on a very large, very random library.

After a few weeks of manually managing every session, I stopped complaining and started asking: what would it actually take to fix this? This case study is my answer.


📋 At a Glance

Product: Apple Music – Smart Autoplay 2.0
Role: Product Manager
Scope: 0-to-1 feature initiative
Timeline: 6–12 months (proposed)
Team: ML Engineers, iOS Developers, Design, Privacy
Core Problem: Apple Music's personalization engine cannot compete with Spotify's real-time, behavior-driven ML system

The Problem

Millions of users switch from Spotify to Apple Music every year, drawn by the ecosystem, the audio quality, the pricing. A significant chunk of them quietly switch back. Not because of the library. Not because of the price. Because the listening experience feels broken by comparison.

Spotify's autoplay reads the room. Apple Music's plays whatever it wants.

That's not a small gap. Personalization is Spotify's core moat, and Apple Music is losing users because of it.

When I dug into this beyond my own experience, the signal was consistent across every source I checked: