
Auto Shazam feature described in patent application, can work with AirPods, Apple Headset

You can already switch on an auto Shazam feature on your iPhone, but Apple is envisaging a more intelligent version, which could work with anything from AirPods to an Apple Headset.

Current Shazam options

Currently, you can either manually ask Shazam to identify a song, or set an auto Shazam function to continually listen and log details of all songs played.

The manual option works on all Apple devices, including the Apple Watch. Auto Shazam only works on iOS devices (though many of you would like to see it on the Apple Watch, and we have requested it). On your iPhone, long-press the Shazam button until you see that auto Shazam is active.

With auto Shazam, your iPhone will store details of all the songs it identifies until you cancel it. Even though this leaves the microphone active, Apple says that there is no privacy risk.

When Auto Shazam is on, Shazam will match what you’re hearing with songs in the Shazam database – even when you switch to another app. Shazam never saves or stores what it hears.

You can also set auto Shazam to activate as soon as you open the app.

To make Shazam start listening automatically when you open the app on your iPhone or iPad, swipe up to My Music from the main Shazam screen, tap the Settings button, then turn on “Shazam on app start.”

Intelligent auto Shazam

The technology described in the patent application goes beyond this. It would be fully automatic, working out which songs you might like to identify from how your body responds to the music.

Patently Apple spotted the application.

The patent suggests that a next-generation version of the app could function on many more devices (headphones, an iPhone, a Mixed Reality HMD, an iPad, smart contact lenses, a heads-up display on a vehicle windshield, etc.).

Device resources may be used efficiently in determining that a user is interested in audio content. This may involve moving through different power states based on different triggers at the device.

For example, audio analysis may be performed selectively, for example, based upon detecting a body movement, e.g., a head bobbing, foot tapping, leap of joy, fist pump, facial reaction, or other movement indicative of user interest.
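The patent describes moving between power states so that expensive audio matching only runs after a cheap trigger suggests interest. As a rough illustration only, not Apple's actual implementation, the idea can be sketched as a small state machine; all state and gesture names here are our own invention, loosely based on the examples quoted above.

```python
from enum import Enum

class PowerState(Enum):
    IDLE = "idle"            # sensors and analysis mostly off
    MOTION_WATCH = "motion"  # low-power motion sensing only
    LISTENING = "listening"  # full audio analysis and song matching

# Hypothetical gesture labels based on the patent's examples.
INTEREST_GESTURES = {"head_bob", "foot_tap", "leap_of_joy", "fist_pump"}

def next_state(state: PowerState, event: str) -> PowerState:
    """Advance through power states based on triggers, so the
    costly matching stage only activates once a cheap motion
    trigger indicates the user may be interested in the song."""
    if event == "music_stopped":
        return PowerState.IDLE
    if state is PowerState.IDLE and event == "music_detected":
        return PowerState.MOTION_WATCH
    if state is PowerState.MOTION_WATCH and event in INTEREST_GESTURES:
        return PowerState.LISTENING
    return state
```

For example, a `music_detected` event followed by a `foot_tap` would step the device from `IDLE` through `MOTION_WATCH` to `LISTENING`, at which point song matching would actually run.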

I’d love to see the algorithm for “Leap of joy” detection …

Apple acquired Shazam back in 2018.

Photo: Omid Armin/Unsplash




Author

Ben Lovejoy

Ben Lovejoy is a British technology writer and EU Editor for 9to5Mac. He’s known for his op-eds and diary pieces, exploring his experience of Apple products over time, for a more rounded review. He also writes fiction, with two technothriller novels, a couple of SF shorts and a rom-com!

