One of the new features in iOS 16, highlighted again during Wednesday's Apple event, is custom spatial audio. Once you've installed the latest version of iOS on your iPhone, available September 12, you'll be able to create a custom sound profile that should improve the sense of immersion and the overall spatial audio experience you get from AirPods.
To produce this custom tuning, Apple uses the iPhone’s TrueDepth front-facing camera to scan your ears. The process, which involves holding the iPhone about 4 to 8 inches from the side of your head, takes less than a minute, and the resulting data is used to optimize spatial audio for the unique shape of your ear. “The way we all perceive sound is unique, based on the size and shape of our heads and ears,” said Apple’s Mary-Ann Rau during the keynote. “Custom Spatial Audio will deliver the most immersive listening experience by precisely placing sounds in space that are tuned just for you.”
But Apple isn't the first company down this path. Sony has offered "personalized 360 Reality Audio" since 2019 for compatible music services such as Amazon Music, Tidal, Deezer, and Nugs.net. Conceptually, the two are very similar: both Sony and Apple try to determine the structure of your ear and adjust their spatial audio processing to account for its unique folds and contours. The goal is to preserve the 3D audio effect and remove any quirks that break the illusion.
Here’s how Sony explained the benefits to me in June, courtesy of Kaz Makiyama, vice president of video and sound at Sony Electronics:
Humans are able to recognize spatial sound sources by the subtle changes in intensity and timing of sound entering the left and right ears from the sound source. Also, the sound can depend on the shape of our head and ear. So, by analyzing and reproducing the characteristics of both ears by taking pictures of the ears, this technology enables the reproduction of the sound field while wearing headphones.
Sony's approach, however, is a bit more awkward than Apple's. Apple's ear scanning is built directly into iOS settings, but to create a custom sound field with Sony products, you have to take an actual photo of each ear using the Headphones Connect app and your phone's camera.
These images are uploaded to Sony’s servers for analysis, and are then retained by Sony for an additional 30 days so they can be used for internal research and feature improvements. The company says that the photos of the ears are not personally associated with you during this window.
That's not to say Apple has completely nailed the ear scanning procedure, either. Throughout the iOS 16 beta period, some people on social media and Reddit have said the process can feel tedious and sometimes fails to detect an ear. The truth of the matter is probably that there's no easy way to make this painless while still getting an accurate reading of the ear's shape.
The consensus, though, seems to be that it's worth the effort: these custom profiles often make a noticeable difference and can improve how convincing spatial audio sounds. And unlike Sony, Apple isn't taking actual photos: the TrueDepth camera captures a depth map of your head and ears, much as Face ID learns your facial features.
Apple's website notes that once you've created a custom spatial audio profile on an iPhone, it will sync to your other Apple devices, including Macs and iPads, for a consistent experience. That won't be true until at least October, though: you'll need the upcoming macOS and iPadOS updates for syncing to work. Custom spatial audio is supported on the third-generation AirPods, both generations of AirPods Pro, and AirPods Max.
Apple has never claimed to be achieving any firsts with custom spatial audio. Company executives have routinely stated that their goal is to achieve the best execution of significant features, even if others, in this case Sony, were already pushing in that direction.