Apple’s personalized spatial audio trick is really Sony’s idea

One of the new features in iOS 16, and something highlighted again during Apple’s event on Wednesday, is personalized spatial audio. Once you’ve installed the latest iOS version on your iPhone starting September 12th, you’ll be able to create a custom audio profile that improves the sense of immersion and the overall spatial audio experience you get from AirPods.

To produce this custom tuning, Apple uses your iPhone’s TrueDepth front camera to scan your ears. The process, which involves holding the iPhone about 10 to 20 cm from the side of your head, takes less than a minute, and the resulting data is then used to tailor spatial audio to your unique ear shape. “The way we perceive sound is unique, based on the size and shape of our head and ears,” Apple’s Mary Ann Rao said during the keynote. “Personalized spatial audio delivers the most immersive listening experience by precisely placing sounds in space, tuned just for you.”
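Apple hasn’t published how the scan works under the hood, but the TrueDepth system exposes the same kind of depth data to third-party apps through ARKit’s face tracking. As a rough sketch of the data involved (not Apple’s actual ear-scan pipeline), an app can read per-pixel depth frames from the front camera like this:

```swift
import ARKit

// A minimal sketch of reading TrueDepth depth frames via ARKit.
// This only illustrates the kind of depth data the front camera exposes;
// Apple's actual ear-scan processing is not public.
final class DepthCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is populated on face-tracking sessions,
        // at a lower rate than the color frames.
        guard let depth = frame.capturedDepthData else { return }
        let map = depth.depthDataMap  // CVPixelBuffer of per-pixel distances
        print("Depth map: \(CVPixelBufferGetWidth(map))×\(CVPixelBufferGetHeight(map))")
    }
}

let session = ARSession()
let delegate = DepthCaptureDelegate()
session.delegate = delegate
// Face tracking requires a TrueDepth camera (iPhone X and later).
guard ARFaceTrackingConfiguration.isSupported else { fatalError("No TrueDepth camera") }
session.run(ARFaceTrackingConfiguration())
```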

But Apple isn’t the first company to take this route. Sony has offered “360 Reality Audio” since 2019 on supported music services like Amazon Music, Tidal, Deezer, and Nugs.net. Conceptually, it’s very similar: both Sony and Apple try to determine the structure of your ear and adjust their spatial audio processing to account for its unique folds and contours. The goal is to preserve the 3D audio illusion and eliminate any quirks that would diminish the effect.

Here’s how Sony explained the benefits to me back in June, courtesy of Kaz Makiyama, Vice President of Video and Audio at Sony Electronics:

Humans can identify the location of a sound source from subtle differences in the intensity and arrival time of the sound entering the left and right ears. How we hear also depends on the shape of our head and ears. By photographing both ears and analyzing and reproducing their characteristics, this technology makes it possible to recreate that sound field over headphones.
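To put a number on those timing shifts: a simple spherical-head model (Woodworth’s formula, a textbook approximation rather than anything Sony has said it uses) estimates the interaural time difference for a given source direction:

```swift
import Foundation

// Woodworth's spherical-head approximation of the interaural time
// difference (ITD) — the arrival-time cue described in the quote above.
// The head radius and speed of sound are generic textbook values,
// not parameters from Sony's or Apple's actual processing.
let headRadius = 0.0875    // meters, average adult head (assumed)
let speedOfSound = 343.0   // meters per second, in air at ~20 °C

/// Approximate ITD in seconds for a source at `azimuth` radians
/// (0 = straight ahead, π/2 = directly to one side).
func interauralTimeDifference(azimuth: Double) -> Double {
    (headRadius / speedOfSound) * (azimuth + sin(azimuth))
}

// A source 90° to one side reaches the far ear roughly 0.66 ms late —
// the microsecond-scale shift personalization tries to render correctly.
let itd = interauralTimeDifference(azimuth: .pi / 2)
print(String(format: "ITD at 90°: %.0f microseconds", itd * 1_000_000))
```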

Sony’s approach is a little more awkward than Apple’s, though. Apple’s ear scan is built right into iOS settings, but to build a custom sound field with Sony products, you have to take an actual photo of each ear using the Headphones Connect app and your phone’s camera.

These images are uploaded to Sony’s servers for analysis, and Sony then keeps them for an additional 30 days so they can be used for internal research and feature improvement. The company says the ear images aren’t associated with you personally during this window.

That’s not to say Apple has nailed the ear scan, either. Throughout the iOS 16 beta period, some people on social media and Reddit mentioned that the process can be tedious and sometimes fails to detect an ear. The truth of the matter is probably that there’s no dead simple way of doing this that also gets a good, accurate reading of your ear shape.

There seems to be a consensus that it’s worth the effort: these profiles often make a noticeable difference and can improve your perception of spatial audio. And Apple doesn’t take actual photos: the TrueDepth camera captures a depth map of your head and ears, the same way Face ID reads your facial features.

Apple’s website notes that once you create a personalized spatial audio profile on your iPhone, it will sync across your other Apple devices, including Macs and iPads, to maintain a consistent experience. That won’t happen until at least October, though: you’ll need the upcoming macOS and iPadOS updates for sync to work. Personalized spatial audio is supported on the third-generation AirPods, both generations of AirPods Pro, and AirPods Max.

Apple never claimed to be starting from scratch with personalized spatial audio. Company executives routinely say that their goal is to come up with the best implementation of meaningful features, even if others (in this case, Sony) are already pushing in that direction.