What do they mean with spatial audio, and what is the U1 chip?
Only two things stand out to me in this new iPhone, and both are barely footnotes in the keynote and on the official site.
The new iPhones support “spatial audio” and Dolby Atmos. They’re hardly the first phones to support binaural decoding of Dolby Atmos, but the way spatial audio is depicted in the presentation suggests something more: a loudspeaker array that lets you hear spatialised sound straight from the phone, without wearing headphones, through some kind of beam-forming.
The only mention of spatial audio I can find on the Apple website is on the developer section. The release notes for iOS 13 mention this new feature:
“A new rendering mode in AVAudioEnvironmentNode selects the best spatial audio rendering algorithm automatically based on the output device.”
This is the AVAudioEnvironmentNode documentation page. It looks like a complete 3D sound engine, but at a glance I can’t actually find the list of supported output types, so I don’t know whether it can target a loudspeaker array or just different types of binaural decoding.
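To make the idea concrete, here is a minimal sketch of how that automatic rendering mode might be used, assuming the iOS 13 API as documented (an environment node spatialising a mono source, with the per-source renderingAlgorithm set to the new automatic case); I haven’t verified this on a device:

```swift
import AVFoundation

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// The environment node can only spatialise mono inputs,
// so connect the player with an explicit mono format.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(player, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// The new iOS 13 mode: let the system pick the spatialisation
// algorithm for the current output device, instead of hard-coding
// HRTF for headphones or equal-power panning for speakers.
player.renderingAlgorithm = .auto

// Place the source one metre to the listener's left.
player.position = AVAudio3DPoint(x: -1, y: 0, z: 0)
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
```

If the system really does choose per output device, the interesting question remains which device classes it distinguishes: this sketch shows where the choice would happen, but not what the phone’s own speakers get.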
It should be noted that Apple has real experience building loudspeaker arrays: the ones in the HomePod are quite impressive, and the company holds patents on the subject.
There aren’t a lot of details available yet, but it’s confirmed that the new iPhones include a new chip that allows precise mutual localisation of nearby devices. The Verge already wrote about it here. My first question is whether the new Apple Watch also includes that chip, but it isn’t mentioned anywhere.