The glasses carry motion sensors that detect head movement, and they connect via Bluetooth to your smartphone to fetch Global Positioning System data and determine which way you are looking and moving. The glasses can then play sound that appears to come from a specific direction.
Now they will be enhanced with Mach1’s unified audio framework, which gives content creators and developers an easy way to create spatial audio — with a future-proofed pipeline that is platform, format, and codec agnostic.
Bose AR is an audio-first approach to augmented reality, and the Bose AR-enabled products have motion sensors embedded inside to detect a user’s head orientation and body movement. Bose AR-enhanced apps can use that information, along with location data from the user’s mobile device, to provide tailored audio content.
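The tailoring described above boils down to combining two signals: the head yaw reported by the glasses and the location fix from the phone. A minimal sketch of that idea in Python follows; this is an illustration only, not Bose AR or Mach1 code, and the function names and constant-power panning law are assumptions. It computes the bearing to a point of interest relative to where the head is facing, then derives stereo gains so the sound sits in that direction:

```python
import math

def relative_bearing(user_lat, user_lon, target_lat, target_lon, head_yaw_deg):
    """Bearing from the user to a target, relative to where the head faces.

    Uses a simple equirectangular approximation, adequate over short
    distances. 0 degrees = straight ahead; negative = to the left.
    """
    d_lon = math.radians(target_lon - user_lon)
    d_lat = math.radians(target_lat - user_lat)
    # Scale longitude by cos(latitude) so east-west and north-south
    # displacements are in comparable units.
    x = d_lon * math.cos(math.radians(user_lat))
    bearing = math.degrees(math.atan2(x, d_lat))  # 0 = north, clockwise
    return (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0  # wrap to -180..180

def constant_power_pan(relative_deg):
    """Map a relative bearing to (left, right) gains via constant-power panning."""
    # Clamp to the frontal arc, then map -90..90 degrees onto 0..pi/2.
    angle = max(-90.0, min(90.0, relative_deg))
    theta = (angle + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)
```

A target due east of a user whose head is turned east yields a relative bearing of zero, so both gains are equal and the sound sits centered; as the head turns away, the pan shifts accordingly.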
Mach1 Spatial enables spatial and positional audio rendering for content creation (encoding) and playback (decoding). The Mach1 Spatial SDK reads orientation data directly from the sensors in Bose AR-enabled products to bring a range of spatial audio and multichannel handling features to any custom application.
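To make orientation-driven decoding concrete, here is a minimal sketch in plain Python. It is not the Mach1 Spatial API: the four-channel horizontal layout, the fold-to-front handling of rear channels, and the panning law are all simplifying assumptions. The core idea it illustrates is that a head rotation counter-rotates the mix, so each channel's stereo gains are recomputed per frame from the listener's yaw:

```python
import math

# Hypothetical 4-channel horizontal bed: channel azimuths in degrees
# (0 = front, clockwise). Not the actual Mach1 channel layout.
QUAD_AZIMUTHS = [45.0, 135.0, 225.0, 315.0]

def decode_gains(listener_yaw_deg):
    """Per-channel (left, right) gains for a given head yaw."""
    gains = []
    for az in QUAD_AZIMUTHS:
        # Channel position relative to the head, wrapped to -180..180.
        rel = (az - listener_yaw_deg + 180.0) % 360.0 - 180.0
        # Crude fold of rear sources onto the same front-side angle
        # (sketch only; a real decoder handles the rear field properly).
        if abs(rel) > 90.0:
            rel = math.copysign(180.0 - abs(rel), rel)
        theta = (rel + 90.0) / 180.0 * (math.pi / 2.0)
        gains.append((math.cos(theta), math.sin(theta)))
    return gains

def decode_frame(frame, gains):
    """Fold one multichannel sample frame down to a stereo pair."""
    left = sum(gl * s for (gl, _), s in zip(gains, frame))
    right = sum(gr for (_, g), s in zip(gains, frame) for gr in (g * s,))
    return left, right
```

With yaw 0, the front-right channel (45 degrees) leans right; turn the head 45 degrees toward it and it centers, which is exactly the stabilizing behavior head tracking is meant to produce.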
The Mach1 Transcode API from the Mach1 Spatial SDK also supports all major multichannel surround and spatial formats, making it easy to support any content.
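Transcoding between channel-based formats generally reduces to applying a channel matrix. The sketch below is not the Mach1 Transcode API; it is a generic Python illustration using an ITU-style 5.1-to-stereo downmix, where the center and surround channels are folded in at -3 dB and the LFE channel is discarded:

```python
import math

A = math.sqrt(0.5)  # -3 dB, the conventional downmix coefficient

# Rows: output channels (L, R); columns: input channels
# (FL, FR, C, LFE, SL, SR). LFE is dropped in this simple matrix.
DOWNMIX_5_1_TO_STEREO = [
    [1.0, 0.0, A, 0.0, A, 0.0],
    [0.0, 1.0, A, 0.0, 0.0, A],
]

def transcode_frame(frame, matrix):
    """Apply a channel matrix to one multichannel frame (list of samples)."""
    return [sum(coef * s for coef, s in zip(row, frame)) for row in matrix]
```

A front-left-only frame passes straight through to the left output, while a center-only frame lands in both outputs at -3 dB, which keeps its perceived level roughly constant.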
Founded by Dražen and Jacqueline Bošnjak, Mach1 has provided spatial sound production for some of the industry's highest-profile virtual reality projects, including The Martian VR, Alien VR, Chained, and Dear Angelica.