Microsoft Soundscape: Spatial Audio design for ambient awareness of the physical environment

As the Interaction and Sound Designer for Microsoft Soundscape (a team within the Enable Group at Microsoft Research), I drive the end-to-end design of our app’s spatial audio experience.  I approach this role with great compassion and admiration for our users, as Soundscape is a navigation tool that explores the innovative use of 3D audio cues to support safe and confident wayfinding, primarily for the Blind and Low Vision community.  

The Soundscape audio experience relies on 3D audio cues and TTS narration as an audio information display overlaid on the world around you.  Since our users’ physical surroundings are endlessly variable across upwards of 10 countries and counting, the sounds are designed to cut efficiently through the noise floor of a dynamic acoustic environment without raising anxiety or becoming an annoyance.  Oh, and one more thing - we strive to make it all sound and feel beautiful! 

Here are examples of the sounds I’ve made for Soundscape:

An average workflow goes something like this:  I explore a new feature concept alongside our Engineers and team leaders, or perhaps in a collaboration with one of our partner groups, and then I start sketching and iterating on sounds in Ableton with my arsenal of software (and sometimes hardware) synthesizers and effects.  These are often laid into Audio Wireframes that help to flesh out the User Journey or Scenarios under investigation.  These renderings often require spatial mixing and some background binaural or ambisonic field recordings to give context, and so at this point I bring the work into Reaper.  Depending on the project, I will sometimes then turn these into video mockups as well to best illustrate the concepts to our collaborators.  

If Interactive Prototyping is required (I always hope it is!), I shift over to Unity and produce an app that often becomes the integral element of a User Study.  Laying the sounds into a prototype is always incredibly useful, as there is just no way with a wayfinding app to know quite how the sounds and interactions will feel until you experience the realtime sensory feedback loop of using the phone and listening during Play Testing.  Where possible, I’ll build quantifiable outcomes into the testing (always preferred at MSR).  Otherwise, I’ll put together a questionnaire to drive useful feedback about the prototype; impressions of sound are always subjective and unique to each person, and distilling the responses into actionable findings is a critical challenge at this stage.

We’re a small team that embraces the Lean approach of Build - Measure - Learn, in many ways functioning like a startup, and it’s awesome to handle the full audio design life cycle before passing off the new product sounds to Engineering.  

It’s an honor to work on a project like Soundscape where I get to combine my love of sound with making a difference in the lives of our Blind and Low Vision users.  Lately I’m leading our audio experience design recommendations with new partners as we bring the Soundscape experience to collaborations with Bose, Microsoft’s Future of Work initiative, Microsoft Teams and Surface, Bing Maps, and Adaptive Sporting events around the world — it’s an incredibly exciting time for inclusive spatial audio experiences surfacing into the mainstream!

Here’s a Soundscape primer with Microsoft CEO Satya Nadella and our team’s fearless leader Amos Miller.

Bose Frames AR headset, bluetooth headtracking with compass/gyro/accelerometer and quality audio spatialization

Much of our current work is under NDA, but as an example from a couple of years ago, here are two views of a Unity-built iPhone prototype I made for a User Study. Builds were distributed to team members’ phones via App Center, and each participant had a Bose Frames headset. As you can see, the app was designed as a sort of UI menu, which allowed me to quantify their preferences: I asked them to turn up the ambisonic soundscapes, then lower the volume of each of our potential beacon designs with an onscreen slider to the point at which it was still audible and localizable.

This provided us with quantitative data that captured the efficiency of each design by modeling the asset in a real-world interaction alongside the ever-present safety concerns of our product. It was very interesting to see the data stack up in favor of the higher frequency beacon sounds’ quantifiable efficiency, even though qualitative feedback showed an average user preference for a lower pitched beacon asset.
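The analysis behind a study like this can be sketched roughly as follows (a minimal illustration under my own assumptions, not the actual study code; the beacon names and slider values are hypothetical): for each beacon design, collect the slider level at which each participant reported the sound as just audible and localizable over the ambisonic background, then compare average thresholds — a lower average suggests the design cuts through more efficiently.

```python
from statistics import mean, stdev

# Hypothetical slider thresholds (0.0-1.0) at which each participant
# reported a beacon as just audible and localizable over the ambisonic bed.
# A lower value means the beacon remains usable at a lower volume.
thresholds = {
    "low_pitch_beacon":  [0.62, 0.55, 0.68, 0.60, 0.58],
    "high_pitch_beacon": [0.41, 0.38, 0.45, 0.40, 0.36],
}

def summarize(data):
    """Mean and standard deviation of thresholds per beacon design."""
    return {name: (mean(vals), stdev(vals)) for name, vals in data.items()}

def most_efficient(data):
    """The design still audible at the lowest average slider level."""
    return min(data, key=lambda name: mean(data[name]))

if __name__ == "__main__":
    for name, (m, s) in summarize(thresholds).items():
        print(f"{name}: mean={m:.2f}, sd={s:.2f}")
    print("most efficient:", most_efficient(thresholds))
```

With these made-up numbers the higher-pitched beacon wins on efficiency, which mirrors the pattern described above — and is exactly why pairing the quantitative thresholds with qualitative preference feedback matters.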
