Since their release in late 2016, Apple's AirPods wireless headphones have become a bona fide cultural phenomenon, and a dominant product in their industry to boot. In 2020, AirPods made up 29 percent of the wireless headphones market, more than twice the share of their nearest competitor.
It’s hard to overstate how valuable the AirPods brand is to Apple. Analysts estimate that the company sold roughly 60 million AirPods in 2019. At an official retail price of $159 per pair (not including the more expensive wireless charging case, or the high-end AirPods Pro), that puts Apple’s annual AirPods revenue at over $9.5 billion, larger than the GDP of dozens of countries, as well as the revenue of established brands like Kellogg’s and the entire CBD industry.
That’s why Backtracks is excited to announce a new innovation for users of Apple’s AirPods Pro: detecting a user’s vertical and horizontal head nods, without any need for a camera or voice or touch input. Backtracks’ new head motion detection is faster than voice assistants such as Siri, and allows users to easily interact with audio, media, and apps just by nodding or shaking their heads.
This development opens up a variety of exciting possibilities and use cases for creators, developers, publishers, advertisers, and users themselves—both within and beyond the media and entertainment industries.
AirPods as a Platform: What Comes Next?
When everyone has a smartphone, what’s in line to become the next hot technology? Much has been written about the so-called “slowdown” in the global smartphone industry due to market saturation, with sales leveling off or even declining. Essentially, the vast majority of people who want smartphones have already purchased them: the Pew Research Center found that 81 percent of Americans own a smartphone (and 96 percent own a mobile phone of some kind).
What’s more, despite the ubiquity of smartphones, there are certain capabilities they can’t offer due to the hardware limitations of the device. In the case of audio, smartphones are limited to their own internal speakers, or external ones they can connect to via Bluetooth, and neither can offer a truly immersive, interactive audio experience.
The smartphone plateau is at odds with the current boom in music streaming, podcasts, audiobooks, and other audio content. Deloitte, for one, estimated that the audiobook industry would grow by 25 percent in 2020, while podcasts would grow by 30 percent, becoming a billion-dollar industry for the first time. Jennifer Kavanagh, senior vice president for marketing & media with the Philadelphia Eagles, has predicted: “Audio as a medium will grow, and as it grows, it will give rise to earfluence… This is a concept where we see more people spending time listening instead of watching, either because it’s more convenient, or the medium resonates more.”
So given the current boom in audio content, and the limitations of the smartphone, what’s next? Many have observed that smartphone-adjacent technologies such as AirPods are perfectly poised to capitalize on these trends—perhaps in conjunction with other incipient platforms such as the Apple Watch. For example, a user’s AirPods could automatically adjust their music playlist based on their heart rate as measured by their Apple Watch, e.g. playing workout music during times of peak activity (or conversely, soothing tunes when they need to calm down).
In particular, industry thought leaders such as Jordan Cooper have written about the potential of AirPods and other wireless headphones to become the next tech “platform,” similar to other emerging technologies such as virtual reality, smart cars, and cryptocurrencies. Independent Apple analyst Neil Cybart agrees, writing:
“In just three years, AirPods have evolved from an iPhone accessory into the early stages of a platform well positioned to reshape the current app paradigm for the wearables era… Apple is turning AirPods into the second platform built for what comes after the App Store. Instead of being about pushed snippets of information and data via a digital voice assistant, something that will likely remain ideal for mobile screens, AirPods will be all about augmenting our environment by pushing intelligent sound.”
Most headsets on the market offer a basic but limited set of actions triggered by tapping a button: play/pause, volume up/down, skip, etc. Julian Lehr has explicitly identified the need for more ways to interact with devices like AirPods:
“Why has no one thought about additional buttons or click mechanisms that allow users to interact with the actual content?… [It] doesn’t have to be a physical button. In fact, gesture-based inputs might be even more convenient. If AirPods had built-in accelerometers, users could interact with audio content by nodding or shaking their heads.”
By adding head motion and gesture detection for AirPods for the first time, Backtracks is enabling listeners, publishers, and advertisers to take full advantage of this new dimension.
How Does AirPods Head Movement Detection Work?
First, Backtracks’ head motion and gesture detection technology captures data from AirPods Pro and Apple device sensors. This data is then layered into audio analytics processed by Backtracks’ audio and podcast analytics web services.
The Backtracks Native SDK also includes functionality for user activity detection. This feature makes use of the sensors in phones and watches to capture analytics and data. Backtracks collects data in a privacy-first manner compliant with GDPR and CCPA, without using the camera or any personally identifiable user data, so that information security and user privacy remain intact.
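As a rough illustration of the kind of signal processing involved, the sketch below classifies a short window of head-orientation samples (pitch for nods, yaw for shakes) into a gesture. The sample format, threshold, and function names here are illustrative assumptions, not the Backtracks SDK’s actual API:

```typescript
// Hypothetical head-orientation sample: pitch (nod axis) and yaw
// (shake axis) in radians, as a motion sensor might report them.
interface OrientationSample {
  pitch: number;
  yaw: number;
}

type Gesture = "nod" | "shake" | "none";

// Classify a short window of samples by comparing how far the head
// swept on each axis against an illustrative threshold (in radians).
function classifyGesture(
  samples: OrientationSample[],
  threshold = 0.25
): Gesture {
  const range = (values: number[]) =>
    Math.max(...values) - Math.min(...values);
  const pitchSweep = range(samples.map((s) => s.pitch));
  const yawSweep = range(samples.map((s) => s.yaw));
  if (pitchSweep < threshold && yawSweep < threshold) return "none";
  // Whichever axis moved more decides the gesture.
  return pitchSweep >= yawSweep ? "nod" : "shake";
}
```

A production classifier would also weigh timing and repetition (a nod is two or three quick reversals, not a single tilt), but the axis comparison above captures the core idea.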
All of these capabilities can be combined with the existing capabilities of the Backtracks Native SDK, including:
- Detecting plays, pauses, listening time, content completion, engagement rates, etc.
- Capturing audience data (e.g. geographic location)
- Smart speed: reducing audio silences for faster listening
- Voice boost: increasing the loudness of human voices in audio
Backtracks’ Native SDK is available for Apple, Android, and React Native:
- The Backtracks Apple SDK works across Apple’s operating systems (iOS, macOS, iPadOS, watchOS, tvOS) and devices, including iPhone, Mac, Apple TV, Apple Watch, etc.
- The Backtracks Android SDK’s activity detection works on Android and Wear OS 2.0 operating systems and supporting devices.
- The Backtracks Native SDKs’ audio analytics functionality works across the entire Apple and Android platforms, including laptops, phones, tablets, watches, and smart TVs.
Use Cases for AirPods Head Motion and Gesture Detection
Backtracks’ new head motion detection capabilities for AirPods Pro have unleashed a wide range of possibilities—below are just four ideas.
1. Interactive Audio Content
Perhaps the most exciting use case for this new technology is producing interactive audio content.
For example, users could listen to a “choose your own adventure” story that pauses at certain intervals, allowing them to make a binary decision by nodding or shaking their head. Once they’ve made a particular choice, the story branches off in a different direction, offering a fresh new storytelling experience each time.
2. New Ad Formats
Interactive audio content is obviously exciting for listeners themselves, but also holds great promise for marketers looking for new ways to connect with their leads and customers. The global music streaming market is predicted to reach $76.9 billion by 2027 with an annual growth rate of 18 percent, which makes it a highly intriguing area of investment for advertisers.
Audio streaming platforms such as Spotify have already experimented with interactive voice ads that ask listeners to say a short phrase such as “yes” or “no,” varying the ad’s content based on this feedback, e.g. offering a joke or a food hack to listeners who respond positively.
Of course, nonverbal gestures such as nodding and shaking one’s head can work just as well as verbal responses. What’s more, responding nonverbally is often more convenient, especially in situations that are too quiet or too noisy for speech. With Backtracks’ new AirPods head motion capabilities, marketers can unlock new ways for listeners to easily and discreetly engage with their ads and products.
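An interactive ad prompt differs from a story branch mainly in needing a safe default: a listener who doesn’t respond within a short window should simply fall through to the standard spot. A hypothetical sketch (the names, outcomes, and timeout are ours, not Spotify’s or Backtracks’):

```typescript
// Resolve an interactive ad prompt. A nod within the response
// window unlocks the bonus content; a shake, a late response, or
// no gesture at all falls back to the standard ad, so playback
// never blocks waiting on the listener.
type AdOutcome = "bonus_content" | "standard_ad";

function resolveAdPrompt(
  gesture: "nod" | "shake" | null,
  respondedAtMs: number | null,
  windowMs = 4000
): AdOutcome {
  const inTime = respondedAtMs !== null && respondedAtMs <= windowMs;
  return inTime && gesture === "nod" ? "bonus_content" : "standard_ad";
}
```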
3. Spatial 3D Audio Remixing
In addition to enabling interactive audio content, head motion detection lets listeners take full, active control of their audio experience. AirPods Pro come equipped with spatial audio with dynamic head tracking, which makes the sound from a film or video feel as if it is truly coming from the user’s surroundings.
Combined with Backtracks’ head motion detection, spatial 3D audio remixing offers new possibilities for users to engage with audio content. If a user tilts their head to the left, for example, the AirPods can amplify the sound in their left ear.
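Mapping head tilt to per-ear gain can be sketched as a simple pan law: tilting left boosts the left channel and attenuates the right. The angle convention and gain curve below are illustrative assumptions, not how AirPods Pro actually render spatial audio:

```typescript
// Map head roll (radians; positive = tilt toward the left ear, an
// illustrative convention) to left/right channel gains. A linear
// constant-sum pan keeps the overall level roughly steady.
function rollToGains(
  rollRad: number,
  maxTilt = 0.5
): { left: number; right: number } {
  // Normalize tilt to [-1, 1], clamping anything beyond maxTilt.
  const t = Math.max(-1, Math.min(1, rollRad / maxTilt));
  return { left: 0.5 * (1 + t), right: 0.5 * (1 - t) };
}
```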
4. Location- and Motion-Specific Content
Thus far, most of the attention on virtual and augmented reality (VR/AR) has focused on visual applications, i.e. immersion through the use of VR goggles. But what does “audio AR”—enhancing what people hear based on their surroundings—have to offer?
By augmenting head gesture detection with location and motion information, the Backtracks SDK offers nearly limitless possibilities. For one, developers can detect if an AirPods user is walking, running, biking, or sitting in a moving or stopped vehicle. They can also collect this information over time: e.g. if a user has been running for 5 minutes, or if they’re in a car and stuck in traffic.
Publishers and advertisers can then customize the content and ads they offer based on this new spatial information. As just one example, listeners who are stuck in a traffic jam might benefit from hearing a PSA to use public transit.
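Choosing content from motion history can be sketched as a rule over a window of activity samples: for example, detecting “stuck in traffic” as a sustained stretch of slow in-vehicle readings. The activity labels, thresholds, and content tags below are illustrative, not part of the Backtracks SDK:

```typescript
// One activity sample per interval: a coarse label plus speed in m/s.
interface ActivitySample {
  activity: "walking" | "running" | "cycling" | "in_vehicle" | "stationary";
  speedMps: number;
}

// Pick a content tag from recent samples. Rules and thresholds are
// illustrative: sustained slow in-vehicle samples read as traffic,
// and a sustained run earns a workout mix.
function pickContent(history: ActivitySample[]): string {
  const crawlingInVehicle = history.filter(
    (s) => s.activity === "in_vehicle" && s.speedMps < 2
  );
  if (history.length > 0 && crawlingInVehicle.length === history.length) {
    return "transit_psa"; // stuck in traffic: suggest public transit
  }
  if (history.length >= 5 && history.every((s) => s.activity === "running")) {
    return "workout_mix"; // running for 5+ consecutive intervals
  }
  return "default_playlist";
}
```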
These four use cases are just a small sample of what’s now possible thanks to Backtracks’ head movement detection for AirPods Pro. Backtracks believes that AirPods offer an exciting new arena where listeners, creators, publishers, and advertisers can converge, all profiting from the creative possibilities for immersion and interaction. To get started exploring what you can do with this new feature, check out the Backtracks SDK.