As we’ve written about before, hearables have skyrocketed in use since the arrival of Apple AirPods. New smart hearables, including Amazon Echo Buds, Samsung Galaxy Buds, Google Pixel Buds, and even Microsoft Surface Earbuds, offer hands-free access to each company’s respective assistant. From 2018 to 2020, voice assistant use through hearables increased 103%, from 21.5 million to 43.7 million adults.
As the health and wellness industry continues to enjoy rapid growth, smart wearables tailored to this sector have become the norm for managing healthy habits. Apple Watch, Fitbit, and Samsung Galaxy Watch embed tracking capabilities including heart rate, sleep, and fitness in their watches to provide a comprehensive picture of your health. Amazon’s new Halo band offers even more advanced health features, such as analyzing tone of voice, tracking sleep and skin temperature, and even estimating body composition.
Smart eyeglasses are also emerging as innovative channels for voice enablement, though they’ve yet to go mainstream. Amazon released Echo Frames, a pair of smart glasses, and Echo Loop, a smart ring, as portable devices with access to Alexa. Google Glass, Google’s AR headset, is available for enterprise use, and in July, Google acquired North, an AR glasses company, in a possible attempt to revive Google Glass for consumer use.
First, if we have access to voice-enabled technology at home, we’ll expect our on-the-go devices to come equipped with the same capabilities. Second, device owners will expect that, with all the data these smart devices collect and share, the devices will become more intelligent and deliver better, more personalized experiences.
We expect to see these trends gain traction in the market:
In the future, we see wearable technology tapping into AR and location data for wayfinding and directional experiences. With these kinds of integrations, wearables will be able to help you navigate unfamiliar environments or send location-specific information straight to your devices. Imagine trying to find a specific product in a shopping mall with a sprawling layout: a voice-enabled hearable could provide directions while your smart glasses show directional overlays as you walk through the mall. This kind of experience is within reach.
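To make that concrete, here is a minimal sketch of the logic behind a location-triggered wayfinding prompt, assuming a hypothetical indoor map expressed as named waypoints. The waypoint names, coordinates, and distance thresholds are illustrative assumptions, not any vendor’s actual API.

```python
import math

# Hypothetical indoor map: named waypoints with (x, y) positions in meters.
MALL_WAYPOINTS = {
    "entrance": (0.0, 0.0),
    "food court": (40.0, 10.0),
    "electronics store": (75.0, -20.0),
}

def distance(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def wayfinding_prompt(current_position, destination):
    """Compose a spoken cue a hearable might deliver; smart glasses could
    render the same heading as an AR overlay."""
    target = MALL_WAYPOINTS[destination]
    remaining = distance(current_position, target)
    if remaining < 5:
        return f"You've arrived at the {destination}."
    heading = math.degrees(math.atan2(target[1] - current_position[1],
                                      target[0] - current_position[0]))
    return (f"The {destination} is about {remaining:.0f} meters away, "
            f"bearing {heading:.0f} degrees from here.")

# Example: a shopper partway through the mall asking for the electronics store.
print(wayfinding_prompt((30.0, 5.0), "electronics store"))
```

In a real deployment, the positions would come from indoor positioning (Bluetooth beacons, Wi-Fi, or visual anchors) rather than hard-coded coordinates, but the prompt-composition step would look much like this.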
As wearables, especially smartwatches, monitor our habits and connect to our phones and mobile apps, we believe these devices will deliver personalized information tailored not only to your interests and activities but also to your body itself. The health data and biometric triggers that some assistant-integrated devices now track could be used to provide recommendations based on different signals and key health metrics. For example, if you were at the gym or going for a run, your assistant could tell you when to cool down, suggest switching to a different exercise or speed, and recommend recipes or meals to help you refuel when you get home. The possibilities here are expansive.
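As a rough illustration, the snippet below sketches how an assistant might turn heart-rate readings into spoken coaching cues. The thresholds, intensity zones, and function names are assumptions made for the example, not guidance from any device maker.

```python
# A minimal sketch of biometric-triggered coaching cues, assuming the wearable
# exposes heart rate and elapsed workout time. Thresholds are illustrative.
def workout_cue(heart_rate_bpm, minutes_elapsed, max_heart_rate=190):
    """Return a short prompt a voice assistant could speak mid-workout."""
    intensity = heart_rate_bpm / max_heart_rate
    if intensity > 0.90:
        return "Your heart rate is very high. Slow your pace and cool down."
    if intensity > 0.75 and minutes_elapsed >= 30:
        return "Nice sustained effort. Consider switching to a lighter exercise."
    if intensity < 0.50 and minutes_elapsed < 10:
        return "You're warmed up. Try picking up the speed."
    return "Keep going. You're in a good training zone."

def refuel_suggestion(calories_burned):
    """Rough post-workout recommendation keyed to estimated calories burned."""
    if calories_burned > 500:
        return "You burned a lot today. How about a high-protein recipe for dinner?"
    return "Light session today. A snack and some water should do it."

# Example: 20 minutes into a run at 175 bpm, 620 calories burned overall.
print(workout_cue(175, 20))
print(refuel_suggestion(620))
```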
As we grant more and more access to devices and assistants that keep tabs on our personal data, tech companies need to develop clear standards for how they manage that information and shift control to users to maintain transparency. Much as Apple and others have made it possible to limit what data is collected, or when it is allowed to be collected, technology and assistant providers will need to offer similar controls. We’ve already seen Amazon and Google take steps to give users control over their voice histories, and these settings will need to expand as assistants gain access to more intimate data.
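One way to picture those controls is a per-signal consent layer that sits between the assistant and the device’s sensors. The sketch below is a simplified illustration under that assumption; the signal names, defaults, and settings structure are hypothetical and do not reflect how any particular provider implements its privacy settings.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Users opt in per signal; nothing is collected by default.
    allowed_signals: set = field(default_factory=set)
    retain_history_days: int = 0  # 0 means voice history is not retained

    def allow(self, signal):
        self.allowed_signals.add(signal)

    def revoke(self, signal):
        self.allowed_signals.discard(signal)

def collect(signal, read_sensor, settings):
    """Only read a sensor if the user has explicitly allowed that signal."""
    if signal not in settings.allowed_signals:
        return None  # collection blocked; a privacy dashboard could surface this
    return read_sensor()

# Example: the user allows heart rate but not voice-tone analysis.
settings = PrivacySettings()
settings.allow("heart_rate")
print(collect("heart_rate", lambda: 72, settings))      # 72
print(collect("voice_tone", lambda: "calm", settings))  # None
```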