Voice-controlled workouts with Google Assistant


Posted by John Richardson, Partner Engineer

With tens of millions of installs of the adidas Running app, users turn to adidas every day as part of their health and fitness routine. Like many in the industry, adidas recognized that in this ever-evolving market, it's crucial to make it as easy as possible for users to achieve their fitness goals, and bringing the app to Wear was a natural fit. adidas didn't stop at bringing its running app to the watch, however; the team also realized that in a scenario such as a workout, the ability to engage with the application hands-free, or even eyes-free, further simplifies how users interact with the app.

Integrating Google Assistant

To enable hands-free control, adidas turned to Google Assistant and App Actions, which lets users control apps with their voice using built-in intents (BIIs). With BIIs, users can perform specific tasks by voice, such as starting a run or a swim.

Integrating Health and Fitness BIIs was a simple addition that adidas' staff Android developer made in their IDE by declaring <capability> tags in their shortcuts.xml file, creating a consistent experience between the mobile app and the watch surface. The process looks like this:

  1. First, Assistant parses the user's request using natural language understanding and identifies the appropriate BII, for example START_EXERCISE to begin a workout.
  2. Second, Assistant fulfills the user's intent by launching the application to the specified content or action. In addition to START_EXERCISE, users can also stop (STOP_EXERCISE), pause (PAUSE_EXERCISE), or resume (RESUME_EXERCISE) their workouts. Haptic feedback or chimes can be added here to indicate whether a user's request was successful.
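The declaration in the first step above can be sketched in shortcuts.xml. This is an illustrative example only: the package name, activity class, and parameter key below are placeholders invented for this sketch, not adidas' actual code.

```xml
<!-- res/xml/shortcuts.xml (illustrative sketch; names are placeholders) -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Declare support for the START_EXERCISE built-in intent -->
  <capability android:name="actions.intent.START_EXERCISE">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="com.example.running"
      android:targetClass="com.example.running.MainActivity">
      <!-- Forward the recognized exercise name (e.g. "run") as an intent extra -->
      <parameter
        android:name="exercise.name"
        android:key="exerciseName" />
    </intent>
  </capability>
</shortcuts>
```

The same pattern repeats for STOP_EXERCISE, PAUSE_EXERCISE, and RESUME_EXERCISE, each as its own <capability> entry.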

Because App Actions are built on Android, the development team was able to deploy quickly. And when paired with the Health Services and Health Connect APIs, which respectively support real-time sensor data and health data, end users get a cohesive and secure experience across mobile and Wear OS devices.
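To make the fulfillment step concrete: when Assistant launches the app, the app itself maps the incoming deep link to the matching workout command. The following is a hypothetical, non-Android sketch of that routing logic; the `adidas://workout/...` URI scheme, paths, and class names are invented for illustration and are not taken from the adidas app.

```java
import java.net.URI;
import java.util.Locale;

/** Hypothetical router mapping an Assistant-opened deep link to a workout command. */
public class WorkoutActionRouter {

    public enum Action { START, STOP, PAUSE, RESUME, UNKNOWN }

    /**
     * Maps a deep link such as "adidas://workout/pause" to a workout action.
     * The scheme and path layout here are assumptions made for this sketch.
     */
    public static Action route(String deepLink) {
        URI uri = URI.create(deepLink);
        String path = uri.getPath() == null ? "" : uri.getPath();
        switch (path.toLowerCase(Locale.ROOT)) {
            case "/start":  return Action.START;
            case "/stop":   return Action.STOP;
            case "/pause":  return Action.PAUSE;
            case "/resume": return Action.RESUME;
            default:        return Action.UNKNOWN;
        }
    }
}
```

In a real app this dispatch would live in the launched Activity's intent handling, with the haptic or audio confirmation mentioned above fired once the action succeeds.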

[Animation: the adidas Running app launching via Google Assistant on a wearable device]

“What’s exciting about Assistant and Wear is that the combination really helps our users reach their fitness goals. The ability for a user to leverage their voice to track their workout makes for a unique and truly accessible experience,” says Robert Hellwagner, Director of Product Innovation for adidas Runtastic. “We are excited by the potential of what can be done by enabling voice-based interactions and experiences for our users through App Actions.”

Learn more

Enabling voice controls to unlock hands-free and eyes-free contexts is a simple way to create a more seamless app experience for your users. To bring natural, conversational interactions to your app, read our documentation today, explore how to build with one of our codelabs, or subscribe to our App Actions YouTube playlist for more information. You can also sign up to develop for Android Health Connect if you are interested in joining our Google Health and Fitness EAP. To jump right into how this integration was built, learn more about integrating Wear OS and App Actions.
