Lessons learned from Google Assistant and App Actions on Android

Julien Salvi
Jun 8, 2021 · 8 min read

My company, Aircall, recently organized a hackathon to unleash our tech ideas for one and a half days. With the iOS folks, we decided to experiment with voice interactions through Siri and Google Assistant in order to call a contact or show specific data outside the application’s scope with Android Slices. In this article, I’ll show what we learned about Google Assistant and how we used App Actions to bring voice controls to our app.

Identifying the features

First, we wanted to identify the features we could bring to our application thanks to Google Assistant. Our main app is a VoIP phone for businesses, so the first idea was to call one of your contacts with your voice, like “Ok Google, call John Doe on Aircall”. Google Assistant offers App Actions to achieve such goals on Android.

Then, we thought about bringing relevant data to our users outside the application using voice actions. To do so, we used Android Slices to show what’s on a user’s to-do list in Google Assistant with a command like the following: “Ok Google, show my to-do list on Aircall”.

Let’s see how we implemented these two features in the Aircall app.

Let’s play with App Actions and Slices

First, let’s add the dependencies to the project. We need the Firebase Core and App Indexing libraries to make the (deep)link between Google Assistant and our app. To display interactive cards in Google Assistant, we are going to use the AndroidX Slice library. Check for the latest updates of the Slice library here.

Gradle dependencies
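Here is a sketch of the dependencies we added (the versions are the ones available at the time of writing, so check for newer releases):

```kotlin
// app/build.gradle.kts — a sketch of the dependencies, check for the latest versions
dependencies {
    // Firebase App Indexing: handles the deeplinks coming from Google Assistant
    implementation("com.google.firebase:firebase-appindexing:20.0.0")
    // AndroidX Slices with the Kotlin DSL builders (list, header, row…)
    implementation("androidx.slice:slice-builders-ktx:1.0.0-alpha08")
}
```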

Make a call with App Actions

To interact with our application, we need to define some actions in an actions.xml file located in the xml folder of the resources. Each action is associated with a built-in intent (BII), which models some of the common ways users express tasks. App Actions provides more than 60 BIIs.

Once the file is created, we are going to create the dedicated action to call a number or a contact, thanks to the built-in intent actions.intent.CREATE_CALL. To create a new call, we define two fulfillments, each represented by a URL that will be processed during the deeplink. In our case, each fulfillment has a parameter in order to get the related data (a phone number or a contact name) from the Google Assistant query.
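Our res/xml/actions.xml looked roughly like this (the aircall.io URLs are placeholders for illustration; the parameter names come from the CREATE_CALL BII):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml — sketch, the URLs are placeholders -->
<actions>
    <action intentName="actions.intent.CREATE_CALL">
        <!-- Fulfillment when Assistant extracts a phone number -->
        <fulfillment urlTemplate="https://aircall.io/call{?phoneNumber}">
            <parameter-mapping
                intentParameter="call.participant.telephone"
                urlParameter="phoneNumber" />
        </fulfillment>
        <!-- Fulfillment when Assistant extracts a contact name -->
        <fulfillment urlTemplate="https://aircall.io/call{?contactName}">
            <parameter-mapping
                intentParameter="call.participant.name"
                urlParameter="contactName" />
        </fulfillment>
    </action>
</actions>
```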

Once the action is defined, we need to reference this file in the Manifest of the application and tell which Activity will be the entry point to process the deeplink, with the following snippet:
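Something like this, inside the application element of the Manifest (the host and the Activity name are placeholders):

```xml
<!-- AndroidManifest.xml — sketch, declared inside <application> -->

<!-- Point Assistant to the actions.xml file -->
<meta-data
    android:name="com.google.android.actions"
    android:resource="@xml/actions" />

<!-- Activity handling the deeplinks triggered by App Actions -->
<activity android:name=".deeplink.DeeplinkActivity">
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="https" android:host="aircall.io" />
    </intent-filter>
</activity>
```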

Now we are all set to handle the deeplink. In the dedicated Activity, we are going to get the data that comes from Google Assistant in the Intent. First, we need to filter the intent by its action, and then we need to extract the path of the Uri in order to know which action is going to be triggered.

Once you know which action to handle, you can retrieve the data from the query for a given parameter with the method getQueryParameter(). These parameters have been defined previously in your actions.xml file.
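Putting it together, the deeplink handling looks roughly like this (DeeplinkActivity, the /call path and the parameter names match the hypothetical actions.xml above):

```kotlin
import android.content.Intent
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Sketch of the deeplink entry point declared in the Manifest
class DeeplinkActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        handleAssistantDeeplink(intent)
    }

    private fun handleAssistantDeeplink(intent: Intent) {
        // Only process VIEW intents coming from the deeplink
        if (intent.action != Intent.ACTION_VIEW) return
        val data = intent.data ?: return

        // The path tells us which action was triggered
        when (data.path) {
            "/call" -> {
                // Parameters defined in actions.xml
                val phoneNumber = data.getQueryParameter("phoneNumber")
                val contactName = data.getQueryParameter("contactName")
                // Start the call with the number, or search the contact first
            }
        }
    }
}
```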

And we are done! From this point, we have all the data we need to start a call or search for a contact and then start the call. It’s up to you how you process the data.

Now let’s see how we can show relevant information outside your application in Google Assistant with Android Slices.

Show to-do items with Android Slices

Thanks to Android Slices, you can bring interactive cards to Google Assistant to show some of your app data that your users might find useful. When we saw this feature, we thought it would be a good idea to show the user’s to-do list with the query: “Ok Google, show my to-do list on Aircall”.

First, let’s add a new action to the actions.xml file. The action targets the existing built-in intent actions.intent.GET_ITEM_LIST, which is dedicated to searching and viewing personal lists and public collections. Then, we define the fulfillment for the action with a given URL and a mode (actions.fulfillment.SLICE) to display the slice in Assistant. We use the name and category parameters to match the term “to-do list” in the voice query. Finally, we add a fallback fulfillment that will deeplink to the to-do screen in the app in case the slice cannot be displayed.
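The new action looks roughly like this (the content:// authority and the URLs are placeholders; the parameter mapping follows the GET_ITEM_LIST BII):

```xml
<!-- res/xml/actions.xml — sketch of the to-do list action -->
<action intentName="actions.intent.GET_ITEM_LIST">
    <!-- Slice fulfillment: Assistant renders the Slice exposed by our SliceProvider -->
    <fulfillment
        urlTemplate="content://com.aircall.app.slice/todo{?name,category}"
        fulfillmentMode="actions.fulfillment.SLICE">
        <parameter-mapping intentParameter="itemList.name" urlParameter="name" />
        <parameter-mapping
            intentParameter="itemList.itemListElement.name"
            urlParameter="category" />
    </fulfillment>
    <!-- Fallback: deeplink to the to-do screen if the Slice cannot be displayed -->
    <fulfillment urlTemplate="https://aircall.io/todo" />
</action>
```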

Now, let’s see how the Slices are handled in the code. Android Studio provides a wizard to create a SliceProvider. In the package you want, right click, then New -> Other -> Slice Provider to open the dialog that creates your new provider. You must provide a class name, URI authorities, host URL and the path prefix.

Before digging into the SliceProvider, let’s grant the Assistant permission in the Application class. We need to build the provider Uri with a given authority, the same one we used to build the SliceProvider, and pass it to the SliceManager. By doing this, Assistant will be able to access the slice we want to display.
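Here is what that looks like in the Application class (our authority is a placeholder; the Assistant package name is the one used in the official samples):

```kotlin
import android.app.Application
import android.content.ContentResolver
import android.net.Uri
import androidx.slice.SliceManager

class AircallApplication : Application() {

    override fun onCreate() {
        super.onCreate()
        grantAssistantPermission()
    }

    private fun grantAssistantPermission() {
        // Same authority as the one declared for the SliceProvider
        val sliceUri = Uri.Builder()
            .scheme(ContentResolver.SCHEME_CONTENT)
            .authority("com.aircall.app.slice")
            .build()

        // Allow the Google Assistant package to bind our Slices
        SliceManager.getInstance(this)
            .grantSlicePermission("com.google.android.googlequicksearchbox", sliceUri)
    }
}
```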

In the SliceProvider, we must override two methods:

  • onCreateSliceProvider(), which returns true to indicate that the provider loaded successfully. You should not perform heavy operations in this method or the application startup time will increase.
  • onBindSlice(), which creates the slice with the data you want to display. No heavy or network operations are allowed, and the Slice should be returned as quickly as possible to keep the UI responsive. You can perform IO operations outside of this method, with a LiveData or something else, and once the data is loaded you can refresh the Slice by calling notifyChange(sliceUri, null) on the contentResolver. (See the skeleton below.)
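A minimal skeleton of our provider (the /todo path is hypothetical; createTodoSlice() is shown in the next snippet):

```kotlin
import android.net.Uri
import androidx.slice.Slice
import androidx.slice.SliceProvider

class TodoSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean {
        // Keep this lightweight: it runs during application startup
        return true
    }

    override fun onBindSlice(sliceUri: Uri): Slice? {
        // Must return quickly: trigger any IO elsewhere and call
        // contentResolver.notifyChange(sliceUri, null) once fresh data is available
        return when (sliceUri.path) {
            "/todo" -> createTodoSlice(sliceUri)
            else -> null
        }
    }
}
```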

The KTX version of the library provides very nice DSL methods (list, header, row…) to build your Slice UI (before going with Compose? We’ll see 😃). Here, we used a list to display a header with an action to open the app, and several rows to display the to-do items.
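Here is a sketch of the createTodoSlice() method referenced above, built with the KTX DSL (the TodoItem model, the items, the icon and the PendingIntent are hypothetical placeholders):

```kotlin
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction
import androidx.slice.builders.header
import androidx.slice.builders.list
import androidx.slice.builders.row

// Hypothetical model for the example
data class TodoItem(val title: String, val phoneNumber: String)

// Inside TodoSliceProvider
private fun createTodoSlice(sliceUri: Uri): Slice? {
    val context = context ?: return null

    // Primary action: open the to-do screen of the app
    val openAppIntent = PendingIntent.getActivity(
        context, 0,
        Intent(context, DeeplinkActivity::class.java),
        PendingIntent.FLAG_IMMUTABLE
    )
    val openAppAction = SliceAction.create(
        openAppIntent,
        IconCompat.createWithResource(context, R.drawable.ic_todo), // placeholder icon
        ListBuilder.ICON_IMAGE,
        "Open Aircall"
    )

    // Items would come from a local cache refreshed outside of onBindSlice()
    val todoItems = listOf(TodoItem("Call back John Doe", "+33 1 23 45 67 89"))

    return list(context, sliceUri, ListBuilder.INFINITY) {
        header {
            title = "My to-do list"
            primaryAction = openAppAction
        }
        todoItems.forEach { item ->
            row {
                title = item.title
                subtitle = item.phoneNumber
            }
        }
    }
}
```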

We are now ready to display the user’s to-do list. Once everything is set up, the card you can see below will be displayed in Assistant. We can interact with each item to call someone or to archive the call, for example.

Improvements with custom intents

To improve the voice experience with Google Assistant, we can take a look at custom intents to get better matching on the voice queries. You can build your own intent and provide a set of query patterns that matches more precisely the action you want to trigger.
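Custom intents are declared in actions.xml too, with a queryPatterns attribute pointing to a string-array of example queries. A sketch, with hypothetical names and URL:

```xml
<!-- res/xml/actions.xml — a hypothetical custom intent -->
<action
    intentName="custom.actions.intent.OPEN_TODO"
    queryPatterns="@array/OpenTodoQueries">
    <fulfillment urlTemplate="https://aircall.io/todo" />
</action>

<!-- res/values/arrays.xml — the query patterns matched by Assistant -->
<string-array name="OpenTodoQueries">
    <item>show my to-do list</item>
    <item>what is on my to-do list</item>
    <item>open my to-dos</item>
</string-array>
```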

Now let’s see how we can set the tools and enable Assistant to test our actions.

Testing your actions

Now that we are all set on the code side, let’s see how we can test our actions with Google Assistant, using Android Studio or our own voice. The testing part is probably the most painful one here.

First, you must download the App Actions Test Tool plugin in Android Studio to test and validate your App Actions integration. Restart the IDE and you are good to go!

App Actions Test Tool plugin for Android Studio

You can now launch the App Actions plugin. Go to Tools -> App Actions -> App Actions Test Tool. A dialog will open where you have to enter the invocation name for your app. This might be the app name or another keyword; here we are going to use “rainbow”. You can also set a locale depending on the language you target.

To create the preview and start testing your actions, you have to follow some mandatory steps (and that’s why the testing part might be a bit painful):

  • You have to be logged in to Android Studio with the same account as the Play Console.
  • On your device, you must be logged in with the same Google account as well.
  • Make sure you have enabled Google Assistant on your account. You might not have the rights if you are dealing with a company account linked to Google Workspace; in that case, ask an admin to enable the feature.
  • Finally, you have to upload your app to the Play Console. If your app is not yet in production, you can create a new app and put it on a test track without releasing it. That’s enough for testing.

⚠️ If you are using multiple flavors with different application ID suffixes like .dev, .staging or whatever, you will have to upload each variant to the Play Console to test your actions. No workaround found, unfortunately 😞

Now that everything is ready, we can test the actions implemented in the application. You can run the queries directly from the App Actions Test Tool, or you can use your voice with Google Assistant and say: “OK Google, show my to-do on rainbow” or “OK Google, call John Doe on rainbow”. Depending on the query you triggered, it will open the app and call the contact, or it will show your to-do list in Assistant, thanks to the deeplink implementation.

That’s a wrap! We had a lot of fun building voice interactions for our application during this hackathon. We didn’t know anything about App Actions or Android Slices at the beginning, but we quickly learned how to deal with them to add cool voice commands.

On the other hand, the testing part might be a bit painful as you have to upload the app to the Play Store, which is quite inconvenient if you are working with multiple application IDs (dev, staging…). And beware of your accent! We had so much trouble making Google Assistant understand our Frenchy “aircall” pronunciation 😆 On this front, Siri does much better than Assistant.

During Google I/O 2021, Google announced App Actions v2 with the shortcuts functionality and lots of new features! Have a look at the new announcements here:

Thanks for reading this article! A big shout out to Karan Dhillion for the review. If you have any questions related to App Actions or Android Slices, feel free to ping me here or on Twitter 🙂


Julien Salvi

Google Developer Expert for Android — Lead Android Engineer @ Aircall (Paris) — Startup way of life, beer lover and world traveler.