
Your app is not alone.

Joe Keilty · 29th January 2020 · 10 min read

Leveraging Android's openness to build task-based flows across applications.
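
In 2010, Andy Rubin, a co-founder of Android, famously tweeted 'the definition of open' as nothing more than a handful of shell commands (reproduced below as they appeared in the tweet; the repository has since moved to android.googlesource.com):

```
mkdir android ; cd android ;
repo init -u git://android.git.kernel.org/platform/manifest.git ;
repo sync ; make
```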

According to Rubin, these few lines are what make Android an 'open' platform. They grant you access to the source code and allow you to build the operating system itself, providing everything you need to create custom variants of the Android stack, as well as to port devices and accessories to the platform.

Over the years we have seen parts of the core system locked down, whether through a move to proprietary software (e.g. Google Services) or through agreements not to fork Android in order to ensure compatibility (e.g. the Open Handset Alliance). For many developers, however, access to the source code is not what we mean when we refer to Android as an open platform.

The openness within Android, for me at least, comes from the unique ability to share data and create experiences across apps that work together, as well as from the hooks provided into the Android system itself. These allow for the enhancement or even replacement of parts of the OS. Much of this is powered by the Android intent system, which allows abstract descriptions of behaviour to be performed, known as intents, to be communicated either explicitly or implicitly.

An explicit intent is most commonly used within a single application. For example, if the user presses a login button in an application, an explicit intent can be used to launch the next screen, or activity. Implicit intents, on the other hand, can be far more powerful, as they allow the Android system to evaluate the desired intent and choose a handler for it. This is what powers the unique ability for users to replace default applications on their system, such as their launcher (home screen), web browser, email application, clock application, and more. It also allows developers to build apps that work together, such that the user can move seamlessly between applications to achieve their goal. This is known as building task-based flows.
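
A minimal sketch of the difference, assuming two hypothetical activities in the same app:

```kotlin
import android.app.Activity
import android.content.Intent
import android.net.Uri
import android.os.Bundle

class LoginActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Explicit intent: names the exact class to launch within this app.
        startActivity(Intent(this, ProfileActivity::class.java))

        // Implicit intent: describes the desired action and lets the system
        // resolve a handler, here whichever browser the user prefers.
        startActivity(Intent(Intent.ACTION_VIEW, Uri.parse("https://example.com")))
    }
}

class ProfileActivity : Activity()
```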

Apps are often considered to be silos of functionality and information, kept separate from the rest of the system. On Android, this isn't really the case. Apps are simply collections of individual components that have been wired up to provide a cohesive experience, often through implicit intents. But apps don't need to be defined solely by the functions they contain. They're capable of calling out to other applications, and of allowing other apps, or the system itself, to call functions within your own app, ultimately allowing developers to build richer user experiences.


A common example of this is the camera application. On Android, you are not limited to the camera app that comes pre-installed on the device. Instead, users can download alternatives from the Play Store and set them as the default camera app for the system.

When an application is being developed, there is often a requirement to allow users to take a photo. Take, for instance, a social media application. In such an app, a user may land on a screen where they can type some text into a text field and tap a button if they wish to attach an image to their post. When this button is selected, the app broadcasts an implicit intent to the Android system indicating that it wishes to take a photograph, and can then invoke a screen from the user's preferred camera application, allowing the social platform to outsource the task of capturing a photo and returning the resulting data (example below). This represents a fairly simple use case.

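A minimal sketch of that handoff, using the framework's startActivityForResult mechanism (the activity name and request code are illustrative):

```kotlin
import android.app.Activity
import android.content.Intent
import android.graphics.Bitmap
import android.provider.MediaStore

class ComposePostActivity : Activity() {

    private val requestImageCapture = 1 // arbitrary request code

    // Called when the user taps the "attach image" button.
    private fun dispatchTakePictureIntent() {
        val takePicture = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
        // Only launch if a camera app on the device can handle the intent.
        takePicture.resolveActivity(packageManager)?.let {
            startActivityForResult(takePicture, requestImageCapture)
        }
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == requestImageCapture && resultCode == RESULT_OK) {
            // The camera app returns a small preview bitmap in the "data" extra.
            val thumbnail = data?.extras?.get("data") as? Bitmap
            // ...attach the thumbnail to the post being composed...
        }
    }
}
```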

Building on that example, we can see how we could expose our app's unique functionality to other apps and services, or begin chaining applications together. This level of exposure to the rest of the system isn't restricted to screens and functionality, though – apps can also expose and share information through (permission-restricted) content providers, or broadcast and respond to events across the system.
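
Exposing a screen this way is a matter of declaring an intent filter in the manifest. A sketch for a hypothetical image-annotation activity might look like this:

```xml
<!-- AndroidManifest.xml: advertise that this (hypothetical) activity can
     edit images handed to it by any other app on the device. -->
<activity
    android:name=".AnnotateImageActivity"
    android:exported="true">
    <intent-filter>
        <action android:name="android.intent.action.EDIT" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="image/*" />
    </intent-filter>
</activity>
```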

With content providers, apps are able to read information from other places on a device (for example, the user's contact list or system settings), which can be used to tailor the experience your application offers users.
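
As a sketch, reading contact names through the system's contacts provider looks something like this (it assumes the app holds the READ_CONTACTS permission):

```kotlin
import android.content.Context
import android.provider.ContactsContract

// Query the contacts content provider for every contact's display name.
fun loadContactNames(context: Context): List<String> {
    val names = mutableListOf<String>()
    context.contentResolver.query(
        ContactsContract.Contacts.CONTENT_URI,
        arrayOf(ContactsContract.Contacts.DISPLAY_NAME), // projection
        null, null, null                                 // no selection or sort
    )?.use { cursor ->
        val nameColumn = cursor.getColumnIndexOrThrow(ContactsContract.Contacts.DISPLAY_NAME)
        while (cursor.moveToNext()) {
            names.add(cursor.getString(nameColumn))
        }
    }
    return names
}
```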

Elsewhere, with events, the system broadcasts messages that apps can listen for and act upon, and apps can likewise broadcast interesting messages for other applications to respond to. For example, an app can listen for the event that indicates the device has connected to a specific Bluetooth device, and launch into a car mode. Another app, meanwhile, may listen for a music-playback-started event, at which point it can query the music being played and provide the user with a link to the lyrics of the song.
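
The car-mode idea might be sketched as follows; the receiver, device name, and enableCarMode helper are all hypothetical, and the app would need the relevant Bluetooth permission:

```kotlin
import android.bluetooth.BluetoothDevice
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

// React to the system broadcast sent when a Bluetooth device connects.
class BluetoothConnectedReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        val device: BluetoothDevice? =
            intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE)
        if (device?.name == "My Car Stereo") { // hypothetical device name
            // enableCarMode(context): switch the app into its car UI
        }
    }
}

// Registered from, say, an Application subclass or a Service:
fun registerCarModeReceiver(context: Context) {
    context.registerReceiver(
        BluetoothConnectedReceiver(),
        IntentFilter(BluetoothDevice.ACTION_ACL_CONNECTED)
    )
}
```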

These examples list just some of the features that can be used to build open app experiences on Android and, when properly utilised, can help us build task-based flows for our users. By this, we mean the flow a user takes when trying to achieve a task.

To demonstrate, imagine a user is trying to explain something visual to a friend over an instant messaging client. The task they want to achieve is to find an image that aids their explanation. The user also wants to annotate that image to add clarity before sharing it with their friend.

Conventionally, the flow for this task might involve the user opening up their web browser and performing an image search to find the appropriate image. They'd then save this to their device, return to their home screen, open a drawing app, import the image, perform their annotations, and save the image again. Finally, they'd go back to home, open the messaging client, begin the share flow with their friend, find the image on the system, and send it.


The above flow involves a lot of breaks in relation to what the user wanted to achieve. On several occasions, the user had to think about which app they needed to use to achieve the next step. The goal of task-based flows is to remove these distractions, allowing the user to focus on the task at hand. Using the Android intent system instead, a user can perform the image search within their browser to find the appropriate image, share the image directly with a drawing application, perform their annotations, and then share the image again with the messaging client to send it to their friend.
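
Each hop in that flow is a single implicit share intent. One hop might be sketched like this, where imageUri is a content:// URI produced by the previous step (for example via a FileProvider):

```kotlin
import android.app.Activity
import android.content.Intent
import android.net.Uri

fun shareImage(activity: Activity, imageUri: Uri) {
    val share = Intent(Intent.ACTION_SEND).apply {
        type = "image/png"
        putExtra(Intent.EXTRA_STREAM, imageUri)
        // Let the receiving app read the shared content URI.
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    // The chooser lets the user pick any capable app: a drawing app for the
    // annotation step, or a messaging client for the final send.
    activity.startActivity(Intent.createChooser(share, "Share image"))
}
```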

At any point in this flow, the user is able to use any application on their device that is capable of performing the desired task. At no point did the user have to copy or save anything, hit home to hunt for the next app, or run into a restricted list of pre-approved messaging clients. In a sense, by exposing their functionality this way, the apps organise themselves around the task, allowing the user to flow through seamlessly while focusing solely on their goal, ultimately preventing distractions.

As developers, we should keep this in mind, not only when considering how we expose our application to the rest of the system, but also when structuring tasks within the apps themselves. We should be focused on exposing the functionality the app provides and building a flow that allows the user to achieve their task with minimal resistance. For instance, if we structure an application such that each activity (an Android component) focuses on the completion of a single task, then we automatically get benefits such as a logical back stack, narrower navigation paths, and clearer animations that inform the user when they are at the start and end of their specific task. Additionally, this ensures that if the application is extended at some point in the future to expose this task flow to other applications, we can simply expose the activity's intent (providing dependencies are handled), as opposed to restructuring the entire application.
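
As a sketch of that idea, a hypothetical activity that owns one task and hands its result back through the intent system behaves the same whether it is launched from inside the app or, later, from another app entirely:

```kotlin
import android.app.Activity
import android.content.Intent
import android.net.Uri
import android.os.Bundle

class AnnotateImageActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Read the image to annotate from the incoming intent,
        // regardless of which app sent it.
        val imageUri: Uri? = intent.getParcelableExtra(Intent.EXTRA_STREAM)
        // ...show the annotation UI for imageUri...
    }

    // Called when the user completes the annotation task.
    private fun finishTask(annotatedUri: Uri) {
        setResult(RESULT_OK, Intent().putExtra(Intent.EXTRA_STREAM, annotatedUri))
        finish() // pops this task off the back stack
    }
}
```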

At the time of writing this blog post, I was struggling to think of how to end a conversation about the openness of an open platform, and so, naturally, I procrastinated. I did this for long enough that Google I/O 2018 began and duly absorbed my attention. A lot was discussed at I/O, but one thing that struck me was the new 'Slices' and 'App Actions' APIs. The developer documentation describes app actions as 'a way to fill their [a user's] needs – at the moment they need it the most', and they are powered by none other than the intent system.

App actions and slices can be registered by an application to handle one or more intents, allowing developers to associate the capabilities and content of their application with intents embedded in other parts of the system. These actions have been nicknamed visual intents, as they expose the intent handlers a developer includes within their application in the form of a button or other UI component. This is not too different from the implicit intents discussed above; however, instead of hard-coding the desired action (like an 'open camera' button), Google is using a dynamic, ever-growing list of 'actions', as well as machine learning, to ensure your functionality appears throughout the system. These places include the Android launcher, smart text selection, Google Search, the Google Assistant, the Play Store, and more.
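
At the time of the announcement, registering an App Action meant describing it in an actions.xml resource referenced from the manifest. The shape was roughly as follows, though the built-in intent name, parameter, and URL template here are purely illustrative:

```xml
<!-- res/xml/actions.xml: map a built-in intent to a deep link into the app.
     The intent and parameter names below are illustrative. -->
<actions>
    <action intentName="actions.intent.PLAY_MUSIC">
        <fulfillment urlTemplate="https://myapp.example.com/artist{?artistName}">
            <parameter-mapping
                intentParameter="music.artist.name"
                urlParameter="artistName" />
        </fulfillment>
    </action>
</actions>
```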

[Image: App Actions within Google Search, offering to open the Fandango app to purchase Black Panther tickets or to open the Black Panther trailer in YouTube.]

[Image: An App Action in the smart text selection, offering to open the Spotify app on the Taylor Swift artist screen.]

Slices are more powerful than simple actions: they are customisable templates that allow developers to expose the functionality of their app as a rich user experience within other applications. These templates aren't just static screens of your application embedded into places like the Google Assistant; they can contain inline actions, deep links back to your application, scrolling content, and controls like toggles, sliders, and buttons. Additionally, your application can declare itself (or parts of itself) as a slice viewer, allowing you to leverage the functionality of other apps within your own.
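
A minimal sketch of exposing a slice, assuming the androidx slice-builders library (the trip content, deep link, and icon resource are all illustrative):

```kotlin
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction

class TripSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        val context = context ?: return null

        // Tapping the row deep-links back into the app (URL is illustrative).
        val openTrip = SliceAction.create(
            PendingIntent.getActivity(
                context, 0,
                Intent(Intent.ACTION_VIEW, Uri.parse("https://myapp.example.com/trip")),
                PendingIntent.FLAG_IMMUTABLE
            ),
            IconCompat.createWithResource(context, R.drawable.ic_flight), // illustrative icon
            ListBuilder.ICON_IMAGE,
            "Open trip"
        )

        // A simple templated row; the host app renders it natively in its UI.
        return ListBuilder(context, sliceUri, ListBuilder.INFINITY)
            .addRow(
                ListBuilder.RowBuilder()
                    .setTitle("Upcoming trip")
                    .setSubtitle("Flight departs 9:00 AM")
                    .setPrimaryAction(openTrip)
            )
            .build()
    }
}
```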

Exposing these rich, interactive components empowers developers to improve user acquisition and increase long-term retention by surfacing their functionality at the moment it is relevant. Here we can see again that the Android system is organising itself around the user's goal, rather than relying on the user to remember which application can perform a given task, and is doing so in a way that provides a simple, intuitive experience.

The announcement of app actions and slices makes it clear that Google wants to encourage developers to expose the unique functionality of their application throughout the system. It also shows they believe that apps don't exist in isolation but can be integrated to deliver tailored experiences focused around a user's goal. To put it simply, your app is not alone.

[Image: A template from the Slices API, displaying an existing booking for an upcoming trip within the Google Search app.]

[Image: Another Slice, with interactive controls for requesting a ride via an Uber/Lyft/taxi-style service.]