Skilled Creative Team

Feature Deep Dive: APL

Updated: Mar 3



The competition for the world’s best smart speaker has Google, Amazon, Samsung, and Apple sprinting at top speed towards retail shelves with new device models, features, updates, and acquisitions. Recently, Amazon took a leap forward when it announced a new design language that can help take Alexa skills to the next level.


APL IN A NUTSHELL

A few months ago, Amazon released its new design language for developers, the Alexa Presentation Language (APL). APL is the first version of a dynamic visual design capability for voice assistant devices. With this new set of tools, Alexa developers can use conditional layouts and data binding to build voice skills that incorporate not only sound, but also visual elements like videos, slideshows, and images. In short, this language opens up a much more dynamic design capability for visual companion content in voice experiences.
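To make that concrete, here is a minimal sketch of what an APL document can look like, written as a Python dict the way a skill backend might hold it before sending it to a device. The ${payload...} expressions are APL's data-binding syntax; the "recipeData" datasource and its fields are hypothetical names used only for illustration.

```python
# A minimal APL document, expressed as a Python dict so a skill backend can
# send it with a RenderDocument directive. Field values under "recipeData"
# are placeholders.
SIMPLE_APL_DOCUMENT = {
    "type": "APL",
    "version": "1.1",
    "mainTemplate": {
        # "payload" is bound to the datasources object sent alongside the document.
        "parameters": ["payload"],
        "items": [
            {
                "type": "Container",
                "direction": "column",
                "items": [
                    {
                        # A full-width hero image pulled from the datasource.
                        "type": "Image",
                        "source": "${payload.recipeData.imageUrl}",
                        "width": "100vw",
                        "height": "70vh",
                        "scale": "best-fill",
                    },
                    {
                        # A caption bound to the same datasource.
                        "type": "Text",
                        "text": "${payload.recipeData.title}",
                        "fontSize": "40dp",
                        "textAlign": "center",
                    },
                ],
            }
        ],
    },
}
```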


Amazon devices with screens, such as the Echo Show, Echo Spot, Fire TV, and select Fire tablets, are all fair game when it comes to this new design language. Non-Amazon devices are also invited to the party: manufacturers who have built devices using the Alexa Smart Screen and TV Device SDK will have access to APL support.

WHY APL SHOULD BE ON YOUR RADAR

So why does this matter? The second best-selling device on Amazon this holiday season was the Fire TV Cube (an Apple TV killer?). That means a huge number of TVs now have not only Alexa access, but APL access as well.


APL is a crucial development because it unlocks a myriad of benefits in the world of voice technology. First, APL gives developers the ability to get creative with on-screen visual design and move out of the voice-tech dark ages. Much like the early days of responsive websites, developers previously had only a handful of fixed display templates to work with; APL is the first chance to create eye-catching visuals that are not templated. Even within the confines of where APL is today, there are several ways to customize an experience.


Like websites, skills can now adapt to whichever screen a user is on and offer responsive, full-featured, and interactive displays. This is exciting because it allows brands to deliver the best possible user experience for the device someone is currently using. It also brings developers one step closer to easily designing multimodal experiences: experiences that blend voice, text, images, video, audio, and so on in one user interface.
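As a sketch of how that device-aware delivery might look in a skill backend, the handler below uses the Alexa Skills Kit SDK for Python (assuming ask_sdk_core and ask_sdk_model are available) and only attaches the visual APL directive when the requesting device reports APL support. RECIPE_DOCUMENT, the token, and the datasource values are placeholders, not part of any real skill.

```python
# Hedged sketch: attach an APL document only when the device supports it, so
# voice-only devices still get a plain spoken response.
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import get_supported_interfaces
from ask_sdk_model.interfaces.alexa.presentation.apl import RenderDocumentDirective

# A tiny stand-in APL document (see the fuller example earlier).
RECIPE_DOCUMENT = {
    "type": "APL",
    "version": "1.1",
    "mainTemplate": {
        "parameters": ["payload"],
        "items": [{"type": "Text", "text": "${payload.recipeData.title}"}],
    },
}

def build_recipe_response(handler_input: HandlerInput):
    builder = handler_input.response_builder
    builder.speak("Here is tonight's recipe.")

    # Only add the visual layer when the requesting device supports APL.
    if get_supported_interfaces(handler_input).alexa_presentation_apl is not None:
        builder.add_directive(
            RenderDocumentDirective(
                token="recipeToken",
                document=RECIPE_DOCUMENT,
                datasources={
                    "recipeData": {
                        "title": "Weeknight Pad Thai",
                        "imageUrl": "https://example.com/pad-thai.jpg",
                    }
                },
            )
        )
    return builder.response
```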


Giving Alexa skills a visual boost is expected to improve user experiences by building in a second layer of useful info. For instance, sports fans watching a game on a Fire TV Cube-connected TV could ask Alexa to pull up their fantasy team on the screen so they can see their data on a dashboard and make edits, all without lifting a finger to touch a phone or computer.


The experiences we depend on Alexa for daily can be enhanced dynamically. Weather reporting can give you personalized visuals that scale based on what screen you are on, or what types of information may be relevant to your area.


Visuals will likely lift voice commerce as well. Images and other visuals combined with voice unlock the ability to show consumers helpful information such as how products look and how they can be used. For example, cooking enthusiasts can bring dishes to life with APL. When they watch cooking shows, they can pull up the recipes on screen and order the ingredients without having to put down their spatula and scroll through websites.

A DEEPER DEEP DIVE

In short, APL's magic is in its container-based design structure. When designing an APL experience, you establish "containers," or frames, for where visuals will live. You can pre-determine where these containers will fit on the screen, what content goes into them, and when they appear. You create a hierarchy for the placement of these visual containers based on several categories of IF-THEN statements: if the user is on a very small device, only place container #1; if they are on a very large device, place containers 1-5, and place them in this order; if an external data point is relevant, or the user has gone down a specific conversational pathway, unlock and show additional containers.
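Here is a rough sketch of that IF-THEN container hierarchy in APL itself (again expressed as a Python dict). Each Container carries a "when" condition evaluated against the viewport, so a small round screen gets only the first container while a wide screen gets a richer layout. The specific breakpoint, datasource names, and content are illustrative assumptions, not a prescribed layout.

```python
# Conditional containers: APL evaluates each "when" expression against the
# viewport, so one document can serve very different screens.
CONDITIONAL_LAYOUT = {
    "type": "APL",
    "version": "1.1",
    "mainTemplate": {
        "parameters": ["payload"],
        "items": [
            {
                # Container #1: the only thing shown on small, round screens
                # such as an Echo Spot.
                "type": "Container",
                "when": "${viewport.shape == 'round'}",
                "items": [{"type": "Text", "text": "${payload.data.headline}"}],
            },
            {
                # Richer layout for larger screens: image plus a detail column,
                # shown only when the display is wide enough.
                "type": "Container",
                "when": "${viewport.pixelWidth >= 960}",
                "direction": "row",
                "items": [
                    {
                        "type": "Image",
                        "source": "${payload.data.imageUrl}",
                        "width": "50vw",
                        "height": "100vh",
                    },
                    {
                        "type": "Container",
                        "direction": "column",
                        "items": [
                            {"type": "Text", "text": "${payload.data.headline}"},
                            {"type": "Text", "text": "${payload.data.detail}"},
                        ],
                    },
                ],
            },
        ],
    },
}
```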


SKILLED PREDICTIONS

Both Amazon and Google are betting big on visual demand for conversational experiences. With the release of APL, Amazon is giving development partners an edge in creating more exciting experiences, which will drive further discovery and adoption.


APL is the beginning of what will eventually look more like fully flexible, HTML-like experiences in conversational interfaces. Just as people have pushed the creative boundaries of websites, brands will be able to drive their unique messages with APL.

Questions? hi@skilledcreative.com


