Google has announced the launch of Action Blocks, a new tool that allows people with cognitive disabilities to build Google Assistant commands.
“Think about the last time you did something seemingly simple on your phone, like booking a rideshare. To do this, you had to unlock your phone, find the right app, and type in your pickup location,” Ajit Narayanan, staff software engineer, accessibility, wrote in a blog post. “The process required you to read and write, remember your selections, and focus for several minutes at a time. For the 630 million people in the world with some form of cognitive disability, it’s not that easy. So we’ve been experimenting with how the Assistant and Android can work together to reduce the complexity of these tasks for people with cognitive disabilities.”
He went on to explain how a Googler named Lorenzo Caggioni used the Assistant to create a device called DIVA for his brother, who is legally blind, deaf and has Down syndrome. The device enables people with disabilities to interact with the Assistant nonverbally. As Narayanan pointed out, DIVA was the starting point for Action Blocks, which allows a person to add Assistant commands to the home screen of their Android phone or tablet with a custom image that serves as a visual cue.
For example, an image of a cab can serve as the icon for a command such as ordering a rideshare.
“Action Blocks can be configured to do anything the Assistant can do, in just one tap: call a loved one, share your location, watch your favorite show, control the lights and more,” wrote Narayanan, who added that this is the first of many planned tools to help those with cognitive disabilities.
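Google hasn't said how Action Blocks is implemented, but the behavior described, a home-screen tile with a custom image that fires a single command in one tap, loosely resembles Android's public pinned-shortcut APIs. The Kotlin sketch below is purely illustrative: it pins a shortcut with a custom icon, and a hypothetical "Call Mom" dial action and placeholder icon resource stand in for whatever the Assistant command would actually be.

```kotlin
// Illustrative sketch only; not Google's Action Blocks code.
// Pins a one-tap home-screen tile with a custom image that fires a single intent.
import android.content.Context
import android.content.Intent
import android.net.Uri
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat
import androidx.core.graphics.drawable.IconCompat

fun pinOneTapShortcut(context: Context) {
    // Not every launcher supports pinned shortcuts.
    if (!ShortcutManagerCompat.isRequestPinShortcutSupported(context)) return

    // The action the tile triggers in one tap; a plain dial intent stands in
    // for an Assistant command here.
    val action = Intent(Intent.ACTION_DIAL, Uri.parse("tel:5551234"))

    val shortcut = ShortcutInfoCompat.Builder(context, "call_mom")
        .setShortLabel("Call Mom")
        // Placeholder drawable: the custom image that serves as the visual cue.
        .setIcon(IconCompat.createWithResource(context, R.drawable.mom_photo))
        .setIntent(action)
        .build()

    // Asks the launcher to place the tile on the home screen.
    ShortcutManagerCompat.requestPinShortcut(context, shortcut, null)
}
```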
The introduction of Action Blocks, which is still in the testing phase, comes after the tech giant unveiled Parrotron in July, an ongoing research effort to help individuals with impaired or atypical speech be understood by both devices and people.