Google's new updates will allow Android phones to be controlled with just a smile
Tech giant Google announced on Thursday that it will add new accessibility capabilities to the Android operating system, which runs on billions of mobile phones worldwide, making it easier for people with severe motor and speech impairments to use their devices.
While there are multiple additions in the pipeline, the one that makes a particular difference is gesture control. Google said people will be able to operate their devices hands-free by making simple facial gestures such as raising their eyebrows, opening their mouth, or smiling.
The effort is being driven by a novel feature called “Camera Switches”. It uses the front camera of a smartphone and machine learning smarts to detect a user’s facial movements and execute the operation assigned to each movement. So, instead of lifting a finger or calling for help, a differently abled person could just make the required facial gesture.
For instance, they could look left or right to navigate through an app and then look up to select a particular item. Each gesture and the action associated with it can be configured to the user’s preference in the accessibility settings. So, instead of looking up, a user could choose to raise their eyebrows to select something.
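Google hasn’t published how Camera Switches is implemented, but the general idea – detect a facial gesture on-device with the front camera and forward it to a user-chosen action – can be sketched with Google’s publicly available ML Kit face detection library. The gesture and action names, the confidence threshold, and the `dispatchAction` callback below are illustrative assumptions, not the actual implementation:

```kotlin
// Illustrative sketch only: Camera Switches is a closed system feature, and this is
// NOT its real implementation. It shows the general idea of mapping a detected facial
// gesture to a user-configured action using ML Kit's public face detection API.
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Hypothetical gesture and action names, as a user might configure them in settings.
enum class Gesture { SMILE, RAISE_EYEBROWS, OPEN_MOUTH }
enum class SwitchAction { SELECT, NEXT_ITEM, PREVIOUS_ITEM }

class GestureSwitchController(
    private val gestureToAction: Map<Gesture, SwitchAction>,  // the user's mapping
    private val dispatchAction: (SwitchAction) -> Unit        // hook into the UI or an accessibility service
) {
    // Enable classification so the detector reports a smiling probability per face.
    private val detector = FaceDetection.getClient(
        FaceDetectorOptions.Builder()
            .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
            .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
            .build()
    )

    // Called for each frame from the front camera (e.g. via CameraX's ImageAnalysis).
    fun onFrame(image: InputImage) {
        detector.process(image)
            .addOnSuccessListener { faces ->
                val face = faces.firstOrNull() ?: return@addOnSuccessListener
                // Smiling is the only gesture ML Kit classifies directly; a production
                // system would need a richer model for eyebrow raises, mouth opening, etc.
                val smiling = (face.smilingProbability ?: 0f) > 0.8f
                if (smiling) {
                    gestureToAction[Gesture.SMILE]?.let(dispatchAction)
                }
            }
    }
}
```

In a real switch-access setup, the dispatched action would be forwarded to Android’s accessibility framework rather than a plain callback, but that plumbing is omitted here.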
Just these two gestures – one for navigation and one for selection – will let users easily control music, videos, maps, books, news, the calendar, and a lot more, the company claimed.
The features seem like an extension of Google’s Project Soli, which the company used for gesture controls on the Pixel 4. While the tech giant dropped that feature, called Motion Sense, from the Pixel 5, Google’s hardware chief Rick Osterloh told The Verge last year that it would return at some point. Of course, the new accessibility features do not rely on the same hardware that powered Project Soli.
In addition to Camera Switches, which is set to debut on September 30, Google is also bringing the ability to execute a bunch of pre-set actions through facial gestures.
The feature stems from Project Activate, a standalone app that watches for pre-defined facial gestures and eye movements and then automatically triggers tasks such as playing an audio phrase or sending a text message.
It makes it easier for a person to communicate and express themselves in real time, but the app is only available in Australia, Britain, Canada, and the United States at present.
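Project Activate’s internals haven’t been documented publicly either, but the concept – bind each recognised gesture to one pre-set task, such as speaking a phrase aloud – is simple to sketch. The gesture names and phrases below are hypothetical; only the Android TextToSpeech calls are a real, public API:

```kotlin
// Illustrative sketch only: this is the general gesture-to-task idea, not Project Activate's code.
import android.content.Context
import android.speech.tts.TextToSpeech

// Hypothetical gesture names; in the real app these would come from the camera pipeline.
enum class ActivateGesture { SMILE, LOOK_UP, RAISE_EYEBROWS }

class ActivateActions(context: Context) {
    // Android's built-in text-to-speech engine; ready once the init status is SUCCESS.
    private val tts = TextToSpeech(context) { /* check status == TextToSpeech.SUCCESS */ }

    // Each gesture triggers exactly one pre-configured task.
    private val tasks: Map<ActivateGesture, () -> Unit> = mapOf(
        ActivateGesture.SMILE to { speak("Yes, please") },
        ActivateGesture.LOOK_UP to { speak("I need a moment") },
        // Sending a text message would go through the SMS APIs; omitted here.
    )

    fun onGesture(gesture: ActivateGesture) {
        tasks[gesture]?.invoke()
    }

    private fun speak(phrase: String) {
        tts.speak(phrase, TextToSpeech.QUEUE_FLUSH, null, "activate-phrase")
    }
}
```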
The announcement of the new accessibility features comes as Google gears up to roll out the final public version of Android 12. Typically, the platform is made available in September, but this time Google is rumored to be starting the rollout in the first week of October.