
Google Rolls Out New Accessibility Feature That Lets You Talk by Staring at Emojis

In addition, Google Maps and Lookout will better describe objects in your environment.

Users can set what each of these emojis is supposed to relay using generated speech.
Gif: Google

Sure, Google I/O was one massive AI-palooza focused mostly on Gemini. But for Global Accessibility Awareness Day, there are a few more AI-enhanced features coming to Google's accessibility apps, including an update to the Look to Speak app that will let users talk by staring at pictures, symbols, and, yes, even emojis. That could offer a new avenue for people with language or literacy challenges to communicate in their everyday lives.

The Look to Speak app has been around since 2020, but for those who don’t know, it’s essentially an eye-tracking app that lets users select from a number of phrases and words that the phone then speaks aloud. In a blog post, Google described an update that lets users choose from emojis instead, and users will be able to personalize which symbols or emojis they want to use to fine-tune these interactions. It’s a useful change that makes an existing accessibility app even more accessible, even for those who don’t speak English.


Along with that update, more features are coming to a few other Google apps that should make it easier for low-vision users to find objects in their environment. The existing Lookout app can now find what you’re looking for in a room when you point your phone’s camera in the object’s general direction.

The app now has access to a beta Find mode. Essentially, you choose an item from one of seven categories, such as “Seating & Tables,” “Doors & Windows,” or “Bathroom.” After you select a category, you move the phone around the room, and the app uses the camera to work out the direction of, and your distance from, the object or door. Lookout will also generate AI-written descriptions of photos you take directly in the app, so you can learn more about them. Google clarified to Gizmodo that Lookout uses a visual model developed by Google DeepMind, the same group working on Project Astra’s AI digital assistant, which has similar vision capabilities.


Google Maps is also getting an update that brings Lens in Maps descriptions to all supported languages. The added languages should work with the updated voice guidance and the screen reader in Lens in Maps, which the company added to Search and Chrome in October.

Late last year, Google added wheelchair icons to Google Maps to identify ramps and accessible spots. The feature was previously restricted to Android and iOS, but it will now also be available to those browsing Maps on desktop. In addition, you can now filter reviews by wheelchair accessibility in the mobile app. It takes a fair number of users to identify all the accessible locations, but the Mountain View company says that, thanks to Maps users, accessibility information is now available for more than 50 million places. Google is also letting businesses indicate whether they support Auracast for deaf or hard-of-hearing patrons.


Last year’s big accessibility update included Project Gameface, a feature that uses facial expressions to control characters on-screen. The code was restricted to PCs when it was first released, but it should now be available to Android developers for use in mobile apps as well.

