Google Lookout: App Reads Grocery Labels For Blind People. According to the BBC, Google's AI can now identify food in the supermarket, in a move designed to help the visually impaired. The new feature lets a computer voice say aloud what food it thinks a person is holding, based on its visual appearance, and it can distinguish between a can of corn and a can of green beans.
The tech giant says Lookout also uses image recognition to identify a product from its packaging. In a kitchen-cupboards test by a BBC reporter, the app had no difficulty recognizing a popular brand of American hot sauce and similar lookalike products from Thailand.
The UK's Royal National Institute of Blind People (RNIB) gave a cautious welcome to the new feature.
Google Lookout: App Reads Grocery Labels For Blind People
In a move designed to help the visually impaired identify things around them, Google's AI can now identify food in the supermarket. The new feature is embedded in the Google Lookout app, which aims to help people with low or no vision identify objects.
The new update lets the app's computer voice say aloud what food it thinks a person is holding, based on its visual appearance.
The move was welcomed by a UK blindness charity, which said it could help boost blind people's independence. The feature can distinguish a can of corn from a can of green beans.
Many apps, such as calorie trackers, have long used product barcodes to identify what you're eating. The Lookout app also uses image recognition to identify a product from its packaging, says Google.
A post on Google's AI blog says the Android app keeps about two million popular products in a database stored on the phone, and this catalog changes depending on where the user is in the world.
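The idea of a region-dependent, on-device catalog can be sketched in a few lines. This is purely illustrative (the data, IDs, and function names here are hypothetical, not Google's actual implementation): a lookup table keyed by region decides which products the app can name, with a fallback prompt when an item is not recognized.

```python
# Illustrative sketch of a region-keyed, on-device product catalog.
# All product IDs and names below are made up for demonstration.

REGION_CATALOGS = {
    "US": {"0001": "American hot sauce", "0002": "Canned corn"},
    "UK": {"0003": "Vegemite (imported)", "0004": "Canned green beans"},
}

def lookup_product(region: str, product_id: str) -> str:
    """Return a spoken-friendly product name, or a fallback prompt.

    The catalog consulted depends on the user's region, mirroring the
    idea that the stored database changes with location.
    """
    catalog = REGION_CATALOGS.get(region, {})
    return catalog.get(product_id, "Product not recognized; try another angle")
```

A real app would match camera frames against the catalog with an image-recognition model rather than a literal ID lookup; the point here is only the region-dependent, offline data structure.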
According to the BBC reporter, in the kitchen-cupboards test the app had no difficulty recognizing a popular brand of American hot sauce and other similar items from Thailand. It also correctly read spices, jars, and tins from British supermarkets, as well as the imported Australian favorite Vegemite.
However, it fared less well on fresh produce and containers with irregular shapes, such as onions, potatoes, tubes of tomato paste, and bags of flour. When it ran into trouble, the app's voice asked the user to twist the package to another angle, but it still failed on several items.
Royal National Institute of Blind People (RNIB) Welcomes The Feature
The UK's RNIB gave a cautious welcome to the feature. Robin Spinks from the charity said: “Food labels can be challenging for anyone with a visual impairment, as they are often designed to be eye-catching rather than easy to read. Ideally, we would like to see accessibility built into the design process for labels so that they are easier to navigate for partially sighted individuals.”
Alongside similar apps such as Be My Eyes and NaviLens, which are also available on iPhones, it can help boost blind people's independence by identifying items quickly and easily.
How Smartphones Became The Eyes For The Blind
The app uses similar technology to Google Lens: it can identify what a smartphone camera is looking at and show the user more information. It already had a mode that reads aloud any text it is pointed at, and an explore mode that identifies objects and text.
When launching the app last year, the tech giant recommended placing the smartphone in a front shirt pocket or on a lanyard around the neck, so the camera could identify things directly in front of it.
The update also adds a scan-document feature, which takes pictures of letters and other documents and sends them to a screen reader to be read aloud.
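The scan-document flow described above is essentially a three-step pipeline: capture an image, extract its text, and hand that text to a screen reader. A minimal sketch follows; the function names and the stub OCR step are hypothetical stand-ins, not the real Lookout or Android APIs.

```python
# Illustrative document-scanning pipeline: image -> text -> spoken output.
# The OCR and screen-reader steps are stubs for demonstration only.

def extract_text(image: bytes) -> str:
    # Stand-in for an OCR step; a real app would run a
    # text-recognition model over the photographed document.
    return image.decode("utf-8", errors="ignore")

def send_to_screen_reader(text: str) -> str:
    # Stand-in for handing text to the platform's screen reader,
    # which would speak it aloud to the user.
    return f"[spoken] {text}"

def scan_document(image: bytes) -> str:
    """Photograph a document, extract its text, and speak it."""
    return send_to_screen_reader(extract_text(image))
```

The design point is the separation of concerns: capture, recognition, and speech are independent stages, so any one of them (say, the OCR model) can be swapped without touching the others.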