During the initial announcement, the company said it will use machine learning to improve the app's detection accuracy.
Apparently, Lookout can be used by everyone, whether they are visually impaired or not. The app uses spoken words to alert you to the location of an object or a place. It can even detect text in a publication or on a sign and read those words aloud.
It looks like Google tweaked the underlying technology it shares with Google Lens to make it work differently for Lookout.
Lookout is said to work best when your Android phone is either placed in a shirt pocket or worn around your neck, with the camera facing outward toward the world.
For the time being, users are encouraged to submit feedback to the Google Disability Support team and let them know about the app's strengths and shortcomings.
For now, Lookout is available only on Google's own Pixel phones, and it is expected to roll out to more devices, countries, and platforms "soon".