Google Lens can now spot problematic skin spots, or not
It's not a doctor, it just plays one on the internet
Embracing the chatbot standard of unreliable information, Google has updated the Lens image recognition feature in its eponymous iOS and Android apps to possibly identify skin conditions.
"Describing an odd mole or rash on your skin can be hard to do with words alone," said Lou Wang, senior director of product management for search at Google, in a blog post on Wednesday.
"Fortunately, there’s a new way Lens can help, with the ability to search skin conditions that are visually similar to what you see on your skin. Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search.
"This feature also works if you're not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head."
Google Lens isn't sure either. It may return accurate information, or it may not. Google makes no guarantees, and includes a warning that whatever search results come back from submitting an image for algorithmic scrutiny may not reflect actual afflictions, if any.
"Search results are informational only and not a diagnosis," the app says upon fetching similar dermatological imagery from the internet. "Consult your medical authority for advice."
But if you want to analyze a spot or blemish and receive a range of possibilities, Google Lens can do that. Or you could just ask whoever you find at the bus stop.
Your humble vulture consulted Lens on a definitely benign mole on his arm and one of several matching images returned was labeled "melanoma" – surely an example of what Alphabet CEO Sundar Pichai meant at Google IO last month when he talked about making AI more helpful for everyone.
But again, Google Lens comes with the caveat that it's not a doctor, it only plays one in software. And what's one false positive, or many, if some people discover they actually do have a serious condition after seeking clarity about scary Lens output from someone who actually knows something?
The skin condition image-matching capability in Lens looks similar to DermAssist, an app announced at Google IO 2021, though that app requires three input images before suggesting possible conditions. DermAssist is being market-tested in Europe as a Class I medical device and also comes with a warning that it is "intended for informational purposes only and does not provide a medical diagnosis."
But there's more to Lens than non-diagnostic image matching of skin conditions you may not have to worry about. Wang talked up various other capabilities that can now be found in Google's image recognition software, including: animal, plant, and landmark identification; automatic translation overlays of street signs in foreign languages; a "homework help" feature by which Lens can offer problem solving advice based on a picture of an assignment; product suggestions based on image similarity; and a "near me" query for locating proximate food from a snapshot.
He also teased the previously announced link between Lens and Bard, which will route images to Google's generative AI chatbot for explanatory contextual information that, again, won't be certified for accuracy but will mostly be good enough.
Welcome to the era of unreliable, disclaimed products, just as we're getting acclimated to social media misinformation for which platforms aren't really accountable. We have facial recognition systems that sometimes accurately identify people accused of crimes, software-piloted cars that mostly don't kill people, and Magic 8 Ball Brain Tumor Detection Edition – not really, we made that last one up. ®