When in Nature, Google Lens Does What the Human Brain Can’t - Wired
AI-POWERED VISUAL SEARCH tools, like Google Lens and Bing Visual Search, promise a new way to search the world—but most people still type into a search box rather than point their camera at something. In the 25 years or so that search engines have been at our fingertips, we’ve grown accustomed to searching for things manually. And not every object is directly in front of us at the moment we’re searching for information about it.

One area where I’ve found visual search useful is outside, in the natural world. I go for hikes frequently, a form of retreat from the constant digital interactions that fool me into thinking I’m living my “best life” online. Lately, I’ve gotten into the habit of using Google Lens to identify the things I see along the way. I point my phone’s camera—in this case, an Android phone with Lens built into the Google Assistant app—at a tree or flower I don’t recognize. The app suggests what the object might be, like a modern-day version of the educational placards you see at landmarks and in museums.