I just learned that the iPhone has been offering a feature I've long desired: the ability to identify plants and flowers from a photo alone.
Third-party apps have always been able to process images, but in my experience they lack speed and accuracy. Google Lens can do this too, but since it isn't my go-to app, it's less convenient.
Apple has its own version of this visual search feature. It's called Visual Look Up, and it's pretty good.
In the Photos app, open a photo and look for the "i" (info) icon below it. If the icon has a little blue ring around it, iOS has found something in the photo it can identify using machine learning. Tap the icon, then tap "Look Up", and Photos will surface information relevant to the object it recognized. Results can come from a few different categories, including landmarks, plants and flowers, art, and pets. It isn't perfect, but it has been reliable for me more than half the time.
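Visual Look Up itself has no public API, but if you're curious how this kind of on-device recognition works, Apple's Vision framework exposes a general-purpose image classifier that developers can try. Below is a minimal sketch using VNClassifyImageRequest; the file path and the 0.3 confidence cutoff are placeholders of my own, not anything Apple prescribes, and this is the closest public equivalent rather than the Visual Look Up feature itself.

```swift
import Vision

// Classify the contents of a photo entirely on-device.
// The file path below is a placeholder; point it at a real image.
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg")

let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: imageURL, options: [:])

do {
    try handler.perform([request])
    // Each observation pairs a label (e.g. "plant", "flower")
    // with a confidence score between 0 and 1.
    if let observations = request.results {
        for observation in observations where observation.confidence > 0.3 {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }
} catch {
    print("Classification failed: \(error)")
}
```

Like Visual Look Up, this runs locally with no network request, which is part of why the feature feels so fast in the Photos app.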
Wherever you are, Visual Look Up can help you identify landmarks, animals, plants, and flowers. Apple announced the feature at WWDC last year but hasn't exactly been trumpeting its availability.
Visual Look Up is only available in a handful of countries and languages, though the list has expanded since launch.
This is a great start, but what else could visual search do? Imagine if Siri could help by recognizing your houseplant, or by giving recommended hours and location details for a tourist spot.
Apple's future devices might overlay this kind of information on the world around us, much as our smartphones do today. I hope they reach that level of sophistication, because it would make a big splash.