Apple’s Visual Look Up is modeled on Google Lens


At the WWDC event, which was streamed on June 7, Apple spoke about a new feature coming to the iPhone, iPad and Mac this year.

That feature is Visual Look Up, which Apple says uses artificial intelligence software to identify and classify objects found in photographs. It didn't get much attention during the stream, appearing almost as an afterthought in the shadow of Apple's detailed presentation of Live Text (another great feature, by the way).
It certainly caught our eye, though, and probably the eyes of everyone familiar with Google Lens on Android. In fact, it seems to copy exactly what Lens does, even if it arrives a little late in the game.

What Visual Look Up is meant to do is identify various objects stored in your photos and let you look up information about them by tapping a small interactive pop-up that appears on top of them. According to Apple, it can easily tell you, for example, a dog's breed, the family a flower belongs to, the name of a particular landmark and its geographical location, and so on.
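
Apple hasn't said which APIs power Visual Look Up under the hood, but its Vision framework already exposes a general-purpose, on-device image classifier that hints at how this kind of feature can work. A minimal sketch, assuming a hypothetical local image at photoURL rather than the Photos library Visual Look Up actually uses:

```swift
import Foundation
import Vision

// Hypothetical input image; Visual Look Up itself works on photos in your library.
let photoURL = URL(fileURLWithPath: "/path/to/photo.jpg")

// VNClassifyImageRequest runs Apple's built-in, on-device image classifier.
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: photoURL, options: [:])

do {
    try handler.perform([request])
    // Each observation carries a label ("Labrador Retriever", "tulip", ...) and a confidence score.
    let observations = request.results as? [VNClassificationObservation] ?? []
    for observation in observations.prefix(5) where observation.confidence > 0.3 {
        print("\(observation.identifier): \(observation.confidence)")
    }
} catch {
    print("Classification failed: \(error)")
}
```

Whatever Apple ships presumably goes well beyond raw labels, but a classification step along these lines is the core idea.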

Apple's post-WWDC press materials give Visual Look Up only a sparing summary, and the keynote offered about as much: a handful of screenshots on a single slide showing exactly the examples listed above.

How does it compare to Google Lens?

Taking all this into account, Apple's Visual Look Up is hardly as impressive or versatile as Google Lens, which launched in 2017 and has improved tremendously since then.

Google Lens also recognizes dog breeds, plants and landmarks – but that's just the beginning of its feature set. Google has developed Lens to the point where it can scan and identify objects in real time through the camera, and it can find almost any product or object by searching for similar photos online, telling you where to buy it and how much it costs.

In addition, Google Lens can extract words from photos and convert them to text, which you can copy and paste wherever you want. While all of these features sit under the broad umbrella of Google Lens, Apple has just introduced its equivalent of this text-capture feature separately, shipping it with iOS 15 under the name Live Text.
With both Google Lens and Live Text, you can point your phone's camera at text that contains contact information – such as a business card – and it is analyzed instantly, prompting you to take action. For example, you can immediately call or message the phone number shown in the frame, or send a message to an email address it detects.
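
Apple hasn't documented how Live Text is wired up internally, but the business-card behaviour described above can be approximated with APIs that already ship in its SDKs: the Vision framework's text recognizer plus Foundation's NSDataDetector for spotting phone numbers and addresses. A rough sketch, assuming a hypothetical captured frame saved at cardURL:

```swift
import Foundation
import Vision

// Hypothetical photo of a business card.
let cardURL = URL(fileURLWithPath: "/path/to/business-card.jpg")

do {
    // Step 1: recognize the text in the image (roughly what Live Text does on-device).
    let textRequest = VNRecognizeTextRequest()
    textRequest.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(url: cardURL, options: [:])
    try handler.perform([textRequest])

    let lines = (textRequest.results as? [VNRecognizedTextObservation] ?? [])
        .compactMap { $0.topCandidates(1).first?.string }
    let fullText = lines.joined(separator: "\n")

    // Step 2: find actionable items (phone numbers, links, email addresses) in the text.
    let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
    let detector = try NSDataDetector(types: types.rawValue)
    let range = NSRange(fullText.startIndex..., in: fullText)

    for match in detector.matches(in: fullText, options: [], range: range) {
        if match.resultType == .phoneNumber {
            print("Call: \(match.phoneNumber ?? "")")    // could back a "Call" or "Message" action
        } else if let url = match.url {
            print("Open: \(url.absoluteString)")         // mailto: links cover email addresses
        }
    }
} catch {
    print("Text recognition failed: \(error)")
}
```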

Google Lens also lets you translate text in real time, overlaid in augmented reality on the screen as you scan your surroundings. Apple's Live Text supports translation too, but it freezes the image and places the translation on top of it – not in Lens's live AR style.

Apple has, unsurprisingly, been accused of copying Google by adding an AI-powered feature like Visual Look Up, but intelligent photo analysis was bound to reach its devices sooner or later. It can't fairly be called plagiarism; it is more a case of following in the footsteps of inevitable progress.
Google Lens started out as a feature designed just for Google's own Pixel phones, but it quickly grew and spread beyond them. Visual Look Up, on the other hand, will almost certainly never be available for Android. That's to be expected, because Apple has always strived to build a closed ecosystem whose components work best with each other.

We expect Apple to make something special out of Visual Look Up in the future, even if it launches with only a limited set of functions.
