At the Google Search On 2022 event, Google announced a range of new search features spanning Multisearch, Google Lens, Google Maps, and more. Here, we cover three of the major announcements:
1. Multisearch "near me" to identify and find food near you
Google previewed multisearch near me at Google I/O earlier in the year. In the coming months, Google will roll out the feature for English-language searches in the US. With the "near me" option, you can combine a picture-and-text search (using your camera or a screenshot) to hunt for products, dishes, or anything else and get local results. So, for example, you can search for a restaurant nearby that serves a particular meal.
Google Lens-powered multisearch allows you to search using an image from your phone's camera while simultaneously adding a text query on top of the image search. Google then shows you visual search results based on both your text and your image.
What if, for instance, your friend posts a picture of what appears to be a traditional hamburger, but you have no idea what it actually is? Rather than messaging your friend and waiting for a reply, you can use the Google app: a quick snapshot search is enough to determine that the post is about black bean burgers.
After locating a restaurant that offers the dish you're looking for, you'll probably want to check the menu to see if there is something on it for every member of your group. Google combines menu data from restaurant websites that adhere to open standards for data sharing with data provided by customers and businesses. To accomplish this, Google uses advanced image and language understanding technologies, such as the Multitask Unified Model (MUM). These menus will highlight the most popular dishes and helpfully list dietary options, beginning with vegetarian and vegan.
2. Cleaner text translation in Google Lens
Google Lens lets you search what you see using your camera or an image. Multisearch builds on this by letting you take a picture or use a screenshot and add text to it, much like naturally pointing at something and asking about it. Now, Google Lens will also present translated text in a more polished, integrated way, blending it into the underlying image. This will debut later this year.
Google is using GAN models (generative adversarial networks) to improve how the translated text is rendered over the original image. This is the same technology behind the Magic Eraser feature on Pixel devices.
3. New updates coming to Google Maps
Google Maps has continually expanded what a map can do; live traffic alone changed how people get from point A to point B and how they decide where to go. Soon, with Google's new neighborhood vibe feature, you'll be able to check the feel of a neighborhood before you visit: photos and data contributed by the Google Maps community will appear right on the map, letting you see its most popular spots come to life.
To create a more immersive map, Google has released over 250 photorealistic aerial views of famous landmarks around the world, ranging from the Acropolis to Tokyo Tower. With the aid of predictive modeling, immersive view uses historical trends to show what a location will look like tomorrow, next week, or even next month. In the upcoming months, immersive view will be made available on Android and iOS in Los Angeles, London, New York, San Francisco, and Tokyo.
In the upcoming months, Search with Live View will be made available on Android and iOS in London, Los Angeles, New York, San Francisco, Paris, and Tokyo.
Read more about this year’s Search On announcements