Google launches 7 new ways to search

Google has just held its “Search On” event, where it introduced new ways to search that use machine learning to understand what users want to find and deliver a search experience better suited to their interests.

Google unveiled the new features at its “Search On” event

The company is launching these features to give users more intuitive ways to find the information they are looking for, trying to understand not only the language they use to express themselves but also letting them search with images or the things they see in their everyday world. These are the new features Google presented:

-Multisearch: In the Google app, users will now be able to take a picture or use a screenshot and add text to it to perform a search. This means they can literally point the camera at something, take a picture, and ask a question about it. For example, imagine photographing a store and typing into the Google app, “What time do you close?”

Multisearch is already available today in English, and Google says it will reach more than 70 languages in the coming months.

-Multisearch near me: In the Google app, in addition to taking a photo or uploading a screenshot, you will be able to add “near me” (a phrase widely used in Google searches) to find a favorite dish or a product at a local store. This feature will roll out in the United States this fall; Google has not confirmed when it will reach other markets.

-Translation with Lens and augmented reality: Google has improved the machine learning systems behind Google Lens, which can now translate text in complex images more accurately. Where translated text would not display cleanly over the original, Google uses optimized models with AI-generated backgrounds to blend the translation into the image quickly, in less than the blink of an eye. This experience will launch before the end of the year.

-Shortcuts to useful tools: Google is adding shortcuts just below the search box to make its most useful tools, such as searching with the camera, easier to find. This will arrive first in the Google app for iOS in English in the U.S., with a wider rollout to follow on no set date.

-Easier ways to ask questions: When you start typing in the search box, Google will offer options to automatically complete your question. This will launch in English in the U.S. “in the coming months,” according to the company.

-More visual exploration: When you search for, say, a city, Google will also offer photos, videos, and accounts from people who have visited it, drawn from content creators who have publicly shared their experiences on the web. This will also arrive in the coming months, first in the United States for those who use Google in English.

-Related searches: When you search for a term, Google will show relevant content from a variety of sources regardless of format: text, images, video… and as you continue to scroll, it will surface new ways to explore through terms related to your search.
