By Susana Galeano
LATAM Content Coordinator
In the history of mankind, we have never had as much information as we do now, all just a click away. If you are hungry, want to find a gift or are planning your next vacation, you only need to take out your phone and type a few words to find what you want. If this seems easy, Google's AI will make it even easier: it has advanced so much that soon you will only need to take a photo and add a few keywords to find practically anything you need in its search engine. All this with the help of its Multitask Unified Model, better known as MUM, which is here to revolutionize visual search.
MUM's new visual search capabilities
MUM was unveiled during Google I/O 2021 as a new model that is 1,000 times more powerful than its BERT neural network and can combine multiple searches in multiple languages to deliver the most comprehensive answer to the user.
MUM is multimodal, meaning it understands information from text and images, and Google promises to expand its capabilities to video and audio formats in the near future. An example query that Google shares is: “I have climbed Mount Adams, and now I want to climb Mount Fuji next fall. What should I change in my preparation?”
Google has been experimenting with MUM's capabilities, and during its Search On event the internet giant shared the latest advances in the artificial intelligence applied to its search products, which deliver information in a more natural and intuitive way.
“In the coming months, we’ll be introducing a new way to search visually, with the ability to ask questions about what you see. Here are a couple of examples of what will be possible with MUM,” Google says in its statement, sharing this example: you have the perfect shirt, but you’d like to find socks with the same pattern. Now you can tap the Lens icon when doing a visual search for the shirt and ask Google to find that same pattern, but on socks.