At its Search On livestream event, Google shared how it is bringing the latest in AI to its products, giving people new, more natural and intuitive ways to search and explore information. Google's new AI aims to supercharge contextualized search results.
Earlier this year at Google I/O, Google announced that it had reached a critical milestone in understanding information with the Multitask Unified Model, or MUM for short. Google has been experimenting with MUM's capabilities to make its products more helpful and enable entirely new ways to search.
With this new capability, you can tap the Lens icon when you're looking at a picture of a shirt and ask Google to find the same pattern on another article of clothing, like socks. This helps when you're looking for something that might be difficult to describe accurately with words alone. You could type "white floral Victorian socks," but you might not find the exact pattern you're looking for. By combining images and text into a single query, Google is making it easier to search visually and express your questions in more natural ways.
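The idea of combining an image and text into a single query can be illustrated in miniature. The sketch below is a conceptual illustration, not Google's actual system: it assumes a vision-language model would map images and text into a shared embedding space (the vectors here are hand-made toy stand-ins), blends the two embeddings into one query vector, and ranks catalog items by cosine similarity.

```python
# Conceptual sketch of a multimodal query: blend an image embedding and a
# text embedding, then rank items by similarity to the combined vector.
# The toy vectors below stand in for real model outputs; dimensions loosely
# represent (floral pattern, sock-ness, shirt-ness).
import numpy as np

def combine_query(image_vec, text_vec, alpha=0.5):
    """Blend image and text embeddings into one unit-length query vector."""
    q = alpha * image_vec + (1 - alpha) * text_vec
    return q / np.linalg.norm(q)

def rank(query, catalog):
    """Return item names sorted by cosine similarity to the query."""
    scores = {name: float(np.dot(query, v / np.linalg.norm(v)))
              for name, v in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)

shirt_photo = np.array([0.9, 0.0, 0.8])   # image of a floral shirt
text_socks  = np.array([0.1, 0.9, 0.0])   # the text "socks"

catalog = {
    "plain socks":  np.array([0.0, 0.9, 0.0]),
    "floral socks": np.array([0.8, 0.9, 0.0]),
    "floral shirt": np.array([0.9, 0.0, 0.9]),
}

query = combine_query(shirt_photo, text_socks)
print(rank(query, catalog)[0])  # → floral socks
```

Neither the image alone (which best matches the floral shirt) nor the text alone (which best matches plain socks) would surface the floral socks; only the combined query does, which is the point of multimodal search.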
Helping You Explore with a Redesigned Search Page
Google also announced how it is applying AI advances like MUM to redesign Google Search. These new features are the latest steps Google is taking to make searching more natural and intuitive.
- Firstly, Google is making it easier to explore and understand new topics with "Things to know." Say you want to decorate your apartment and you're interested in learning more about creating acrylic paintings. If you search for "acrylic painting," Google understands how people typically explore this topic and shows the aspects people are likely to look at first. Google can identify more than 350 topics related to acrylic painting and help you find the right path to take.
- Secondly, Google is making it easy to zoom in and out of a topic with new features to refine and broaden searches. In this case, you can learn more about specific techniques, like puddle pouring, or art classes you can take. You can also broaden your search to see other related topics, like other painting methods and famous painters. These features will launch in the coming months.
- Thirdly, Google is making it easier to find visual inspiration with a newly designed, browsable results page. If puddle pouring caught your eye, just search for “pour painting ideas” to see a visually rich page full of ideas from across the web, with articles, images, videos, and more that you can easily scroll through.
Google already uses advanced AI systems to identify key moments in videos, like the winning shot in a basketball game, or steps in a recipe.
Using MUM, you can even see related topics that aren’t explicitly mentioned in the video, based on Google’s advanced understanding of information in the video. In this example, while the video doesn’t say the words “macaroni penguin’s life story,” Google’s systems understand that topics contained in the video relate to this topic, like how macaroni penguins find their family members and navigate predators. The first version of this feature will be rolled out in the coming weeks.
A More Useful Google
The updates Google announced don't end with MUM, though. Using AI, Google is also making it easier to shop from the widest range of merchants, big and small, no matter what you're looking for. Google is helping people better evaluate the credibility of information they find online. Plus, for the moments that matter most, it is finding new ways to help people get access to information and insights.
All this work helps not only people around the world but also creators, publishers, and businesses. Every day, Google sends visitors to well over 100 million different websites, and every month, it connects people with more than 120 million businesses that don't have websites by enabling phone calls, driving directions, and local foot traffic.