Contributors: Amol Ghemud
Published: October 12, 2021
At its annual event Search On 2021, which took place less than two weeks ago, Google announced the rollout of a new update to its search engine that uses AI to explore topics: Google MUM, or Multitask Unified Model. It is an algorithm that can search the internet across different languages, using text, images and other media to help users find answers to detailed questions.
MUM brings solutions based not just on text, but on multimedia such as images, videos and podcasts. Since MUM can simultaneously understand information across a wide range of formats – text, images and video – it is one of the first AI models that can handle the complex multimodal queries users pose.
The focus on multimedia means big (and better!) changes for consumers and retail – because now, consumers can search for products using images, without even knowing the name of the product or brand.
The web gives us an endless sea of options to choose from – and MUM makes choosing easier. While choice is empowering, it can be difficult to know where to start – and usually, it starts with what we see. Google MUM has you covered, because now you can shop for anything you see online or in the physical world.
Google has divided the shopping experience into 3 phases:
Inspiration
Exploration
Purchase
and uses MUM to make each stage swift and easy.
Inspiration: Point, Ask, Shop
Photos become shoppable items. With the new update, you can shop in the moment: snap a photo with Google Lens and ask a question about it.
MUM will allow Google Lens to combine text and visual input to understand exactly what you want. Fire up Google Lens on a photo or screenshot of a product – then add text to describe what you are actually looking for.
Perhaps your friend is wearing a t-shirt with an image of a donut. You love the print – but want a similar print on a pair of pyjamas. You can take a picture of your friend’s t-shirt, add it to Google Lens, then refine that search by defining your intent – pyjamas in the same/similar print. And you will be able to find donut-print pyjamas from businesses online and near you.
Soon, you will be able to scroll through a website – and the Google Lens button on the Google app will make each product on the page searchable and shoppable. You don’t have to leave the website. (Much like you’re able to shop without leaving Instagram when you see an ad.)
With the Chrome update that will be rolled out soon, you can select an image or a part of the page you are browsing and quickly see search results on the right-hand side.
“I see it, I like it, I want it, I got it” isn’t just for Ariana Grande – anyone can search for what they see, on camera, because MUM makes it easier for Google to understand the intent behind your query and connect you to a retailer.
Exploration: Window Shop from Google Search
You have a vague idea of what you need but don’t know exactly which product or brand to buy. Now, you can window shop from your favourite brands and retailers on Google, right from the search results page. Search for “hoodies,” and explore a visual feed of shoppable options from well-known brands, local shops and new labels. It is an entirely different layout that combines the search results page (SERP) with Google My Business (GMB) listings.
You will also see recommendations from magazines, blogs, videos and more. When you click on an image of a product you like, you can get more info, compare prices and even snag the best deal. This is powered by Google’s shopping graph, a data set of 24 billion product offers across the web. (The window-shop feature is only available on mobile and focuses on transactional and product-intent keywords.)
Purchase: Inventory Check
The pandemic saw quite a bit of panic buying, with people showing up at Costco and mom-and-pop shops to find shelves bare – no goods in stock. While those situations were extreme, there are plenty of times when you might have shown up at a local bakery or store to find the product you want sold out. And you go back empty-handed. Not anymore. With MUM, Google My Business profiles (GMBP) can show you local stores that carry the products you want, right from Search.
All you have to do is use the “in stock” filter to see only the nearby stores with a specific item on their shelves. You can also see the products they carry in other categories. For small businesses, this means they can build better relationships with their customers and gain an edge over e-commerce retailers who may take a day or more to deliver.
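For retailers wondering how that “in stock” data reaches Google: local availability comes from inventory data submitted through Google Merchant Center. As a minimal sketch (the store codes, SKUs and file name below are hypothetical placeholders), here is how a small shop might generate a tab-separated local product inventory feed:

```python
import csv

# Hypothetical per-store stock records; store_code should match the codes
# registered for your business locations in Merchant Center.
inventory = [
    {"store_code": "STORE-001", "id": "SKU-123", "availability": "in_stock",
     "price": "14.99 USD", "quantity": "8"},
    {"store_code": "STORE-002", "id": "SKU-123", "availability": "out_of_stock",
     "price": "14.99 USD", "quantity": "0"},
]

# Write the feed as tab-separated values, one row per item per store.
with open("local_inventory.tsv", "w", newline="") as f:
    writer = csv.DictWriter(
        f,
        fieldnames=["store_code", "id", "availability", "price", "quantity"],
        delimiter="\t",
    )
    writer.writeheader()
    writer.writerows(inventory)
```

A feed like this is typically uploaded to Merchant Center alongside your regular product feed; keeping the availability and quantity columns current is what lets the “in stock” filter trust your listings.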
In Conclusion
MUM helps you shop with more choice than ever from merchants large and small, while Google continues to deliver the highest-quality information and new tools to evaluate the credibility of what you find online. Google has made it easier at each stage, whether it’s Inspiration, Exploration or Purchase.
So if you own an e-commerce store or a small business, or are listed on GMBP – be sure to improve the quality and scale of your media content, improve and integrate schema markup, and adapt to the new GMBP requirements, so your customers get what they are looking for from you, and not your competitors!
Watch: Google Search — Smarter Shopping with the MUM Update
For Curious Minds
Google MUM represents a significant leap from previous search algorithms by simultaneously interpreting information across various formats like text, images, and video in a single model. This allows it to grasp context and nuance in a way that feels more intuitive, directly answering complex, conversational questions rather than just matching keywords. For e-commerce, this is a pivotal shift because it aligns search with how people naturally discover things: visually and contextually. This moves the search experience from a simple query-and-response to a guided exploration, which is vital for product discovery. Your brand can now be found not just by what it is called, but by what it looks like in context. Understanding this unified model is the first step to adapting your strategy and unlocking its full potential for your business.
Google's shopping graph is a dynamic, real-time dataset that connects products, brands, sellers, and inventory information from across the web. When a user searches for something like 'hoodies', Google MUM interprets the intent and uses the shopping graph to create a rich, visual feed of relevant products directly on the search results page. This integration transforms the SERP into a digital storefront, pulling from its massive catalog of 24 billion product offers. For your brand, this means that having structured, accurate, and visually appealing product data is more important than ever to appear in these curated, high-visibility placements. To optimize for this powerful feature, study its capabilities and align your product data accordingly.
A traditional text search typically serves bottom-of-funnel shoppers who already know what they want and use specific keywords. In contrast, a visual search via Google Lens and MUM is ideal for capturing top-of-funnel, inspiration-driven shoppers who may not know the product's name but are intrigued by its appearance. The key difference is the entry point: one is based on knowledge, the other on curiosity.
Text Search (Bottom Funnel): A user types 'blue running shoes size 10'. The intent is clear and transactional.
Visual Search (Top Funnel): A user sees a photo of a unique jacket, uses Lens, and asks for 'shirts with this style'. This is a discovery-oriented journey.
Visual search excels at creating demand and introducing your brand to new audiences, while text search is better for converting existing demand. A balanced strategy should account for both paths to purchase, as explored in the full article.
This example perfectly illustrates that Google MUM can move beyond literal, one-to-one matching. It understands the user does not want the exact item (the t-shirt) but rather an abstract concept (the print) applied to a different product category (pyjamas). This demonstrates an ability to interpret creative intent rather than just identifying objects. For niche retailers, this is a breakthrough. Your unique designs or patterns can now be discovered by consumers who see similar styles elsewhere, even on entirely different products. Previously, you were reliant on users knowing the exact keywords to find you. Now, you can attract customers through visual association, opening up a massive new channel for discovery based on aesthetic appeal alone. Exploring this further will reveal more about its potential.
The 'window shop' feature transforms the traditionally text-heavy search results page into an endlessly scrollable, image-led discovery feed. This directly mirrors the user experience on platforms where users browse visually and shop in the moment of inspiration. Google MUM powers this by understanding the exploratory intent behind broad queries like 'hoodies' and curating a visual catalog from its shopping graph. This shift indicates that Google is prioritizing a passive, discovery-based shopping model over an active, query-based one. Brands should learn that high-quality, lifestyle-oriented product imagery is no longer just for social media. It is now a critical asset for search engine visibility, as Google aims to capture user attention and time directly within its own ecosystem. The full article explores how to adapt your visual strategy.
Google MUM transforms the inspiration phase by making the world around you instantly shoppable, turning passive observation into active discovery. It closes the gap between seeing something you like and finding out where to buy it, which is the very essence of inspiration. The primary mechanism for this is the enhanced Google Lens functionality. For example:
A user sees a unique floral pattern on a lampshade. They can snap a photo, add the text 'on a dress', and immediately find apparel with similar prints.
While browsing a design blog, a user can activate Lens to make all images on the page searchable, identifying a specific piece of furniture without leaving the site.
Someone spots interesting sneakers on the street, takes a picture, and finds retailers selling them locally.
These examples show how MUM captures purchase intent at its earliest point, a moment that was previously lost to fleeting interest. You can learn more about optimizing for this stage in the full post.
To thrive in this new visual search environment, you must treat your product data as a primary driver of discovery. A proactive optimization strategy is essential for ensuring your products are correctly identified and suggested by Google MUM. Here is a focused plan:
Enhance Image Quality and Context: Use high-resolution images showing products from multiple angles and in real-world, lifestyle contexts. This provides Google with more data to understand the product's use, style, and features.
Implement Detailed Structured Data: Use schema markup for products, including attributes like color, material, pattern, and style. The more descriptive data you provide, the better MUM can match your item to a nuanced visual query (see the JSON-LD sketch below).
Maintain Accurate Inventory Data: Since the update helps users check local stock, ensure your inventory feed, especially for Google Merchant Center, is always up-to-date.
Following these steps will significantly improve your visibility in the next generation of search, as detailed in our guide.
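To make the structured-data step concrete, here is a minimal sketch of Product schema markup, generated with Python for illustration. The product name, SKU, URLs and brand are hypothetical placeholders; color, material and pattern are genuine schema.org Product properties.

```python
import json

# Hypothetical product record; all values are placeholders.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Donut-Print Pyjama Set",
    "sku": "PJ-DONUT-01",
    "image": [
        "https://example.com/images/pj-donut-front.jpg",
        "https://example.com/images/pj-donut-lifestyle.jpg",
    ],
    "description": "Cotton pyjama set with an all-over donut print.",
    "color": "Pink",
    "material": "Cotton",
    "pattern": "Donut print",
    "brand": {"@type": "Brand", "name": "ExampleWear"},
    "offers": {
        "@type": "Offer",
        "price": "29.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "url": "https://example.com/products/pj-donut-01",
    },
}

# Emit the JSON-LD block to embed in the product page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```

Once embedded in the product page, Google’s Rich Results Test can confirm the block parses correctly.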
Since Google MUM can understand information across text, images, and video, your video content is now a prime asset for product discovery. Your strategy should focus on making the visual information within your videos as explicit and machine-readable as possible. To adapt, you should first ensure high-quality video production where products are clearly visible. Next, create detailed video descriptions, transcripts, and closed captions that explicitly mention the products shown and their key features. Finally, link to the exact product pages in your video descriptions and consider using YouTube's product tagging features. As MUM's ability to analyze video content deepens, this structured data will enable it to identify your product in a video and surface it in response to a related visual or text query, turning your content marketing into a direct sales channel.
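One way to make that video metadata machine-readable is schema.org VideoObject markup. The sketch below (the titles, URLs, date and transcript snippet are hypothetical) pairs a lookbook video with the product it features, so the explicit product mention survives in a form a search engine can parse:

```python
import json

# Hypothetical video metadata; the transcript snippet explicitly names the
# product shown, which is what helps a multimodal model connect the two.
video = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Styling Our Donut-Print Pyjamas",
    "description": "A lookbook featuring the ExampleWear donut-print "
                   "pyjama set (SKU PJ-DONUT-01).",
    "thumbnailUrl": "https://example.com/video/pj-donut-thumb.jpg",
    "contentUrl": "https://example.com/video/pj-donut-lookbook.mp4",
    "uploadDate": "2021-10-12",
    "transcript": "Today we're styling our cotton donut-print pyjama set...",
}

# Wrap this output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(video, indent=2))
```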
The rise of Google MUM signals a fundamental pivot from keywords to concepts and context. Your future strategy must become 'omni-format', treating images, videos, and text as interconnected parts of a whole rather than separate channels. Marketing teams should begin shifting focus in several key areas. First, invest heavily in high-quality visual assets and ensure they are tagged with descriptive metadata. Second, build content around topics and user problems, not just keywords, as MUM is designed to answer complex questions. Finally, think about the entire consumer journey, from inspiration to purchase, and create content that supports each stage. This means your SEO will become less about ranking for specific terms and more about being the most relevant answer, regardless of the query format. Explore the full strategic shift in the main article.
This update significantly changes the landscape for publishers by making virtually any image on their site shoppable. It has the potential to either disintermediate traditional affiliate links or create a new, more integrated revenue stream. For example, a travel blogger's photo of a landmark could lead to users searching for the jacket they are wearing via Google Lens, bypassing any curated affiliate links. However, it also means that high-quality, inspirational content becomes even more valuable as a source of commerce. Publishers who embrace this by producing visually rich content that inspires shopping intent could benefit from new partnership models with retailers, as their sites become direct points of inspiration for Google's visual search engine. The full impact on affiliate models is a key topic worth exploring.
Google MUM effectively solves the 'search vocabulary gap' by removing the need for precise language. Instead of relying on the user's ability to describe an item, it allows them to use an image as the primary query. This shifts the burden of translation from the human to the AI, creating a more intuitive path to purchase. By combining a visual input (a photo of a shirt) with a simple text modifier ('pajamas'), the user can articulate a complex desire without knowing terms like 'geometric print' or 'art deco pattern'. This reduces the friction that leads to abandoned searches, connecting visually-inspired intent directly with relevant products and helping businesses like yours capture customers who previously would have given up. This capability is a cornerstone of the new search experience discussed in depth in the article.
The 'window shop' feature tackles the problem of vague user intent by transforming a broad search like 'fall jackets' into a curated, visual browsing experience. Previously, only the largest brands with massive SEO budgets could dominate the first page for such terms. Now, Google MUM creates a more level playing field. It curates a visual feed that includes options from well-known brands, local shops, and new brands, prioritizing visual relevance and data quality. This is a solution for emerging retailers because it provides a direct path to get in front of customers during the critical exploration phase of their shopping journey. By ensuring your products are in the shopping graph with high-quality images and data, you can appear alongside established competitors. Learn more about this opportunity in the complete analysis.
Amol has helped catalyse business growth with his strategic & data-driven methodologies. With a decade of experience in the field of marketing, he has donned multiple hats, from channel optimization, data analytics and creative brand positioning to growth engineering and sales.