Remember when you had to refer to an encyclopaedia to learn about a topic?
Or wait for the morning newspaper or primetime news bulletin to know what was happening in the world?
Or check out a physical restaurant guide to read menus, make reservations or read reviews?
If you were born after 1996, these are likely all alien to you. And if you were born before that, you probably remember them from a time long gone.
The digital age we live in gives us access to information at our fingertips, fast. Whatever it is you need to know — what’s happening in Ukraine, the IPL match score, a Bollywood wedding, where to get the best alphonso mangoes — it’s all available at the click of a button in a search bar.
The Information Vortex
In the information age, Google is the undisputed keeper of everything we want and need to know, and it has made it easier than ever for us to find what we want. But with so much information being published and posted on the web every second, it’s also easy to get swept up in a sea of misinformation, which may be published intentionally or unintentionally (fake news, ironically, is very real).
The sheer volume of information available on the web — from established, centuries-old news sources like the New York Times to new-age sites like Buzzfeed to even satirical sites like The Onion — is astounding.
Having the option to look at so many sources gives us a more well-rounded perspective and even opens us up to viewpoints we may not have considered.
However, the tsunami of information, the ease of access and the speed with which it can come at you also means it gets harder to discern what’s credible and what isn’t.
The good news is, Google’s always a step ahead to ensure a better and safer experience for users. And their enhanced new fact checking features are a step in the right direction, to ensure that we can better judge the information they show us in SERPs.
The Fight Against Misinformation: Partnerships and Investments
Google has long flagged when a story is developing and information on it isn’t yet broadly available, such as when news breaks or a topic starts trending. In these cases, the search results automatically show users a notice indicating that it may take time for more outlets to publish news or information on the topic and cite credible sources.
Since the Trump era, fake news has been a matter of grave concern, a problem only made worse by the pandemic.
The Russian invasion of Ukraine brought matters to a tipping point, where people claimed to see misleading or doctored reportage and data, particularly across social media platforms.
Late last month, Google CEO Sundar Pichai announced the company would go further, investing $10 million to fight misinformation related to the war in Ukraine. Alongside the investment, Google has launched partnerships with on-ground, region-specific think tanks to look into probable cases of misinformation, along with grants to support nonprofits and fact-checking networks at the grassroots level.
Getting Started: The “About This Result” Option
The “learn more about this result” or About This Result feature was highlighted last year along with Google’s MUM update. Say you search for something and Google pulls up a result from a site you haven’t heard of before.
When you click on the three dots next to the search result, you uncover more about the result:
Who is this source?
How do they describe themselves, and what category of publisher do they fall into?
What are other folks on the world wide web saying about this source/result page?
Is there any other news or information related to this topic that could put it in context?
This gives users the option to learn more about a result and seek out further information or alternative sources.
(You can read more about Google’s MUM updates in a detailed blog post here.)
Evaluating Misinformation is Now Easier
The cosmetics of certain websites can make them seem credible and real when they are not. And when you are browsing through so many pages and links, it is easy for unethical outlets to make you believe the misinformation they push out.
Starting with English-language searches in the United States, search results will have tips to help users evaluate information online. Google will give users nudges – letting them know that they can check if a source is trusted, whether they should come back later as the story develops further, and more.
Google has also created a brand-new resource page on Google Support with more tips on how to navigate information and land on tested, credible sources.
The Highly-Cited Sources Label
Many times in journalism, a story breaks at a lesser-known outlet and then takes on a life of its own as it gains eyeballs and traction and gets picked up by bigger news sources. And much like tracing a viral TikTok trend to its origin, it can be hard to go down the search engine rabbit hole and find the original article or report.
Google has now come up with a way to help users identify stories that have been highly cited by other news organisations. This gives the user an easy way to find the original story, along with context and more helpful and relevant information linked to it.
Verified by Fact-Checking Organisations
The death of legacy media and downsizing of newsrooms has led to the shuttering (in many places) of the all-important fact-checking desk.
But the art and science of fact-checking isn’t completely dead – and Google wants to give those voices their due. The new fact-check features give users the option to easily find information that has been verified by independent fact-checking organisations.
If you search for a topic that is trending or disputed, you might just see fact-check articles as part of your results.
These are small snippets that help you get context about the subject or claim. Also, when you browse Google News on desktop, you can see recently fact-checked claims from independent publishers in your region when they relate to the top stories of the day.
Google also has a tool called the Fact Check Explorer which allows users to search for a variety of different topics they may have questions about, gathered from thousands of publishers around the globe.
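The same fact-check data behind the Fact Check Explorer is, to my understanding, also exposed programmatically through Google's Fact Check Tools API (`claims:search` endpoint). A minimal sketch in Python, assuming that endpoint and its `query`, `languageCode` and `key` parameters; `YOUR_API_KEY` is a placeholder for your own key:

```python
import json
import urllib.parse
import urllib.request

FACT_CHECK_ENDPOINT = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def build_claim_search_url(query: str, api_key: str, language: str = "en") -> str:
    """Construct the claims:search request URL for a given topic."""
    params = urllib.parse.urlencode({
        "query": query,
        "languageCode": language,
        "key": api_key,
    })
    return f"{FACT_CHECK_ENDPOINT}?{params}"

def summarise_claims(response: dict) -> list:
    """Extract (claim text, fact-checker, rating) triples from a response body."""
    results = []
    for claim in response.get("claims", []):
        for review in claim.get("claimReview", []):
            results.append((
                claim.get("text", ""),
                review.get("publisher", {}).get("name", ""),
                review.get("textualRating", ""),
            ))
    return results

# Actually fetching results requires a real key and network access:
# with urllib.request.urlopen(build_claim_search_url("moon landing", "YOUR_API_KEY")) as r:
#     print(summarise_claims(json.load(r)))
```

The response-parsing helper assumes the documented `claims` / `claimReview` response shape; if the API returns something different, adjust the keys accordingly.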
The Wrap: What It Means for You
If you are a journalist or reporter, the fact-check features will work to elevate your reporting, by pointing users to the source: you. The highly-cited sources label means that Google is in a better position to identify stories that have been cited by other news publications, leading the reader back to the original source.
If you are a publisher, then it means that in addition to SEO, page structure and other must-haves, you need to ensure that you are pulling from, and linking to, such trusted resources. And with these new and improved features, there’s no excuse for not researching well when you create a piece of content.
Watch: How Google’s Fact Check Features Help Spot Misinformation
For Curious Minds
The 'About This Result' feature provides crucial context directly within the search results page, helping you make an informed judgment on a source's reliability before you even click. It was introduced to combat the overwhelming volume of online information and empower users to distinguish between established reporting and potential misinformation.
This tool offers several layers of insight:
Source Description: It presents a brief summary of the publisher, often from a neutral source, explaining who they are and what they cover.
Wider Web Context: You can see what other reputable sites are saying about the source, providing an external perspective on its reputation.
Related Coverage: The feature links to other news or information on the same topic, allowing you to compare perspectives and verify claims across multiple outlets.
Using this feature helps you move beyond the headline and assess the substance behind the source. Explore how this fits into Google's broader strategy against misinformation by reading the complete analysis.
Google's $10 million investment signifies a strategic shift from passive alerts to active intervention in the fight against coordinated misinformation. While 'developing story' notices manage expectations, this financial commitment and collaboration with regional experts aim to dismantle disinformation campaigns at their source.
This approach combines financial power with localized expertise:
Direct Funding: It provides grants to nonprofits and fact-checking networks that possess the cultural and linguistic context to identify nuanced falsehoods.
Expert Collaboration: Partnering with on-ground think tanks allows Google to gain real-time insights that algorithms alone might miss, especially regarding doctored reportage.
Grassroots Support: The initiative strengthens the local information ecosystem, building a more resilient defense against future misinformation threats.
This model demonstrates a commitment to not just flagging, but actively countering, sophisticated disinformation. Learn more about the specific organizations involved in the full piece.
Using the 'About This Result' feature is a far more efficient method for source evaluation than manual research, as it centralizes critical information directly on the search page. It reduces the number of steps required to build a foundational understanding of a source, allowing for a quicker, yet still informed, decision.
When comparing the two approaches, consider these factors:
Speed: 'About This Result' delivers contextual information in seconds, whereas manual research can take several minutes of opening new tabs and performing separate searches.
Objectivity: The feature often pulls data from neutral third-party sources, offering a more balanced view than you might find on a publisher's own 'About Us' page.
Contextual Relevance: It provides links to other coverage on the same topic, a step that is often overlooked in manual verification but is vital for assessing a story's placement within the broader narrative.
For a fast and effective assessment, you should prioritize the source description and related news coverage. See how this tool fits within the larger MUM update in our detailed analysis.
The most common mistake is immediately clicking and accepting a sensational headline at face value without first vetting the source. This behavior is a primary driver for the spread of unintentional misinformation, as emotional reactions often bypass critical thinking.
The 'About This Result' feature provides a direct, low-friction solution to this problem by creating a 'pause' moment:
It encourages a habit of inquiry by placing a verification tool right next to the result, making it easy to check before you click.
It provides immediate context on whether the source is a known satirical site like The Onion, a partisan blog, or an established news organization.
By showing what other sources are saying, it helps you quickly determine if the headline is an outlier or part of a recognized news event.
This simple check can be the difference between being informed and being misled. Read on to understand the psychological triggers that make misinformation so effective.
For any research project, you can use Google’s built-in features to systematically filter out low-quality information and build a strong foundation of credible sources. This process moves from broad discovery to specific verification, ensuring your work is built on solid ground.
Follow these steps for a more rigorous research process:
Initial Search and Skim: Perform your initial search but before clicking, scan the results for familiar, reputable names like the New York Times or established academic institutions.
Verify Unfamiliar Sources: For any promising but unknown source, use the 'About This Result' feature by clicking the three dots next to the URL. Review the source description and context.
Cross-Reference Key Claims: Identify the core claims in an article and perform new searches using those specific claims in quotation marks to see how other outlets are reporting the same information.
This methodical vetting process ensures you are not only finding information but actively validating it. Dive deeper into advanced search operator techniques in the full article.
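The cross-referencing step above can be partly automated. A minimal sketch (the helper names are illustrative, not part of any Google API) that turns claims extracted from an article into exact-phrase search queries, optionally excluding the article's own domain so only independent coverage surfaces:

```python
def quote_claim(claim: str) -> str:
    """Wrap a claim in quotation marks to force an exact-phrase search."""
    return f'"{claim.strip()}"'

def build_search_queries(claims: list, exclude_site: str = "") -> list:
    """Build one exact-phrase query per claim, optionally adding a
    -site: operator to filter out the original source's domain."""
    queries = []
    for claim in claims:
        query = quote_claim(claim)
        if exclude_site:
            query += f" -site:{exclude_site}"
        queries.append(query)
    return queries

# Example: cross-reference two claims from a hypothetical article on
# example-news.com while filtering out that article's own domain.
print(build_search_queries(
    ["the bridge closed on 3 May", "repairs cost $2 million"],
    exclude_site="example-news.com",
))
```

Pasting each generated query into Google then shows whether other outlets report the same specifics, or whether the claim only exists on the original site.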
In response to the rapid spread of 'fake news', Google has implemented several features designed to contextualize information during volatile, breaking news cycles. These tools act as informational speed bumps, encouraging patience and critical consumption when facts are still emerging.
Key features include:
'Developing Story' Notices: When a topic is new and trending, Google automatically displays a prominent notice at the top of the search results. This banner explicitly states that information is changing quickly and may be unreliable.
Credibility Indicators: The system is designed to signal that it may take time for a range of credible sources to publish on the topic, implicitly guiding users toward more established outlets.
'About This Result' Integration: This feature becomes even more critical during breaking news, helping users instantly evaluate the new websites that often appear to cover emerging events.
These interventions aim to manage the information vacuum that misinformation thrives in. Learn how these features evolved from past events in the full article.
Google's strategy of combining powerful algorithms with human-led, on-the-ground partnerships suggests a significant evolution in the role of a search engine. This hybrid model indicates a move away from purely passive indexing toward a more active, curatorial role in fostering a trustworthy information environment.
This future role is defined by two key trends:
Algorithmic Sophistication: Updates like MUM aim to understand the nuances of information quality, not just keyword relevance.
Human-in-the-Loop Curation: The $10 million investment in fact-checking networks for the Ukraine conflict shows a recognition that algorithms cannot solve problems of intent and cultural context alone.
This blended approach means your future search results may be shaped not just by what is most popular, but by what is verified as most credible by a combination of technology and human experts. Consider the long-term implications of this curatorial shift by reading our full analysis.
Google addresses the challenge of satire by using contextual signals to help users recognize the publisher's intent, rather than simply flagging the content as 'false.' The 'About This Result' feature is a key tool in this process, providing the necessary background to distinguish parody from malicious misinformation.
When you use the feature on a satirical site, you will typically find:
A Clear Source Description: The panel will often describe the site as a satirical publication, using information from objective third-party sources. For example, it would explicitly label The Onion as a humor or parody outlet.
Revealing Web Mentions: The section on what others on the web say about the source will likely contain reviews and articles discussing its satirical nature.
This approach focuses on educating the user about the source's purpose, empowering them to interpret the content correctly. Learn more about how search algorithms handle different types of content in the full article.
The core principle of 'About This Result', which is providing easy access to source context, has immense potential to be integrated across Google's ecosystem. Expanding this framework could create a universal standard for information literacy, helping users evaluate content regardless of the platform where they encounter it.
Future applications could include:
YouTube: A similar feature could appear next to videos, offering context on the channel's history, funding, or known biases, especially for news and documentary content.
Google News: Enhanced source context could be more deeply integrated, allowing users to filter their news feeds based on publisher type or verification status.
Chrome Browser: A browser-level extension could offer 'About This Site' information for any webpage a user visits, not just those found through search.
This would represent a shift toward embedding media literacy tools directly into the user's workflow. Explore the potential for these integrations in our full report.
Google's investment in sophisticated fact-checking tools marks a clear acknowledgment of its evolving public responsibility. Instead of simply organizing the world's information, the company is now actively building guardrails to help you navigate it safely and critically.
This shift in responsibility is evident in several ways:
Proactive Tools: The 'About This Result' feature is a proactive tool, offering context before a user commits to a potentially unreliable source.
Strategic Investments: The $10 million fund to combat Ukraine war misinformation shows a willingness to deploy resources to protect the integrity of information on critical topics.
Collaborative Approach: Partnering with nonprofits and think tanks shows an understanding that platform health is a shared responsibility that extends beyond its own corporate walls.
This approach reframes the search engine as a guide in the information age. Discover how this new philosophy impacts search rankings by reading the complete analysis.
The initiative to combat Ukraine war misinformation underscores a crucial lesson: technology alone is insufficient for defeating sophisticated, state-sponsored disinformation. Combining Google's algorithmic power with the nuanced understanding of on-the-ground human experts creates a much more resilient and effective defense.
This hybrid model has proven effective for several reasons:
Cultural Context: Human experts can identify misleading narratives, doctored translations, and culturally specific dog whistles that an algorithm might miss.
Speed and Agility: Local fact-checking networks can spot and debunk new falsehoods as they emerge at the grassroots level, feeding this intelligence back into the system.
Verification of Intent: While an algorithm can spot patterns, human analysts are better at discerning the intent behind a piece of content, whether it is satire, error, or malicious propaganda.
This approach serves as a powerful model for future content moderation challenges. Learn more about the specific think tanks involved in the full report.
Google's strategy is not to label sources as 'good' or 'bad', but to provide the necessary context for you to apply the appropriate critical lens for the content you are viewing. It recognizes that a source like the New York Times serves a different purpose than Buzzfeed or The Onion and equips you to make that distinction.
The 'About This Result' feature facilitates this by providing key differentiators:
Publisher Category: It clarifies if a source identifies as a news organization, a satirical outlet, a lifestyle publisher, or a scientific journal.
Source History: Information on when the source was established can help you gauge its track record and reputation over time.
Web Reputation: Seeing how others describe the source helps you understand its perceived role in the broader media landscape.
This empowers you to match your evaluation criteria to the content's intent. Read the full article to explore how this impacts your search experience.
Amol has helped catalyse business growth with his strategic & data-driven methodologies. With a decade of experience in the field of marketing, he has donned multiple hats, from channel optimization, data analytics and creative brand positioning to growth engineering and sales.