A Toolkit for Combating Online Misinformation

3 July 2025
In this blog, Research Master student Deniz Sayın shares evidence-based strategies, such as correction, debunking, bypassing, and truth boosting, for effectively responding to online misinformation.

Misinformation is no longer the exception online. It is part of the everyday information landscape. From misleading headlines to manipulated videos, false information spreads faster than ever, outpacing verified facts and credible content (1). So, the question is no longer if we will encounter it, but how we should respond when we do. 

As a Research Master’s student in Social Sciences, I focus part of my academic work on how people engage with information in digital environments. This has led me to reflect not only on my own digital engagement but, more urgently, on a broader question: in a space where falsehoods flourish, what strategies could help us, as students, educators, and active participants in today’s complex digital ecosystem, respond effectively?

Encouragingly, recent interdisciplinary research, spanning psychology, communication, and information science, offers more than simple warnings or fact-checks. It points to diverse evidence-based strategies, such as debunking, bypassing, and amplifying truths. Here, I bring together insights from recent research to offer a toolkit of practical strategies for navigating and countering misinformation online.

Recognizing Misinformation Is Step One!

Before we can respond to misinformation, we have to recognize it. And that’s no easy task. On fast-paced platforms like social media, the line between credible and false information is often blurred. Spotting pseudoscience or deceptive sources requires more than good intentions. It demands digital literacy, critical thinking, and sometimes expert knowledge. This blog focuses on how to respond once misinformation appears in our feeds, but it rests on a key assumption: the ability to tell the difference in the first place. Without that foundation, even the best strategy might miss its mark.

Straightforward Responses to Misinformation: Correcting or Debunking It

The most common way to tackle misinformation is simple correction. Basically telling someone, “You’re wrong.” This can work well when corrections are timely, detailed, and clear. Interestingly, corrections can be especially effective when dealing with emotionally negative misinformation, like fear-inducing health myths. This is likely because discrediting these kinds of false claims helps reduce anxiety, making people more open to changing their beliefs (2).

But it has its limits.

When people are emotionally or ideologically tied to false beliefs, corrections can “backfire”. This counterintuitive phenomenon, called the backfire effect, occurs when attempts to correct misinformation strengthen a person's belief in it (3). Imagine trying to convince someone who believes that “COVID-19 vaccines contain microchips” that vaccines are perfectly safe. A direct correction can cause this person to cling to the false belief even more strongly.

When correction hits a wall, debunking offers a more constructive way forward. Correction primarily negates misinformation. Debunking, on the other hand, refutes the false claim and provides a coherent, truthful alternative explanation. It does not stop at “You’re wrong.” It goes on to say, “Here’s why that’s wrong, and here’s more accurate information you can trust instead.” Its strength therefore lies not only in challenging the falsehood but also in filling the informational void the debunked claim leaves behind. This matters because once people have thought deeply about or justified misinformation, simply telling them they are mistaken often fails. Debunking encourages critical thinking and helps people analyze claims themselves, rather than accepting corrections passively.

However, the timing of intervention matters! People are generally more receptive to corrections and debunking before they become strongly attached to false beliefs. Early, thoughtful engagement can make a significant difference (2, 4).

Smart Detour: Bypassing It

But what if you could sidestep the confrontation and still change minds?

That’s the idea behind bypassing. Instead of directly refuting a false claim, bypassing involves presenting other truthful information that leads to a different, more accurate conclusion. So, it is less “You’re wrong,” and more “Here’s something better to believe.” For instance, if someone encounters a false claim that genetically modified (GM) foods are harmful, bypassing would involve sharing information about how GM crops can help save the bee population or combat world hunger, both positive and truthful narratives unrelated to the original myth (5).

This strategy is grounded in psychological models like expectancy-value theory, which suggests our attitudes form based on a bundle of beliefs, not just one (6). So, rather than wrestling with a falsehood, bypassing shifts the focus to alternative beliefs that nudge people toward healthier attitudes.
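
To make that logic concrete, one common way to write the expectancy-value model (my own simplified rendering, not a formula from the cited study) is:

Attitude ≈ Σ (strength of each belief × how positively that belief is evaluated)

Seen this way, bypassing does not try to push the false belief’s term to zero. It adds new, accurate, positively evaluated beliefs to the sum, so the overall attitude can shift toward accuracy even while the original falsehood goes unchallenged.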

Some recent studies have shown that bypassing is at least as effective as, and sometimes more effective than, directly correcting misinformation (4). Again, it works particularly well when people haven’t yet formed strong attitudes about an issue, allowing the new, accurate information to shape their beliefs from the ground up. Additionally, it avoids the risk of reinforcing the original myth through repetition or triggering defensiveness in people who feel corrected.

Truth Boosting: Making Accurate Information Go Viral

While stopping the spread of misinformation is vital, an equally important challenge is making sure accurate information gets the attention it deserves. That’s where a strategy I refer to as truth boosting comes in.

Rather than waiting to debunk false claims, this approach, based on the findings of recent research, focuses on amplifying high-quality, accurate content so it reaches and resonates with more people. The key to doing this effectively? Make it personal and social. 

A recent study shows that people are far more likely to share news articles when they perceive them as self-relevant (e.g., “This affects me personally”) or socially relevant (e.g., “This could help someone I know”) (7). In experiments, simply prompting readers to reflect on why a news story matters to them or their community significantly increased the chances they would share it, both on social media and in private conversations. 

Think of it like this. Instead of posting raw data about flu vaccines, someone might say, “Getting your flu shot this week could help protect your grandparents, coworkers, or classmates.” That shift, connecting the facts to people we care about, makes the message more compelling and more shareable. 

And it doesn’t just work in one place or on one topic. This strategy has been shown to work across cultures and issues, from climate change to public health! Neuroscience backs it up, too. When people think about how a story relates to themselves or others, brain regions tied to motivation and social reasoning light up (7). That’s the engine behind viral, prosocial sharing.

To put this into practice, we can frame information with relevance in mind, highlighting how a topic connects to people’s lives and communities. When sharing articles, a simple personal comment (“This matters to me because...”) can spark more engagement than the article alone. 

In today’s fast-moving information landscape, this approach can empower us to shape the narrative, not just clean up after it.

The Toolkit for Combating Misinformation: Which Strategy Works When?

1. Correction ("Well, actually…”)

    • Best Used When: 
      • The false information is fresh and not deeply held
      • The audience is open-minded or neutral
      • Clarity and credibility can be quickly established
      • Addressing emotionally negative misinformation
    • Less effective when:
      •  The person is emotionally or ideologically attached to the misinformation
      • It’s shared in a heated or polarized context
      • It’s done late (after the belief has solidified)

2. Debunking (“Here’s why that’s wrong, and what’s right instead.”)

  • Best Used When: 
      • There's time and space to explain
      • You can offer a clear, truthful alternative
      • You want to encourage critical thinking
    • Less effective when:
      •  The false claim is repeated too much in the process (risk of reinforcing it)
      • The audience is already defensive or disengaged
      • You skipped providing an alternative explanation

3. Bypassing(“Let’s change the topic... strategically”)

  • Best Used When: 
      • The audience isn’t emotionally invested yet
      • You want to avoid triggering defensiveness
      • You can steer the conversation toward relevant truths
    • Less effective when:
      • The false belief is already firmly rooted
      • The bypassed info feels disconnected or irrelevant
      • There's pressure to address the myth directly

4. Truth-Boosting (“Let’s make the truth go viral!”)

  • Best Used When: 
      • You want to proactively shape narratives
      • Accurate information is engaging, personal, or timely
      • You’re trying to build awareness or change attitudes broadly
    • Less effective when:
      • The efforts to share high-quality information are not sustained
      • The truth is dry, generic, or impersonal
      • The shared content lacks emotional or social relevance
      • It doesn’t reach the same audience as the misinformation

In today’s complex digital information ecosystem, no single strategy is enough. Correction, debunking, bypassing, and truth-boosting each offer unique strengths, but their power lies in using the right one at the right moment. Consider them not as isolated tactics, but as tools in a flexible, adaptable response kit. 

So the next time you encounter misleading content online, don’t just scroll past. Ask yourself: Should I challenge it? Reroute the conversation? Or elevate a better narrative? 

Whether you’re a student trying to make sense of your news feed, an educator aiming to foster critical thinking and digital literacy, or an active social media user who wants to shape a healthier information space, this toolkit is yours. Using it thoughtfully is not just about defending the truth. It’s about building a more trustworthy digital public sphere.

You can download the full blog post here.

Interested?

Contact the ReSCU Lab here!

resculab@vu.nl 
