In this project, I explore what it means to be a debunker and how we can effectively debunk conspiracy theories.
“Misinformation can be sticky—even when it seems to have been corrected.” (The Debunking Handbook 2020, p. 2)

This warning from The Debunking Handbook (2020) captures the central challenge explored across the research: why false beliefs endure even after correction, and how communicators can most effectively counter them. The Handbook argues that misinformation persists because repetition and familiarity make falsehoods feel true, and that the best debunking replaces those falsehoods with clear, factual, and emotionally resonant alternatives.
This claim is strongly supported by Schmid and Betsch’s (2019) experiments A, which show that confronting science denial directly—whether through factual evidence or by exposing rhetorical manipulation—reduces belief in misinformation. They found that
“Providing the facts about the topic or uncovering the rhetorical techniques typical for denialism had positive effects… [and] we found no evidence that rebutting science denialism backfires.” (p. 931)

The Handbook mirrors this finding, insisting that silence only strengthens false narratives:
“Do not refrain from attempting to debunk… out of fear that doing so will backfire.” (The Debunking Handbook 2020, p. 4)

Together, they confirm that effective debunking depends not on avoidance but on clarity—naming the myth, explaining its fallacy, and reinforcing accurate information.
Yet evidence shows that simply giving people facts is not enough. As Sinatra and Lombardi (2020) B explain, in a post-truth era “we’re not living through a crisis about what is true, we’re living through a crisis about how we know whether something is true” (p. 121). They call for teaching audiences to evaluate the plausibility of scientific claims, not just the credibility of sources—a recommendation that parallels the Handbook’s instruction to
“Encourage the audience to evaluate the plausibility of the misinformation in light of alternatives.” (The Debunking Handbook 2020, p. 5)

This shared focus on plausibility highlights a deeper truth: correcting misinformation means guiding people through the reasoning process, not merely presenting them with facts.
Krishna’s (2018) study C deepens this insight by exploring how misinformation becomes emotionally and cognitively entrenched. Using the Situational Theory of Problem Solving, she found that “acceptance of scientifically inaccurate data about vaccines… lowers institutional trust and influences individuals’ active communication behaviors” (p. 1089). The Handbook identifies this same mechanism, warning that “perceived trustworthiness of a debunking source may matter more than its perceived expertise” (p. 6). Both stress that debunkers must not only supply truth but also rebuild trust and empathy—showing that misinformation thrives when audiences feel alienated from institutions.
Van Duyn’s (2018) ethnographic work D on hidden online communities of dissent underscores why trust and openness matter. She reveals how fear of judgment can push people with minority opinions to express themselves only in secret, writing that “rural residents may organize and express their opinions in secret” (p. 965). The Handbook cautions against this “spiral of silence,” in which the majority’s reluctance to correct misinformation allows a small but vocal minority to dominate the conversation (p. 7). Both Van Duyn’s findings and the Handbook’s advice point to the same conclusion: visible, credible, and compassionate communicators are essential to countering false narratives.
Taken together, these studies show that effective debunking is not merely an act of correction—it is a form of relationship-building. Facts must be framed within empathy, explanation, and trust. As the Handbook concludes,
“Successful debunking can affect behaviour—for example, it can reduce people’s willingness to share misleading content online.” (p. 8)

Evidence from all four papers confirms that when we teach plausibility, expose manipulation, and speak publicly with compassion, we make it harder for misinformation to take root and easier for truth to spread.
Analysis of the chosen debunker, Abbey Sharp, focusing on her “slow-melting ice cream is toxic” post.
Instagram, @abbeyskitchen, July 2, 2025
Abbey Sharp, a registered dietitian, uses Instagram Reels to debunk conspiracy theories about food and dieting. One video that caught my eye was her debunking of the claim that slow-melting ice cream is toxic. Her video follows a myth–question–evidence-based discussion–fact format, which differs slightly from the Debunking Handbook’s fact–myth–fallacy–fact structure.
Myth – Question – Evidence-Based Discussion – Fact
Abbey begins by showing a clip of the myth in the bottom corner of the screen, while she fills the rest of the screen with her face and reactions to the clip. While the myth clip plays, her expression shifts to incredulity as she eats the very Drumstick ice cream cone the conspiracy theory is bashing. When the clip finishes, she makes a quick joke about more Drumsticks for her before posing the question, “Should you be scared of ice cream that doesn’t melt?” She then explains how ice cream is made and the types of ingredients manufacturers use to keep their products fluffy and emulsified. She goes on to explain that these additives are not harmful to our bodies and are used in Drumsticks to help the ice cream retain its shape, which leads to a longer melt time. After walking through the evidence, Abbey ends with the fact that how quickly ice cream melts says nothing about how healthy it is.
I generally enjoyed Abbey’s debunking structure. Beginning with the conspiracy theory clip in the corner of the video was particularly clever. I like that she shows viewers the myth but does not let them watch the clip on its own; instead, viewers watch the myth while also seeing Abbey’s facial expressions of disbelief. This helps ensure that viewers who only watch the debunking video do not fall for the myth. Abbey’s in-depth discussion of how ice cream is made is also effective in disproving the conspiracy theory. I like that she provides hard evidence and further emphasizes her points by flashing diagrams, ingredient lists, and photos, which adds to her credibility. Finally, she closes well by tying the evidence together and presenting the final debunking fact clearly.
My only critique of Abbey’s debunking video is that the fact is not reiterated early on. While she implies throughout the video that slow-melting ice cream is not a sign of toxicity or unhealthiness, she does not explicitly state this until the end. I think the video could have been improved by opening with the fact, or by stating it immediately after the conspiracy clip, so that viewers who only watch a few seconds would still hear the fact said out loud. That being said, I think Abbey does a great job of debunking the slow-melting toxic ice cream myth through a snappy, friendly video that engages viewers.
To become a debunker, I chose to emulate the myth-then-correction style of Instagram debunking Reels in an interactive format. The tool below directly addresses the **Illusory Truth Effect** by presenting **10 Common Pharmacy Myths** as True/False questions and immediately correcting each one with the scientific truth and a pharmacist tip.
One of the most effective ways to counter the Illusory Truth Effect is to reinforce the truth immediately after presenting the myth.
**Instructions:** Read the statement below. Is the common belief a **TRUE FACT** or a **FALSE MYTH**?
You scored ___ out of 10.
Thank you for testing your pharmacy knowledge! Correcting misinformation helps everyone stay safe.
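For readers curious how the quiz works behind the scenes, here is a minimal sketch of the question–feedback–score logic in TypeScript. The sample item, field names, and function names are illustrative assumptions rather than the tool's actual code; the point is that the corrective fact and pharmacist tip are returned in the same step as the answer, so the truth is reinforced immediately after the myth.

```typescript
// Minimal sketch (illustrative, not the tool's actual code) of one quiz item:
// the common belief, whether it is true, and the correction plus pharmacist
// tip that are shown immediately after the user answers.
interface MythItem {
  statement: string;      // the common belief shown to the user
  isTrueFact: boolean;    // true = TRUE FACT, false = FALSE MYTH
  correction: string;     // the scientific truth, restated right away
  pharmacistTip: string;  // practical follow-up advice
}

// Placeholder example item; the real quiz contains 10 of these.
const items: MythItem[] = [
  {
    statement: "Antibiotics cure the common cold.",
    isTrueFact: false,
    correction: "Colds are caused by viruses, so antibiotics do not help them.",
    pharmacistTip: "Ask your pharmacist about symptom relief instead.",
  },
];

// Check one answer and build the feedback string that is displayed
// immediately, so the truth follows the myth without delay.
function answerQuestion(item: MythItem, userSaysTrue: boolean) {
  const correct = userSaysTrue === item.isTrueFact;
  const verdict = item.isTrueFact ? "TRUE FACT" : "FALSE MYTH";
  const feedback =
    `${correct ? "Correct!" : "Not quite."} This is a ${verdict}. ` +
    `${item.correction} Pharmacist tip: ${item.pharmacistTip}`;
  return { correct, feedback };
}

// Tally the final score out of the number of questions answered.
function finalScore(results: boolean[]): string {
  const right = results.filter(Boolean).length;
  return `You scored ${right} out of ${results.length}.`;
}

// Example run: the user answers "FALSE MYTH" to the first item.
const result = answerQuestion(items[0], false);
console.log(result.feedback);
console.log(finalScore([result.correct]));
```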