In Internet discussion forums in the early 90s, Godwin saw debates frequently getting out of hand. Potentially fruitful conversations about difficult topics would escalate until, predictably, someone made a comparison to the Nazis. Claiming the other side’s ideas were “Nazi-like”, or that they were “just like Hitler”, ruined any hope of productive dialogue. So he set about engineering a counter-meme. Godwin’s law stated: “as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one”. He seeded this law in discussions to show participants they were “acting as vectors to a particularly silly and offensive meme”. It worked: others started repeating and mutating the meme, and making Nazi comparisons now does more to weaken your own position than that of your intended target. The consequences of this memetic experiment are still being felt today, in ways Godwin could never have expected. Putin invoking “denazification” as his justification for invading Ukraine utterly failed to land with Western audiences, who grew up debunking lame attempts to label people as Nazis on the internet. In effect this is a form of banner blindness – the phenomenon where people automatically ignore advertisements online. Calling someone a Nazi in an attempt to discredit them is now completely inert as a strategy; instead it discredits the accuser and makes people question their motives.
Can we engineer counter-memes for other ‘mind viruses’? In studying the spread of misinformation, researchers have indeed found methods that echo Godwin’s law to be an effective way to inoculate people against fake news. In medical vaccines, a virus is weakened so it doesn’t make you sick, but still triggers antibodies that fight future infections. Meme vaccines work the same way: forewarning people that others are intentionally trying to mislead them on an issue, and presenting facts and arguments that refute the misinformation, has been shown to diminish the effects of harmful memes. Prevention is better than cure: once someone is exposed, debunking has little effect, and fact-checking spreads more slowly on social media than misinformation does. The pandemic playbook translates to memes: quarantine those spreading the virus (social media bans), protect the most vulnerable (public education), and treat the infected (mental health support). As Chris Wylie, the Cambridge Analytica whistleblower, prescribes in his book: “To make a population more resilient to extremism, for example, you would first identify which people are susceptible to weaponized messaging, determine the traits that make them vulnerable to the contagion narrative, and then target them with an inoculating counter-narrative in an effort to change their behaviour”.
This same process can be run in reverse to amplify disinformation, or abused to shut down real information that those in power don’t want to get out. At least one of the disinformation studies conducted during the pandemic listed the “lab leak hypothesis” – that COVID-19 escaped from a lab – as disinformation it successfully ‘vaccinated’ people against. It was classified as such because Peter Daszak, President of EcoHealth Alliance, had labelled it a ‘conspiracy theory’ in The Lancet, a respected medical journal. It later came to light that his organization had given the Wuhan Institute of Virology $600,000 to conduct “gain of function” research on coronaviruses. That is not evidence of a cover-up, but it is enough of a conflict of interest that perhaps we were wrong to inoculate against the hypothesis.
Analysis: Putin’s claim that war on Ukraine is to target Nazis is absurd. Here’s why
How To Inoculate Yourself Against A Weird Mind Virus
I Am Legend (2007) Will Smith: Robert Neville
If It Doesn't Spread, It's Dead (Part One): Media Viruses and Memes
Information Economics - The Market for Lemons
Inoculating Against Fake News About COVID-19
Mindf*ck: Inside Cambridge Analytica’s Plot to Break the World
Naming Is An Act of Creation
Snow Crash – Virus, Drug, or Religion?
Vaccinating against viruses of the mind