Stopping COVID-19 Misinformation Is the Best New Year’s Resolution
As we begin a new year and head back (at least virtually) to work and school, we might be thinking about personal things we would like to improve. Some people resolve to exercise more, stick to a budget or cut out sugar from their diet. Others resolve to write that book, use social media less or volunteer in their communities. These are all great ideas, and I’d like to add another one.
Though we all made our New Year’s resolutions on Jan. 1, I respectfully suggest a January resolution that would, if we each committed to it, produce a large positive impact on society. This year, I resolve — and would like to encourage others to resolve — to stop the spread of misinformation at the individual level.
In 2020, misinformation was revealed to be a major issue impacting elections, world politics and our health during COVID-19. And with the arrival of a new calendar year, the problem has not gone away. In fact, with the COVID-19 vaccine rollout now picking up steam, we need to fight misinformation harder than ever before.
Misinformation and propaganda
Misinformation is not new, and propaganda has been part of political communication since the dawn of politics. While misinformation can be spread by government and corporate public relations officials, celebrities and international bad actors, it is enabled through our own social networks as we like and share information with others.
In other words, we can put a wrench in the works of those who are trying to sow deception and division by stopping a key flow of bad information.
But how can we stop the spread? To ensure we are not inadvertently sharing misinformation, we must first understand what drives us to share it, so that we can identify our own triggers and resist them. None of us shares information we think is false: we share information that seems true to us, and unwittingly spread misinformation in the process.
But wait, you say, I would never spread misinformation. I only spread true information. Unfortunately, we all share information without checking it at least some of the time, which is why false information spreads online so much faster than the truth. Social media platforms are designed to increase our engagement, and as such, they actually nudge us towards sharing without thinking too hard about what we’re spreading.
Sharing information is a social act
People have a wide variety of motivations for sharing information online. My team’s research on COVID-19 social media engagement shows that people will share information they think will help keep themselves and their loved ones safe. This is supported by law professor Tim Caulfield, who writes that our perception of risk is likely to drive engagement with misinformation.
Misinformation is more likely to spread when it’s novel or uniquely interesting. My own team’s ongoing research shows that people are more likely to trust information that they feel to be right, particularly if it’s delivered by people they perceive as experts.
What does this tell us about the individual’s role in sharing misinformation? Put simply, it shows that what causes us to share misinformation is a combination of factors: strong negative emotions like anxiety and perceived risk, social bonds between families, friends and loved ones across online and offline social networks, and feelings of correctness.
People share information they feel to be true because they’re worried and trying to keep loved ones safe. They share information delivered by people they trust — and sometimes those people aren’t actually experts in the field they are opining on.
How to halt misinformation
Understanding our own tendencies for sharing information and misinformation can help us stop the spread.
Knowing what motivates us to share content lets you short-circuit your automatic sharing tendencies and push back against the nudges from social media platforms, preventing the spread of misinformation to your own networks. It’s the same as making any change in your life: identify the triggers and change your behaviour.
This means that when the content makes you feel emotional — particularly if it makes you anxious — stop and think before you click.
If the content is particularly new, novel or strange, stop and think before you click.
If the content is something you want to share right away, because it has the perception of urgency about it (ACT NOW! WARNING!), stop and think before you click.
If the content would be particularly appealing to your social networks, stop and think before you click.
If the content is shared by a celebrity, or someone who is not actually an expert in the subject of the content, stop and think before you click.
And most importantly, if you are sharing content because deep in your heart and soul, you know it to be true; if you are sharing content that “just feels right” — I cannot stress this enough — stop and think before you click.
Moving beyond emotion
I know that when I feel really emotional, I don’t always think clearly. I also know that when I want to share information that appeals to my family and friends, I’m not always thinking about accuracy. So in those moments, I try to be extra careful.
I recommend following the SIFT framework developed at Washington State University, which tells people to stop, investigate the source, find better coverage and trace claims back to their original context. This means thinking like a detective (or an investigative journalist) and gathering evidence for the information you are sharing with others.
Besides following the SIFT framework, when I stop and think before I click, I like to ask critical questions of the content I’m about to share. I ask: “Why do I think this is true?” and “How emotional do I feel about this topic?” I also ask: “Where can I find more information?”, “Who does this information benefit?” and “What might be an alternative viewpoint I haven’t considered?”
I’m not perfect, and I’ll probably still share inaccurate information at times. That’s why for 2021, I resolve to double down on my efforts to stop the spread of misinformation.