Lies Infect Us All: The Plague of Misinformation
Because technology is so widely available and accessible in the modern age, people can easily create and interact with others' content. As a result, online communities have formed around all kinds of topics, and health and wellness is no exception.
There are a few benefits of having health information easily available online.
- Medical professionals can inform the public on pertinent health topics.
- People can talk about their lived experiences with certain conditions or illnesses.
- People can offer advice on how to treat certain injuries.
BUT
But a great deal of incorrect information plagues these communities. Acting on improper health information can cause further harm and even death.
Misinformation vs. Disinformation
Misinformation is false or inaccurate information shared without the intent to deceive; disinformation is false information spread deliberately to mislead.
So Why Do We Fall For It?
There are many reasons why we fall for misinformation, including...
- General trust towards the person or institution spreading the misinformation. (Baker, 2022)
- The information given aligns with the viewpoint or framework a person has. (Polletta & Callahan, 2017)
- People read only the headline and not the full article, leading them to draw faulty conclusions. (Why We Fall for Fake News)
- If a piece of online content has more likes, a person is more likely to believe it's true. This phenomenon is prevalent in online echo chambers. (Why We Fall for Fake News)
- Repeated exposure to misinformation makes it feel true. (Why We Fall for Fake News)
How Misinformation Spreads in Health And Wellness Communities
The Influencer
Social media has made it easier for ordinary people to build platforms and share their messages. On these platforms, users decide whose voice is worth listening to, democratizing the process. However, certain influencers can use their social power to spread misinformation.
Common Methods
- Creating a para-social relationship with their audience through relatability and glimpses into their personal lives.
- Sowing distrust toward powerful institutions.
- "Exposing the truth," often by citing anecdotal evidence or misrepresenting studies.
- Claiming martyrdom when they are censored or removed from a platform.
(Baker, 2022; Chou et al., 2018)
Algorithms
Platforms like YouTube, Facebook, and TikTok use algorithms to either cater content to a specific user or predict what they want. Social media platforms especially use engagement (likes, comments, shares) to determine what types of content to recommend. This system leads to echo chambers, where misinformation can be reinforced.
Even "neutral" search engines aren't immune to this bias. Google can give certain bad actors neutral-sounding subtitles; for example, Milo Yiannopoulos is described as a "British Commentator." These subtitles hide or downplay such figures' actions and can introduce people to dangerous misinformation.
(Tang et al., 2021; Al-Rawi et al., 2022; Swire-Thompson & Lazer, 2020)
Ways to Combat Misinformation
CRAAP Test
- Currency: Was the information published or updated recently? Does this particular topic require constant updates?
- Relevance: Does the information given satisfy your needs? Who is the intended audience, and is the information presented at an appropriate reading level?
- Authority: Who is the author? What are their credentials or affiliations? Is there a way to contact that person or institution?
- Accuracy: Where does the information come from? Are the claims backed by evidence? Can other sources verify the information? Is the data presented in an unbiased and professional manner?
- Purpose: What is the intent behind sharing this information? Is the author trying to inform, entertain, or persuade the audience? Are the author's intentions clear? What biases might be at play?
While the CRAAP test is an effective way to judge whether a source is trustworthy, it is an individual solution to a systemic problem. As stated before, people are unaware of their own biases and might be distrustful of certain sources. How do you persuade someone that a medical professional is trustworthy if they are convinced doctors work for big pharmaceutical companies to sell them fraudulent medications?
(Online Research: CRAAP Test)
Actions on Social Media
Doctors and other medical professionals have started their own social media profiles to ensure that accurate medical knowledge is shared. By being on social media, medical professionals can better understand how online health discourse operates and devise effective ways to spread their message.
Some social media companies have policies under which posts or accounts caught spreading dangerous information are removed. Some platforms have taken further steps, such as Instagram banning certain hashtags or Twitter adding community notes to viral posts.
Several users have taken it upon themselves to combat misinformation. These people infiltrate hashtags and keywords commonly used by conspiracy theorists to debunk or mock the misinformation spread in those communities.
(Lundy, 2023; Tang et al., 2021; Swire-Thompson & Lazer, 2020)
Project
For my physical manifestation of misinformation, I created a box into which a person can stick their hand and grab a slip of paper with a health fact on it. The catch is that some of these facts are true and others are false.
I made a box to show how many algorithms are a "black box": people know a program's inputs and outputs but not its internal recommendation process. I made the box blue because that color is usually associated with health. I added flashy pipe cleaners and colored backgrounds behind the hashtags to show how appealing this type of content can be. The hashtags were taken from a study that monitored COVID-19 and vaccine misinformation on TikTok.
I deliberately made the statements hard to identify as true or false, showing how easy it is to come across misinformation and letting people draw their own conclusions about whether to believe them.
Sources
- Al-Rawi, A., Celestini, C., Stewart, N., & Worku, N. (2022). How Google Autocomplete algorithms about conspiracy theorists mislead the public. M/C Journal, 25(1). https://doi.org/10.5204/mcj.2852
- Baker, S. A. (2022). Alt. Health Influencers: how wellness culture and web culture have been weaponised to promote conspiracy theories and far-right extremism during the COVID-19 pandemic. European Journal of Cultural Studies, 25(1), 3-24. https://doi.org/10.1177/13675494211062623
- Chou, W.-Y. S., Oh, A., & Klein, W. M. P. (2018). Addressing Health-Related Misinformation on Social Media. JAMA, 320(23), 2417-2418. https://doi.org/10.1001/jama.2018.16865
- Lundy, M. (2023). TikTok and COVID-19 vaccine misinformation: New avenues for misinformation spread, popular infodemic topics, and dangerous logical fallacies. International Journal of Communication, 17, 3364+. https://link.gale.com/apps/doc/A754393000/LitRC?u=unc_main&sid=summon&xid=a2343539
- Online Research: CRAAP Test. https://libguides.cmich.edu/web_research/craap
- Polletta, F., & Callahan, J. (2017). Deep stories, nostalgia narratives, and fake news: Storytelling in the Trump era. American Journal of Cultural Sociology, 5(3), 392-408. https://doi.org/10.1057/s41290-017-0037-7
- Swire-Thompson, B., & Lazer, D. (2020). Public Health and Online Misinformation: Challenges and Recommendations. Annual Review of Public Health, 41(1), 433-451. https://doi.org/10.1146/annurev-publhealth-040119-094127
- Tang, L., Fujimoto, K., Muhammad, A., Cunningham, R., Costantini, R. A., York, F., Xiong, G., Boom, J. A., & Cui, T. (2021). "Down the rabbit hole" of vaccine misinformation on YouTube: Network exposure study. Journal of Medical Internet Research, 23(1). https://doi.org/10.2196/23262
- Why we fall for fake news. University of California, Santa Barbara. Retrieved October 13, 2023, from https://cits.ucsb.edu/fake-news/why-we-fall