One in 3 popular cancer articles on social media contains potentially harmful misinformation, according to a report published in the Journal of the National Cancer Institute.
Data indicate that one-third of viral cancer articles shared on social media platforms such as Facebook from 2018 to 2019 contained potentially harmful misinformation, the impact of which on scientific belief, trust, and decision-making needs to be further assessed and understood.
Investigators assessed 200 articles, consisting of the 50 most popular articles on social media for each of 4 of the most common types of cancer—breast, prostate, colorectal, and lung cancer—and found that, among the articles containing misinformation, 76.9% also contained harmful information. Articles containing misinformation drew higher median engagement than articles that were considered factual (median [IQR], 2300 [1200-4700] vs 1500 [810-4700]; P = .007).
“These data show that cancer information on social media is often inconsistent with expert opinion. This leaves patients in the confusing and uncomfortable position of determining the veracity of online information themselves or by talking to their physician. Most concerning, among the most popular articles on Facebook, articles containing misinformation and harmful information received statistically significantly more online engagement. This could result in a perpetuation of harmful misinformation, particularly within information silos curated for individuals susceptible to this influence,” the study’s authors wrote.
The internet, and social media in particular, is rife with health care misinformation, where false information has the potential to go viral. This presents a threat to public health: it hinders evidence-based medicine, harms relationships between patients and experts in the field, and increases patients' risk of death. To prevent further harm, this obstacle to patient care needs to be addressed, the authors emphasized.
Investigators launched the study because data on the quality of cancer treatment information currently available across social media, and on how it might negatively affect patients, are scant. As such, the report authors set out to examine the quality of cancer treatment information on social media and assess its potential for harm.
Web-scraping software was used to pull popular articles containing keywords on the most common cancer types listed above, including news articles and blogs posted to Facebook, Reddit, Twitter, or Pinterest from January 2018 to December 2019. Engagement was defined as upvotes (Twitter and Pinterest), comments (Reddit and Facebook), and reactions and shares (Facebook).
In total, 2 panel members from the National Comprehensive Cancer Network were selected to act as content experts for each of the aforementioned tumor sites. The experts reviewed the claims made in each article and rated its factuality and credibility. Misinformation was defined as a summary score of 6 or more, representing information that was a mixture of true and false, mostly false, or false. Additionally, information with the potential to cause harm was rated as either probably harmful or certainly harmful.
The articles came from a variety of sources: traditional news, including online print articles and/or broadcast media (37.5%; n = 75); non-traditional, digital-only news (41.5%; n = 83); personal blogs (1%; n = 2); crowdfunding websites (3%; n = 6); and medical journals (17%; n = 34).
Expert review indicated that 32.5% (n = 65) of the articles contained misinformation, most commonly in the form of misleading claims (28.8%), mischaracterization of the strength of evidence (27.7%), and promotion of unproven therapies (26.7%). Additionally, 30.5% of the articles contained harmful information, which could lead to harmful inaction (31.0%), economic harm (27.7%), harmful action (17.0%), or harmful interactions (16.2%).
The median number of engagements per article was 1900 (IQR, 941-4700), and the majority of engagements came from Facebook (96.7%). Engagement on Reddit and Twitter was statistically significantly associated with misinformation and harm (P < .05), whereas engagement on Pinterest was not (P > .63).
“Further research is needed to address who is engaging with cancer misinformation, its impact on scientific belief, trust, and decision-making, and the role of physician-patient communication in correcting misinformation. These findings could help lay the groundwork for future patient specific tools and behavioral interventions to counter online cancer misinformation,” the authors concluded.
Johnson SB, Parsons M, Dorff T, et al. Cancer misinformation and harmful information on Facebook and other social media: a brief report. J Natl Cancer Inst. 2021;djab141. doi:10.1093/jnci/djab141