Facebook, YouTube and other sites are changing their policies on posts that contain misleading information about medical treatments.
For centuries people have sold various types of snake oil as a panacea for all kinds of medical conditions. Those so-called cures in a bottle made dubious claims about their efficacy. Today, snake oil treatments are still sold, but the marketing of these products has gone viral.
In films and literature, snake oil salesmen worked from a cart as they traveled from town to town, selling their products and then fleeing with the cash before people realized the so-called medicine didn’t work. Today, those salesmen use social media to promote treatments that are often billed as holistic and promise extraordinary results in a wide range of disease states, including cancer.
This morning, the Wall Street Journal pointed to the numerous posts circulating on social media sites like Twitter, YouTube and Facebook that contain “scientifically dubious and potentially harmful information” about alternative treatments for diseases and conditions. The owners of the platforms have become concerned about the claims made in these posts, particularly because they are not backed up by hard scientific data. Many of the claims made in the online posts are false or are based on inflated results from early laboratory studies. The Journal found that these posts are being viewed millions of times by users across the platforms, which has prompted a crackdown on such content.
On Tuesday, Facebook issued a statement regarding the importance of minimizing health content that is “sensational or misleading.” The company made changes to its content management to reduce the reach of posts that make exaggerated or sensational health claims. Part of the problem is that people see these posts touting misleading remedies and either forgo the advice of their doctors or confront their doctors with inaccurate data.
“People come together on Facebook to talk about, advocate for, and connect around things like nutrition, fitness and health issues. But in order to help people get accurate health information and the support they need, it’s imperative that we minimize health content that is sensational or misleading,” Facebook said in its blog post.
Earlier this year, Facebook also tackled misinformation regarding vaccines and their efficacy. The social media giant said that when it finds ads containing misinformation about vaccinations, it will reject them. Such posts have, in part, contributed to a growing mistrust of vaccines across the globe as unfounded claims about safety spread like wildfire online.
The Journal noted that YouTube, which is owned by Alphabet (Google), has guidelines that do not allow videos that can cause harm, including videos with misleading medical information. YouTube said it removed 8.3 million videos making such claims over the course of the first quarter of 2019.