

YouTube's medical policies are reportedly taking a new direction. The Google-owned video platform is expanding its guidelines to ban content containing vaccine misinformation, a broader step building on policies that previously helped remove over a million videos spreading dangerous COVID-19 misinformation. Under the latest update, the company says it will now take down content that spreads misinformation about vaccine safety, vaccine ingredients, and vaccine efficacy.
The platform had previously banned misinformation specific to COVID-19 vaccines. The new updates strengthen that policy and broaden it to also cover misinformation about routine immunizations, such as those for measles and hepatitis B, as well as general false claims about vaccines that are certified safe by local health authorities and the World Health Organization (WHO).
The move comes amid concerns over the slow rate of COVID-19 vaccination in the U.S. Compared with countries like the United Kingdom and Canada, where 67% and 71% of people respectively are fully vaccinated, the United States sits much lower at 55%. President Biden has pointed to social media platforms as channels through which vaccine misinformation spreads, and Washington has enlisted rising pop star Olivia Rodrigo to help encourage Americans to get vaccinated.
Prior to this development, other social media giants had expanded their respective policies. Facebook broadened the criteria it uses to block false vaccine claims back in February. In the same vein, Twitter doubled down by labeling potentially misleading tweets, using a combination of AI and human review to curb the spread of COVID-19 misinformation.
Under YouTube's expanded rules, content that violates the new guidelines includes videos claiming that coronavirus vaccines cause chronic side effects such as diabetes or cancer, videos claiming that vaccines contain devices to track those who are inoculated, and videos asserting that vaccines are part of a depopulation agenda. If a user posts content that violates these guidelines, YouTube will remove it and tell the uploader why their video was taken down. Penalties differ, however: a first-time violator of the community guidelines will likely receive a warning without penalty, while a repeat violator will receive a strike, and a channel that gets three strikes within 90 days is terminated.
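To make that enforcement flow concrete, here is a minimal sketch in Python of how a warning-then-strikes policy like the one described could be modeled. The names (Channel, handle_violation), constants, and data structures are hypothetical illustrations and not YouTube's actual systems; only the warning, strike, and three-strikes-in-90-days logic is taken from the reported policy.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical constants mirroring the reported policy:
# three strikes within a 90-day window terminates a channel.
STRIKE_LIMIT = 3
STRIKE_WINDOW = timedelta(days=90)

@dataclass
class Channel:
    name: str
    warned: bool = False                          # first violation yields a warning only
    strikes: list = field(default_factory=list)   # timestamps of issued strikes
    terminated: bool = False

def handle_violation(channel: Channel, now: datetime) -> str:
    """Apply the reported enforcement flow to one guideline violation."""
    if channel.terminated:
        return "channel already terminated"
    # First-time violators reportedly get a warning with no penalty.
    if not channel.warned:
        channel.warned = True
        return "warning issued (no penalty)"
    # Subsequent violations earn a strike; only strikes inside the
    # rolling 90-day window count toward termination.
    channel.strikes.append(now)
    active = [t for t in channel.strikes if now - t <= STRIKE_WINDOW]
    if len(active) >= STRIKE_LIMIT:
        channel.terminated = True
        return "third strike in 90 days: channel terminated"
    return f"strike issued ({len(active)} active)"
```

Fed a stream of violations, this sketch would issue one warning, then strikes, and terminate the channel on the third strike within any 90-day window, mirroring the penalties described above.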
In a blog post, YouTube stated that "There are important exceptions to our new guidelines," adding:
"Given the importance of public discussion and debate to the scientific process, we will continue to allow content about vaccine policies, new vaccine trials and historical vaccine successes or failures on YouTube."
The company also noted that it will take time to "fully ramp up enforcement."