Meta’s Threads Blocking Covid-Related Content as Cases Rise



Last week, Threads rolled out its search function, part of the platform’s expansion that makes it more like X. 

When users tried to search for “covid” or “long covid,” a blank screen appeared with no results, followed by a pop-up message linking to the CDC website. 

In a statement to the Washington Post, Meta confirmed that Threads is indeed blocking those terms, though it did not provide a full list of blocked terms. The blocks are not limited to COVID-related queries: “nude,” “porn,” and “sex” are also among the terms Threads blocks. 

Meta has good reason to apply this rule on Threads. Instagram is known as a vector for misinformation. When the pandemic started, people opposed to vaccines used Instagram to promote their conspiracy theories; in fact, “vaccine” and “5g” were among the top queries on the platform. 

Aggressive Approach

Blocking every search that contains a sensitive keyword means Meta is suppressing even queries that lead to content that does not break the rules. It is an aggressive approach indeed. 
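To illustrate why this kind of blanket keyword blocking over-suppresses, here is a minimal, hypothetical sketch in Python. It is not Meta’s actual code; the keyword list and function names are assumptions chosen for illustration. Any query containing a blocked keyword returns zero results, even if the query itself is benign.

```python
# Hypothetical sketch of blanket keyword-based search blocking.
# NOT Meta's actual implementation; the blocklist below is assumed
# from terms reported in coverage of Threads.
BLOCKED_KEYWORDS = {"covid", "long covid", "vaccine", "nude", "porn", "sex"}

def is_blocked(query: str) -> bool:
    """Return True if the query contains any blocked keyword."""
    q = query.lower()
    return any(keyword in q for keyword in BLOCKED_KEYWORDS)

def search(query: str, posts: list[str]) -> list[str]:
    """Blanket blocking: a blocked query returns nothing at all,
    regardless of whether the matching posts break any rules."""
    if is_blocked(query):
        return []  # the "blank screen" behavior users reported
    q = query.lower()
    return [post for post in posts if q in post.lower()]
```

Note that a harmless query like “long covid support groups” is blocked outright, which is exactly the over-reach critics point to: the filter looks only at the query string, never at the content of the results.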

The Post also points out that users cannot find information, conversations, or resources that do not break the rules. That is a real barrier for anyone seeking advice or credible details from experts. 

Social media platforms have implemented policies to block or restrict certain terms related to vaccines or COVID-19 for several reasons. Facebook, Instagram, X, and other major social media platforms have become hotbeds for the spread of misinformation and disinformation about vaccines and COVID-19. 

False claims and conspiracy theories can spread rapidly, leading to confusion, fear, and potentially harmful actions, such as people refusing vaccines or treatments based on false information. 

Misinformation spread through social media can have serious public health consequences, especially now that COVID cases are rising again. 

Inaccurate information about vaccines or COVID-19 can discourage people from getting vaccinated, which can lead to lower vaccination rates and increased vulnerability to the virus, potentially causing outbreaks. 

Some content shared on social media may promote unproven or unsafe treatments for COVID-19, putting people’s health at risk. Blocking or restricting such terms can prevent the dissemination of potentially harmful information. 

Responsibility of Social Media Platforms

Social media platforms have a responsibility to maintain the quality of information on their platforms. By blocking or flagging certain terms, they can better control the information and reduce the prevalence of false or misleading content. 

In some cases, governments and health authorities have mandated that social media platforms take action to curb the spread of vaccine-related misinformation. Platforms may block or restrict specific terms to comply with these regulations. 

By blocking certain terms, Threads can help steer users toward accurate and up-to-date information about vaccines and COVID-19. Even so, the approach is heavy-handed. 

It is not a perfect solution. Striking a balance between preventing the spread of harmful information and preserving freedom of speech is a challenging task for Threads and other social media platforms.



Author: Jane Danes

Jane has a lifelong passion for writing. As a blogger, she loves covering breaking technology news and top headlines about gadgets, content marketing, online entrepreneurship, and all things social media. She also has a slight addiction to pizza and coffee.
