What’s Happening with Covid-19 Misinformation?
Lately, there’s been a lot of buzz about the spread of vaccine misinformation on Facebook and
Instagram. Facebook has announced that it has already removed millions of posts containing false
information about Covid-19 and the vaccines on both platforms. The concern is that the spread of
vaccine misinformation is affecting how people respond to the pandemic and the vaccines.
A good chunk of people are not planning to get vaccinated, which is why the misinformation
being spread needs to be brought under control; doing so would help more people get their
jabs. Only after widespread vaccination will greater social mixing of groups be possible, so curbing
the spread of misinformation is the need of the hour. Converting vaccine hesitancy into vaccine
willingness is the health system’s priority. Instagram, a photo-sharing app, has seen
various types of misinformation being spread right from the start of the pandemic. Since then, the
situation has continued to worsen. Just type “vaccine” into Instagram’s search box and you’ll see a huge
number of anti-vaccine accounts. These accounts have content full of misinformation and
conspiracy theories. The situation has now gone out of control: you’ll find accounts that
specialize in collecting reports of people who got very sick or even died after getting their vaccine. In
reality, the authorities could not establish that the vaccine caused the illness or death. Seeing all
this, it comes as no big surprise that people jump to conclusions that support what they already believe.
The United States will be seeing a vaccine confidence drive in the coming days. In response to
this, Facebook has come up with a number of measures to provide factual information.
Instagram has long been regarded as a platform for spreading misinformation about the virus and
the associated vaccines. At the start of the pandemic, the Instagram accounts of some anti-vaccine
influencers, such as Robert Kennedy Jr., saw rapid growth in their follower counts. All of this went
smoothly because it was easy for them to form a connection with those who weren’t sure about the
virus. The thing about misinformation on vaccines and viruses is that just a hint of doubt is
sufficient to prevent people from getting inoculated.
A study published by the misinformation watchdog group Center for Countering Digital Hate (CCDH)
examined how often Instagram’s content suggestion algorithm offered posts containing
misinformation. To test the algorithm, the researchers set up three groups of accounts.
One of the three groups followed anti-vax, QAnon, health influencer, and white supremacy
accounts. The second group followed only accounts that have been certified as trustworthy
information sources. The last group followed a mix of anti-vax, QAnon, influencer, and official health
This is what the researchers found
● Instagram suggested posts and accounts containing misinformation to those test accounts that
had already expressed an interest in anti-vaccine and QAnon content.
● Instagram did not suggest misinformation to the accounts that followed only official sources of
information.
● If an Instagram account followed a health influencer related in one way or another to the
anti-vax movement, the algorithm might suggest posts from more hard-core anti-vax accounts.
● If the algorithm found that a user showed interest in anti-vax content, it might suggest other types
of radicalized content as well, such as posts containing anti-Semitic conspiracy theories.
What has Facebook done so far?
Since the start of the misinformation spread, Facebook has been removing misleading information
about the vaccines on all its platforms. Robert Kennedy Jr., an anti-vax influencer (opposed to
vaccine deployment), has been removed from Instagram but not yet from Facebook, as he has
exceeded the limit on Instagram but not on Facebook so far. Just a month back,