LittleLaw Looks at… Misinformation and the Covid-19 Vaccine
January 16, 2021
8 min read
What’s going on here?
Since the Covid-19 outbreak began in December 2019, information about the virus has spread online at an unprecedented rate. However, not all of it has been reliable. The WHO believes we are in the middle of an “infodemic”, which it defines as “an abundance of information, some accurate and some not, occurring during an epidemic”.1 With such a vast volume of information available, it is hard to distinguish facts from outright misinformation. There are now widespread concerns that the prevalence of misinformation will dissuade people from receiving the recently approved Pfizer vaccine. The government is working with social media sites and considering fast-tracking digital media laws to clamp down on harmful misinformation.
The impact of Covid-19 and vaccination conspiracies
In May, the BBC reported that “social media has been a fertile ground for [Covid-19] conspiracy theories”.2 Suggestions that 5G was the cause of the virus resulted in the vandalisation of 70 phone masts in the UK. However, baseless Covid-19 rumours have had much more dangerous consequences. It has been reported that in Iran 796 people died after drinking lethal amounts of alcohol, which was rumoured to kill the virus.3 The consumption of alcohol is illegal in the country, illustrating the extreme lengths citizens were willing to go to in following these rumours. A representative from Iran’s Legal Medical Organisation directly blamed online misinformation for the deaths; he believed they were a result of “fake news on social media”.4 These rumours have led some to take irrational and tragic actions to protect themselves against the virus.
Researchers at the Public Health Post have found a “direct correlation between exposure to poor information quality and poor health outcomes”.5 Instead of looking at the content and integrity of online information, users are more likely to trust information with a greater number of likes. The Public Health Post study also found that people “with lower levels of e-health literacy” were unable to distinguish between fact and misinformation.6 Elderly and vulnerable people, who are at most risk from the virus, are likely to be the most susceptible to believing false information. The WHO is aware of the grave impact of misinformation and has labelled the issue an “infodemic”. It has warned that “fake news spreads faster and more easily than the [Covid-19] virus”.7 It is clear that, much like the virus itself, conspiracy theories surrounding the virus have the traction to spread quickly, with deadly consequences.
These conspiracy theories have now moved beyond the origins of the virus and potential cures; there have been unproven rumours surrounding the side effects of the vaccine and its safety. The hashtag “vaccines are dangerous” has been viewed 800,000 times on TikTok and is likely to influence the site’s young users, who may not source news from other, more reputable outlets.8 Vaccine hesitancy, or the reluctance of the wider public to accept vaccination, was one of the WHO’s top 10 threats to global health in 2019.9 Hesitancy is only likely to increase in light of the recent growth of anti-vaccination movements on social media.
Research has shown there is a real chance this misinformation may reduce vaccine uptake. A Cambridge University study has linked conspiracy theories to a decline in vaccine uptake, finding that “a small increase in the perceived reliability of conspiracies [related] to a larger drop in the intention to get vaccinated”.10 The study discovered that 22-23% of participants believed “Covid-19 was engineered in a Wuhan Lab”. Dr Sander van der Linden, who ran the study, stated that those who believed such theories are much more likely to be reluctant to receive the vaccine. Furthermore, The Vaccine Confidence Project conducted a study of 8,000 people in the US and UK in which only 54% of participants said they would “definitely” receive the Covid-19 vaccine.11 The link between conspiracy theories and vaccine reluctance cannot be ignored. It is clear the vaccine is our primary chance at preventing further devastation to both the economy and citizens’ health. However, for the vaccine to be effective and herd immunity to be developed, 55% of the population needs to be vaccinated.12 Misinformation and conspiracy theories must be tackled in order to restore confidence in vaccination.
How have social media sites responded?
Disproving any misinformation which might dissuade the public from receiving the vaccine is key to ending the pandemic; this cannot be achieved without cooperation from the platforms on which rumours originate. The UK Government has now partnered with social media giants, including Facebook and Twitter, to tackle the prevalence of unreliable news and online anti-vaccination propaganda. On 8 November 2020, the sites agreed to a “package of measures” with the government which will ensure:
– “No user or company should directly profit from Covid-19 vaccine mis/disinformation
– A timely response to mis/disinformation content flagged to them by the government
– [Social media sites will] work with public health bodies to ensure that authoritative messages about vaccine safety reach as many people as possible”13
This agreement certainly seems like a positive step towards social media platforms taking accountability for the information shared across their sites and the persuasive power that influential posts can have. However, these developments have not gone without criticism. Despite Facebook removing over 12 million pieces of Covid-19 misinformation between March and October, it has stated it will not be able to implement the new procedures immediately and there may be some delay.14 It could be argued that enough people have already been strongly persuaded by vaccine conspiracies. The monitoring platform VineSight recorded a 200% increase in anti-vaccination content published on Twitter between March and October.15 Although the new government agreement requires sites to actively promote the health benefits of vaccination, it may be too late for the millions of people who have already been exposed to misinformation. Some Labour MPs believe the government should be doing more and are looking to legislators to tackle the issue.
What does the law say? Can the Online Harms Bill help?
There have been calls by Labour MPs, such as Jo Stevens, for emergency laws to be passed to hold social media outlets legally accountable if they do not regulate harmful misinformation. In November, Labour called for “financial and criminal penalties for social media firms that do not remove false scare stories about vaccination”.16 These suggestions were not acknowledged by the government, perhaps suggesting it feels its recently announced measures will suffice.
Although the government has not drafted emergency legislation in response to Covid-19 conspiracies, the Online Harms Bill demonstrates its awareness of the wider need to tackle online misinformation that has the capacity to impact public safety and security. The Bill was first announced in 2019 and will allow the government to hold social media firms responsible for any harm caused by information published on their platforms. The Online Harms White Paper sets out the aim of the proposed legislation, which is to “establish a statutory duty of care to make companies more responsible for the safety of their users”.17 The new laws were originally proposed to tackle the spread of terrorism and child exploitation online. However, clauses such as section 1.23 have been added to cover “inaccurate information” such as “the spread of inaccurate anti-vaccination messaging online” which “poses a risk to public health”. It is clear these new measures would greatly help govern the spread of false information and protect the public from unreliable anti-vaccination content.
However, these laws are yet to be finalised and thus offer little help in the current struggle against damaging misinformation. Whilst in theory some parts of the Bill may support the tackling of Covid-19 misinformation, its main aim is the protection of young people online. The government is therefore unlikely to rush through the lengthy process of approving the proposals, as it is still debating the many other sections. It is estimated the Online Harms Bill may not be passed until 2023 or 2024.
LittleLaw’s Verdict: Will misinformation hinder vaccine uptake?
It is clear that the wide prevalence of Covid-19 misinformation has the potential to dissuade members of the public from receiving the vaccine. Whilst Labour’s suggestion of emergency social media laws to clamp down on misinformation may seem like a positive action, journalists have warned that such rushed social media laws may curtail the right to freedom of speech and expression.18 The government is therefore challenged with balancing the rights of citizens against public health, and with stopping the spread of misinformation in time to promote vaccine uptake.
Those over 85, who are first to receive the vaccine, are not within social media sites’ main demographic of users, so they are unlikely to be dissuaded from vaccination by online misinformation. As the vaccine is being rolled out in stages, the government still has a limited amount of time to disprove misinformation before the majority of the public have the opportunity to be vaccinated. The recently agreed measures with social media sites are certainly a step in the right direction towards holding sites accountable for their content. They could, however, already be too late.
The government appears to be willing to go to great lengths to promote vaccine uptake and has plans to enlist celebrities and social media influencers to publicly endorse the vaccine. Ultimately, social media can be a useful tool for spreading information quickly. However, the prevalence of Covid-19 misinformation has proved it is often difficult for users to distinguish between fact and fiction. The government will be hoping its cooperation with social media platforms is enough to stamp out Covid-19 conspiracies for good.
Report written by Amber Allen
- Siddharth Venkataramakrishnan, “The real fake news about Covid-19” (Financial Times, 25 August 2020)
- Marianna Spring, “Coronavirus: The human cost of virus misinformation” (BBC News, 27 May 2020)
- Anjana Ahuja, “Health misinformation pollutes the web, with consequences for all” (Financial Times, 25 November 2020)
- Above, n 1
- Jon Stone, “Coronavirus vaccine: Labour calls for emergency censorship laws for anti-vax content” (Independent, 15 November 2020)
- “Popular COVID-19 conspiracies linked to vaccine ‘hesitancy’” (University of Cambridge, 14 October 2020)
- Above, n 5
- UK Government, “Social media giants agree package of measures with UK Government to tackle vaccine disinformation” (GOV.UK, 8 November 2020)
- Siddharth Venkataramakrishnan, “Facebook vows to remove false claims about Covid-19” (Financial Times, 3 December 2020)
- Marianna Spring, “Covid-19: Stop anti-vaccination news online with new law says Labour” (BBC News, 15 November 2020)
- Department for Digital, Culture, Media and Sport, Online Harms White Paper (GOV.UK, 12 February 2020)
- “Hastily introduced fake news laws could damage efforts to counter disinformation, UNESCO reports warn” (The University of Sheffield, 5 May 2020)