Upholding democracy in an era of online manipulation

Brhmie Balaram, Associate Director, Economy, Enterprise and Manufacturing (family leave)

As part of our Tech & Society programme, the RSA is setting out to make the case for experimenting with a new shared approach to stopping the spread of ‘toxic’ content online.

As more people turn to social media for news (a recent study found that 62 percent of US adults cite social media as one of their news sources), these platforms have come to wield considerable power over the information we see, shaping our social and political views. In recent years, malicious actors have exploited this power by deliberately spreading disinformation, hate speech and extremist content across social media networks in an attempt to subvert democracy. These forms of content are ‘toxic’: they seep into newsfeeds with the intent to cloud and manipulate our judgment, increasingly about whom or what to align ourselves with, or even how to vote.

In recognition of this threat to democracy, policymakers have been applying pressure on platforms to moderate content more effectively. While platforms have made some strides, they have failed to contain the spread, in part due to the sheer scale of the challenge. For example, a single false story reaches 1,500 people six times quicker, on average, than a true story does, and there is a constant swell of posts inciting hatred and violence: Facebook removed more than 2.5 million such posts in the first quarter of 2018 alone. Scale alone, however, does not explain why the platforms’ approach isn’t working; we also need to keep in mind that:

  • Platforms primarily self-regulate by drafting and adapting content moderation policies in-house, employing moderators internally as well as outsourcing them as needed, and automating moderation where possible. Yet policies are contested and moderation can be inconsistent, given that it is not always straightforward to distinguish disinformation, hate speech or extremism from an expression of free speech.
  • Many moderators are traumatised by the relentless exposure to toxic content, giving rise to concerns that this form of labour is inhumane and unsustainable.
  • There is big business in trafficking large volumes of toxic content, and the incentives to churn it out are high. Platforms are up against a global industry of automated networks, ‘fake news’ farms, PR firms, and political cyber-hackers that has yet to be systematically targeted.

Policymakers’ efforts to intervene and improve content moderation haven’t necessarily been successful either. Arguably, the current regulatory approach also isn’t working well:

  • Political pressure has often taken the form of punitive measures, such as fines or suspensions. The threat of such measures has had unintended consequences, however, attracting criticism about state censorship and the stifling of debate. Some measures have already been overturned on these grounds, and more challenges are likely to follow.
  • This type of top-down approach strains the relationship between governments and platforms, and leads platforms to resist the demands of the state; we saw this in Mark Zuckerberg’s refusal to give evidence before the international grand committee investigating the role of disinformation in elections. It also, ironically, cedes too much power to platforms as the only vehicle for tackling the problem.
  • Some governments (for example, in Myanmar and the US) have been complicit in promoting toxic content through platforms when it serves their agenda, undermining the efforts of peers to crack down on such content.

While policymakers are right to challenge platforms on their role in spreading toxic content, we argue that current regulatory approaches require a rethink. The RSA’s research sets out to expand on an alternative, shared approach to strengthening democracy by mediating the flow of toxic content. In a new project, we will explore how platforms and policymakers can support one another’s aims in this regard, and how they can draw on a wider range of stakeholders – including civil society – in pursuit of a common agenda.
