Why don’t politicians trust us with our data?

Director of Economy Asheem Singh explains why the RSA, ODI and Luminate have launched a campaign to empower people to take back control of their data.

Who owns your data? You? The government? Mark Zuckerberg?

If you ask government officials or tech companies, the impression given is that these questions are too complicated for mere mortals like you and me to answer.

The public are fed simplified narratives about data that often mask dark misdeeds. It is suggested that data is like ‘oil’, when in fact data, unlike oil, can be replicated or duplicated at will. Politicians increasingly seem to believe that public forums are ‘fair game’ for data mining and collection. Whether or not you believe that certain political operators plan to turn free gov.uk sites into incredible data surveillance platforms that (dare I say it) win elections, the reality that we could countenance such a collective social hack says much about the times in which we live and the mistrust we have of those with power over data about us.

The conversation about our data – and about us – is increasingly polarised. How can we turn this around?

Tech and society intersect

When the RSA, the Open Data Institute and Luminate came together almost a year ago, it was suggested that the conversation about data was in such a poor state, and the public so poorly informed about issues around data, that turning the ship around was an impossible task.

Worse than this, it was suggested that dark powers, from profit-hungry tech companies to shady alien others were bent on hijacking this ignorance to subvert our institutional balance. And there was not a thing we could do about it.

In this context, the RSA’s Tech and Society programme was formed. The programme seeks to understand how public institutions, companies, and we the people can engage with technologies such as artificial intelligence and machine learning in a way that places human agency, deliberation and flourishing first. Its method of change is to open up previously closed conversations and find innovative ways to get ordinary people talking about and taking control of the technological narrative. We don’t believe in technological fatalism – and neither should you.

In conjunction with our partners – both leading technological voices in the field – we conducted a research project that took the questions of control over, and access rights to, data to the people. We held detailed evidence sessions with ordinary members of the public and sifted far and wide to understand the real picture behind public attitudes to data.

The results of this research are in our report and show just how craven the narrative of a ‘know-nothing public’ actually is when it comes to data.

They also show how important it is that we now come together and put pressure on companies and political institutions, not to give us ‘back’ our data – for data cannot really be ‘owned’ – but to bring more of us into a nuanced and democratic conversation about the future of data about us.

#WeAreNotRobots

We are not robots. We do not churn out and blithely release data. We, the people, have an understanding of what our data is, and of our relationship to it, that is tellingly sophisticated.

That’s the message from our research. The full results can be found in our report, but the key takeaway is that the public have a far greater knowledge of their data rights than is often assumed.

We want objectively reasonable things: continued and improved transparency and information about what data about us is being used, and how and why.

We have significant and sophisticated concerns: how long data is stored and whether individuals can access the full extent of the data held on them. These concerns have been prompted by the increased collection of behavioural data (such as likes and dislikes, desires and fears) rather than personal information, such as email addresses.

The public see legislation, moreover, as a potential ally. Following stories of bias affecting employment and policing decisions made with artificial intelligence, the public want legislation which will prevent prejudices being built into datasets and automated decision systems. Last month, the RSA called for legislation to tackle bias in artificial intelligence.

The public also want greater use of opt-ins, feeling that many organisations make it very difficult to remove themselves from databases and are highly opaque about what data is being collected.

In sum, the public want greater data rights, building on the existing provisions of GDPR – not paring them back as some commentators suggest. We want legislation to recognise the essential humanity of the data we produce.

We are not robots. That is the campaign we launch today. 

Data, democracy, shadow puppets – and the future

If you like what you’ve read and want to know more, please do explore the report and the animation. This campaign is just beginning. We’ll also be at the party conferences – at a public event at Labour and a private roundtable at the Conservatives, to begin with. If you would be interested in an invitation to join the latter, please contact sarah.darrall@rsa.org.uk.

Beyond that, there’s an entire industry to democratise. Tech companies risk further alienating the public if they do not step up to these concerns, in a continuation of the ‘tech-lash’ that many of the big firms have faced since 2016. The public is concerned about the effects of tech on mental health and the impact of technology on children.

These concerns are not going away, and over the next few weeks and months – and years – the RSA Tech and Society programme will be stepping up its work and its outputs to surface these thorny debates in the public consciousness. To find out more or to stay in touch with the work of Tech and Society, please email me at asheem.singh@rsa.org.uk.

Download the report summary

Download the full report

About our methodology

The focus groups were conducted with the help of an independent research company, which recruited participants. The sessions were held at the weekend and participants were paid for their time in order to minimise barriers to participation. To ensure a diversity of views, participants were selected to represent a range of ages, ethnicities, abilities and socioeconomic backgrounds, and to ensure that the group held varied attitudes. The focus groups took place over a day in April 2019, in which we tested people’s understanding of data; this was followed by a workshop, which took place over a day in June 2019.
