
Advances in cognitive computing are challenging the traditional logic of devolution, marketisation and privatisation

Has Margaret Thatcher's favourite philosopher finally met his match? Friedrich Hayek famously argued that centralised planning was doomed to fail because a bureaucracy could never gather and interpret sufficient information to make sensible decisions. If attempted, the outcome would always be waste and resource misallocation on a grand scale. Instead, wherever possible the invisible hand of the market should be preferred to the dead hand of the state. 

Hayek won a Nobel Prize for his efforts, but more importantly his thinking went on to influence generations of economists, politicians and policymakers right up to the present day. In the fifty years since he first developed his ideas, the policies of devolution, marketisation and privatisation have been embraced by governments around the world and promoted by international bodies such as the International Monetary Fund and the World Bank.

But Hayek is wrong. 

Interestingly, he wasn't always wrong, but something dramatic has happened in the last few years that should give his proponents pause for thought. The rise of big data and advanced analytics now means that Hayek's fundamental contention, that the information problem is insurmountable, is increasingly open to challenge, and the implications for policymakers will be profound.

How centralisation can be a winning strategy

A revolution powered by big data and cognitive computing has been heralded in the private sector for several years. Companies like Google and Facebook are famously built on business models that require the collection and interpretation of data at a scale simply unimaginable to Hayek, but more traditional industries are being disrupted too.

For example, real estate companies that oversee large national portfolios of properties in the US have moved away from a decentralised approach, in which local managers set rents, towards centralised datasets and price-setting software. The centralised approach consistently outperforms the devolved one. Similar examples can be found in industries ranging from logistics (where the routes of UPS drivers are determined by software rather than by the drivers themselves) to retail (where supermarket staffing rotas and stocking schedules are set by algorithms that predict customer demand).

Situations that lend themselves particularly well to such an approach are likely to involve repeated decisions, based on diverse information, to achieve measurable outcomes. When this process is handled by a so-called learning algorithm, with the capacity to draw inferences between action and outcome, the result is turbo-charged learning at a level that rapidly outstrips human capabilities. See for example DeepMind's mastery of a classic Atari video game – from absolute beginner to world champion in a matter of hours. 
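To make that decision-and-outcome loop concrete, here is a minimal sketch written as a simple epsilon-greedy learning rule in Python. It is an illustration only, not DeepMind's method: the actions, payoffs and reward signal are hypothetical stand-ins for whatever interventions and outcome measures a real system would use.

```python
import random

# A minimal sketch of the learning loop described above: an agent repeatedly
# chooses an action, observes a measurable outcome, and updates its estimate
# of which action works best. Actions and payoffs here are illustrative only.

ACTIONS = ["option_a", "option_b", "option_c"]          # hypothetical interventions
true_payoffs = {"option_a": 0.3, "option_b": 0.5, "option_c": 0.7}  # unknown to the agent

estimates = {a: 0.0 for a in ACTIONS}   # the agent's current value estimates
counts = {a: 0 for a in ACTIONS}        # how often each action has been tried
epsilon = 0.1                           # fraction of decisions spent exploring

for step in range(10_000):
    # Explore occasionally; otherwise exploit the current best estimate
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: estimates[a])

    # Observe a noisy, measurable outcome (1 = success, 0 = failure)
    outcome = 1 if random.random() < true_payoffs[action] else 0

    # Update the running average estimate of this action's value
    counts[action] += 1
    estimates[action] += (outcome - estimates[action]) / counts[action]

print(estimates)  # estimates converge towards the true payoffs
```

After a few thousand repetitions the estimates settle close to the true payoffs and the best option is chosen almost every time, which is the sense in which repeated, measurable decisions allow an algorithm to learn far faster than any human decision-maker could.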

The public sector is awash with problems that are particularly amenable to cognitive computing, given the complexity of public services and of the social problems they attempt to address. For example, advanced traffic management systems are already used in countries like Japan to help reduce congestion by automatically controlling traffic lights and speed restrictions. Such applications are relatively uncontroversial, but as technology advances it will be increasingly possible to apply similar approaches to interventions in health, education, welfare and justice. This isn't a question of 'if' but rather 'when'.

The silicon state

The implications of all this for policymakers are many and varied. I'll focus on just three. 

Firstly, the assumption that markets and devolution are preferable to greater central control can be challenged afresh on practical and empirical grounds, rather than relying on philosophical or political arguments. For example, it is no longer clear that multiple local commissioners are better placed to make decisions about public services than a centralised agency. Similarly, how certain are we that multiple outsourced providers are better placed to aggregate the knowledge required to improve public services than a single in-house provider?

Secondly, the emphasis on data and outcomes in public services gains a new resonance. Data is the fuel on which advanced analytics depends, and higher quality, richer datasets lead to deeper insights and better decisions. We are no longer limited by the data-absorption capacity of humans, or indeed by the need for data to be presented in a structured manner.

Policymakers should therefore redouble their efforts to ensure that as much information as possible about the inputs, activities and outcomes of public services is collected and shared in a way that unleashes innovation and service improvement. 

Thirdly, the moral and privacy issues raised are profound but, given the march of technology, unavoidable, so we must start grappling with them now. Would it be acceptable for my unemployment benefits and sanctions to be determined by an algorithm predicting my propensity to work? Will I always be able to tick a box to opt out and retain complete control of my personal data, even if that means less effective, more costly public services?

In his Nobel acceptance speech Hayek observed that if man is to avoid doing more harm than good he must learn that "he cannot acquire the full knowledge which would make mastery of events possible". The contrary is now increasingly true, and policymakers must quickly learn how to harness the power that such knowledge can offer us all.

 

Adrian Brown is Executive Director of the Centre for Public Impact
