Over the last decade, economists have fiercely debated whether we are in an age of permanently low growth – the so-called ‘great stagnation’. According to proponents of this theory, such as Robert Gordon, innovation in the golden industrial age – roughly the century between 1870 and 1970 – had an unmatched (and unmatchable) ability to boost productivity, and hence to support higher growth.
Source: Own calculations based on World Bank data and research by Max Roser
Not quite, according to scholars such as Erik Brynjolfsson, who believe that fourth Industrial Revolution (4IR) technologies – AI, biotech, the Internet of Things and others – have huge revolutionary potential. This group reminds us that ground-breaking technologies of the past were not immediately adopted productively at scale. Electrification, for example, began in the late 19th century, but assembly lines only appeared in the early 20th century, and the proliferation of electric home appliances had to wait several more decades.
Hang on, it’s coming!
Just wait, the argument goes, and 4IR technologies will deliver qualitatively superior innovation, which will in turn underpin massive productivity gains and the associated improvement in standards of living. What is more, the pandemic has probably brought this moment closer by accelerating the digital revolution and prompting governments in advanced economies, especially the United States, to invest heavily in infrastructure and technology to support long-term growth.
It is a view to which I am sympathetic. Yes, expecting improved versions of the technologies that led to the productivity growth explosion in the industrial age to do the same trick again would be silly. But there is no reason why a new ‘innovation generation’ couldn’t do it for our exponential age (to borrow Azeem Azhar’s phrase).
This places the spotlight on the technologies that could act as productivity catalysers. The job can’t be done by technologies that ‘just’ do one thing well, no matter how well or how sophisticated they are.
We need new General Purpose Technologies (GPTs). A GPT is a breakthrough ‘platform’ innovation on top of which many other technologies can be developed. In the past, this role was played by fossil fuels (from the late 18th century), rail transport (in the 19th century) and IT (in the second half of the 20th century).
Digital twin ecosystems as a GPT
Enter digital twins (DTs). These aren’t technologies as such, but rather virtual, real-time counterparts of physical infrastructures or objects that can be used to dramatically improve the way those infrastructures or objects are managed. When a DT is in action, data flows from the physical asset to the twin, while interventions flow in the opposite direction.
For instance, an offshore oil rig’s DT (continuously informed by sensors placed on the platform itself) will not only help its managers plan for different scenarios but also foresee potential problems and act to prevent them. The DT might determine, for example, that minor damage could evolve into a serious risk in the event of a severe storm, and adopt the best response – say, generating a maintenance request or even shutting operations down.
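The feedback loop described above – sensor data flowing from the rig into its twin, and interventions flowing back out – can be sketched in a few lines of code. This is a purely illustrative toy model, not any real DT platform: all class names, thresholds and the storm amplification factor are invented for the example.

```python
# Hypothetical sketch of a digital-twin feedback loop: sensor readings flow
# from the physical asset into its virtual counterpart, and interventions
# flow back. All names and thresholds are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum


class Intervention(Enum):
    NONE = "no action"
    MAINTENANCE = "raise maintenance request"
    SHUTDOWN = "shut operations down"


@dataclass
class SensorReading:
    """A snapshot sent from the rig's sensors to its twin."""
    crack_width_mm: float       # observed structural damage
    forecast_wind_knots: float  # weather input for scenario planning


class OilRigTwin:
    """Virtual counterpart of the rig: mirrors its state and decides actions."""

    # Illustrative thresholds only.
    MINOR_DAMAGE_MM = 2.0
    CRITICAL_DAMAGE_MM = 10.0
    SEVERE_STORM_KNOTS = 48.0

    def update(self, reading: SensorReading) -> Intervention:
        # Scenario planning: project how the damage could evolve if the
        # forecast storm hits (the x3 factor is an assumed model parameter).
        projected = reading.crack_width_mm
        if reading.forecast_wind_knots >= self.SEVERE_STORM_KNOTS:
            projected *= 3

        if projected >= self.CRITICAL_DAMAGE_MM:
            return Intervention.SHUTDOWN
        if reading.crack_width_mm >= self.MINOR_DAMAGE_MM:
            return Intervention.MAINTENANCE
        return Intervention.NONE


twin = OilRigTwin()
# Minor damage in calm weather: a maintenance request suffices.
calm = twin.update(SensorReading(crack_width_mm=3.0, forecast_wind_knots=10.0))
# Similar damage, but a severe storm is forecast: the projected risk
# crosses the critical threshold, so the twin shuts operations down.
storm = twin.update(SensorReading(crack_width_mm=4.0, forecast_wind_knots=50.0))
print(calm.value, "|", storm.value)
```

The point of the sketch is the division of labour: the physical asset only reports measurements, while the twin holds the model and the decision logic – which is what lets it run ‘what if’ scenarios the rig itself never experiences.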
So DTs are data ecosystems that act in the physical world and help us understand what goes on in it. That in itself is great, but it does not necessarily make them a GPT able to unleash a more innovative and productive economy.
The real game-changer could be DT ecosystems, which would bring together a large number of individual DTs. Imagine how much more effective our oil rig DT would be if it could connect, for example, with a seabed DT.
We would then be looking at ecosystems of ecosystems. It is hard for me to avoid thinking of the ‘catalogue of catalogues’ that masterful Argentinian writer Jorge Luis Borges describes in his short story The Library of Babel (in which the universe is, fittingly, depicted as an infinite library). It also strikes a chord with my colleague Duncan Roberts’ insightful exploration of the increasing role played by the Metaverse in our societies.
A remarkable library - though this is Stuttgart (Germany), not Borges’ Babel; Photo by Niklas Ohlrogge on Unsplash
Such an ecosystem of ecosystems holds enormous potential to maximise efficiency and innovation, and lead to improved decisions in a broad array of areas. The UK is an interesting place to observe this revolution unfold. British organisations have the opportunity to join the National Digital Twin programme run by the Centre for Digital Built Britain (CDBB), a partnership between the University of Cambridge and the Department for Business, Energy and Industrial Strategy.
While an increasing number of companies already use digital twins to extract data about the likely impact of decisions under consideration, the programme seeks to bring them together in a UK-level DT ecosystem to put the right data in the right hands at the right time, leading to better, outcome-based decisions. This can make the UK a more innovative country because, as CDBB Executive Director Alexandra Bolton told us: “If you look at just your piece of the jigsaw, you don’t see the wider picture”.
In some areas, improvements are already taking shape. For example, Cambridge researchers are working with clinicians to make the most of the planned 2025 relocation of the century-old Moorfields Eye Hospital from its Old Street site in central London to a new site at St Pancras Hospital, in the King’s Cross area.
These efforts could lead to improved inclusivity for sight-impaired people travelling through a series of transport hubs to the new hospital, better access to services and higher-quality treatment, among other benefits.
Ultimately, humanity needs an international digital twin ecosystem, not least to help it tackle climate change, as former Ocado Chief Technology Officer and DT enthusiast Paul Clarke pointed out in a recent conversation with us.
This would allow humanity to harness data and AI technologies to deepen our understanding of the natural world and reduce the impact of human-made systems on it. It would enable us to educate and nudge people’s behaviours and choices globally, and to orchestrate swarms of clean-energy-powered smart machines that help monitor, clean up, replant and repair our planet.
Of course, technology can only take us so far. Achieving these and other great things at a planetary scale would require bold political ambition and a much higher level of global cooperation.
For now, challenges abound even at the relatively more modest national scale. The UK's National Digital Twin programme is working with industry, policymakers and academia to create the socio-technical change required: the development of the standards, processes, ways of working, and behaviours that will be needed for the British ecosystem to take shape.
It is surely worth trying. The GPT that digital twin ecosystems may well become is a potential goldmine for existing and aspiring innovators, able to deliver substantial social benefits, help bring about a much-needed productivity boost for the global economy and serve as an important tool for humankind to address some of its most pressing problems.
Eduardo Plastino is a Director at the Cognizant Centre for the Future of Work.