
With regret I’ve come to realise I’m neither interesting nor entertaining enough to be picked as a mainstream media presenter.

Given that I enjoy broadcasting, my response has been to try to develop new formats. After all, if I help design the show it seems only fair that I present it. Sometimes these programmes have succeeded but, as with most new formats, the risk of falling flat is high. It was my fascination with failure that kept me watching a recent BBC documentary, but the show was significant for a more important reason.

The programme in question was ‘The Secret Science of Pop’ (first shown in February but repeated last week) and it was presented by the very engaging Professor Armand Leroi from Imperial College. The concept behind the show was compelling. Professor Leroi and a group of his brightest students were going to use big data analysis and machine learning not only to discover the secret of pop success but to help an unsigned artist produce a guaranteed hit. To my growing viewing delight, the project was an abject failure.

The research team analysed all 17,000 songs that have entered the charts in the last sixty years, deriving over a million bits of data from each track. Applying machine learning to this data, they set out to define the recipe for a hit song. Sadly, no such recipe exists or, if it does, not in a form that can be revealed by data analysis and machine learning. Despite trying out innumerable variables, the team came to only one deeply underwhelming conclusion: at the time they are released, chart toppers are slightly – at a very low level of statistical significance – more typical of the other tracks in the chart than are less successful songs.
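To make that "typicality" finding concrete, here is a minimal sketch of how such a measure might be computed: score each track by its distance from the average feature vector of its chart era, then compare hits with the rest. The feature data below is invented toy data, and this is my own illustration of the general idea, not the programme's actual pipeline (which reportedly derived over a million data points per track).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 tracks x 5 hypothetical audio features
# (e.g. tempo, loudness, danceability, and so on)
features = rng.normal(size=(100, 5))

# The "average sound" of the era: the mean feature vector
era_mean = features.mean(axis=0)

# Lower distance from the mean = more "typical" of the era's chart
typicality = np.linalg.norm(features - era_mean, axis=1)

# Compare a hypothetical set of chart toppers (first 10 tracks)
# against the remainder of the chart
hits, others = typicality[:10], typicality[10:]
print(f"mean distance from era average, hits:   {hits.mean():.2f}")
print(f"mean distance from era average, others: {others.mean():.2f}")
```

On this toy data the two means will be close, which is roughly the shape of the programme's result: any gap between hits and the rest was tiny and of very low statistical significance.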

‘Armed’ (which feels like a rather strong word) with this ‘insight’ (that too), the team worked with the veteran pop producer Trevor Horn to help their unsigned artist, Nike Jemiyo, turn her ballad into a sure-fire hit. The song was slow and poignant. So was the programme. Each attempt to produce that sure-fire hit sounded more awful than the last and, worst of all (by this time I was starting to cover my eyes), they even failed at the basic task of making the song more statistically average.

Finally, with the programme’s core conceit collapsing all around the brave Professor, it tried to engage the viewer with a visualisation of the history of pop. Not only did the waves of colour-coded data points fail to communicate much in the way of useful information but – and here the words ‘hole’, ‘stop’ and ‘digging’ sprang to mind – the game Professor, who had told us winningly at the outset that he knew nothing about pop, confirmed this in spades by claiming the data showed: (a) the Beatles were irrelevant because their music was average for its era; while (b) punk was also unimportant although – please don’t ask me why – for the reverse reason.

The great reassurance I got from ‘The Secret Science of Pop’ is that it is possible for a programme concept to collapse like a gazebo in a gale and yet be watchable. I can see myself sharing this comfort with producers for as long as commissioners are foolhardy enough to put a microphone in front of me.

The programme ended with Professor Leroi recognising that, while the experiment hadn’t been entirely successful, it was only a matter of time until data and algorithms solved the puzzle of pop success. Surely the bigger lesson of the programme is that he may be wrong.

As Gavin Kelly has eloquently argued, the constant stream of breathless and weakly substantiated predictions about the disruptive impact of AI and robotics distracts us from the important issues facing the world of work today. These flaky futurists often lack realism and nuance about what can be automated. The lesson of Professor Leroi’s travails is not just that pop success is more difficult to analyse than we thought but that when it comes to culture – the evolving collective expression of appreciation and emotion – the point at which we can reduce it to zeros and ones is not just far away but over the horizon.

In our own work on automation, the RSA is focussing on a granular account of how technology may affect sectors, jobs and tasks, informed by engagement with employers and employees as well as technologists and entrepreneurs. One headline of our soon-to-be-published report – based on a major survey of business take-up of AI and robotics – is that we are currently too alarmist in thinking about technology but too timid in actually taking it up.

The case for getting behind the headlines to look at what big data and machine learning can actually do was underlined by the efforts of the Professor and his diligent but frustrated team. The show also reminded us of how hard it is to code for human taste. After all, what algorithm would predict that a programme which raised the bar only to knock it over again and again could be so damn entertaining?  

