Danny Finkelstein has an important and well-argued piece in this morning's Times. In it he connects evidence from sports refereeing — showing that false intervention (wrongly giving a foul) is seen as a bigger risk than false non-intervention (wrongly failing to give a foul) — to the debate over Syria.
Just about all the worst-case predictions of those who called unsuccessfully a year ago for intervention against the Assad regime, and in favour of the more pro-Western rebels, have now come to pass: there are over a million child refugees, there have been tens of thousands of deaths, chemical weapons have now been used (probably by the Government) and the rebel side has been hugely infiltrated by forces aligned with Al Qaeda. Yet no one is calling for a Chilcot inquiry into the failure to act. In the terrible shadow cast by Iraq, the option of doing nothing seems to have become the default for both Government and public opinion.
While I have a great deal of sympathy for this attempt to put the ills of intervention and non-intervention on an equal footing, the reasons for our bias go beyond mere irrationality. Think here of the 'trolley' problem, versions of which take up so much time in philosophy and social psychology departments.
In the classic example a comparison is made between the public's responses to two scenarios. In the first, the subject is standing at the points on a track. An unmanned train ('trolley' being the Americanism) is hurtling towards four people strapped to the same track. The subject is asked whether they would switch the points so that the train instead goes down a branch line on which just one unfortunate is strapped. In the second scenario the runaway train is again destined to kill four victims, but in this case the only way to stop it is for the subject — standing on a bridge over the track — to push a fat man off the balustrade into the path of the train.
The fact that a much higher proportion of us will switch the points than will push the man is often used as an example of the non-rational nature of human ethical decision-making. One explanation - which doesn't help with the irrationality - is that having physical contact with someone makes it feel to us like a worse act.
However, there is a quasi-theological defence of our predisposition. The focus here is on intentionality and fate. In the case of switching the points we are undertaking an act which is almost certain to lead to the death of an innocent person, but it is not our intention to do this, and we might cling to the blind hope (the theological bent to the argument) that fate/God will intervene and stop the train. In contrast, with the fat man, saving the four people is accomplished as a direct and intentional consequence of our act and — assuming God respects the laws of gravity — there is no possibility of a higher intervention to save our rotund victim.
The cognitive bias which leads us to treat ills that flow from non-intervention more harshly than those which result from intervention leads to judgments which can be illogical, unfair and sometimes disastrous (just ask the Tutsis in Rwanda). However, this bias is not merely dysfunctional; it also speaks to more subtle dilemmas concerning the relationship between intended and unavoidable harms, on the one hand, and, on the other, those in which we are responsible for the cause, whilst hoping that somehow we will not have to be responsible for the effect.