The Mathematics of Integrity

8 December 2017
By Jonathan Wolff

Integrity, considered at the level of an individual, is a virtue. But what is it to be a virtuous person? Consider how three types of people might respond to the dilemma of whether or not to take a bribe. The first type of person calculates the benefits and costs of the alternatives and decides to follow whatever best furthers their self-interest, which in this case, let us suppose, means taking the bribe. The second understands that self-interest will be best served by taking the bribe, at least in this case, but nevertheless has the strength of character to do the right thing and decline. For this person, morality overrides self-interest. But the third type of person – the truly virtuous – doesn’t even do the calculation. Morality ‘silences’ self-interest. It just doesn’t occur to them to consider whether it is worth taking the bribe. And this, we can say, is the person of true integrity.

Cynics, and economists if the categories are distinct, may suggest that the distinction is illusory. All actors are – must be – following self-interest in their own different way. The first obviously so. The second has a powerful interest, perhaps based on psychological comfort, in following the rules, and when the mental costs of breaking them are factored in too, the scales are tipped. And the third is just like the second, but the calculation is automatic or unconscious because the mental costs of accepting the bribe would be so high.

Certainly, it is possible to describe the different cases in this way. But what distinguishes them may be, at least for our purposes, more important than what they have in common. For one strand in considering integrity at the level of an institution, though not the only one, is working out how to design the institution so that people are instinctively motivated to follow the rules, without even considering whether it would profit them to break them. In a system of integrity, taking a side-profit becomes unthinkable.

In saying this I might be challenged from a different direction, based on a paradoxical claim discussed by Robert Klitgaard: that the optimal level of corruption in an organisation is not zero. What is meant is that if you are trying to clean up a corrupt organisation, there comes a point at which enough is enough. Chasing down the last pockets of minor corruption could be so expensive, and so intrusive into individuals’ lives, that it is a waste of energy and resources, given the minor benefits to be gained. It would be better to spend the time and money on other priorities. Hence living with some corruption could be an optimal use of resources. However, this assumes that we are starting from a very corrupt situation. It would be very odd to criticise a completely clean, yet lightly policed, organisation for having a lower than optimal level of corruption.

How, though, can you build a system of integrity where breaking the rules is unthinkable? That will rely on many factors, no doubt. But here is a clue about how to make breaking the rules at least not worth thinking about.

Suppose you have a safe and three guards. How do you set things up to minimise the chances that the guards will rob the safe? If there is one lock and each guard has their own key, then it only needs one of the three to fall to temptation. Suppose each guard is likely to suffer from temptation on one day each year. On the one-lock arrangement, then, you can expect to be robbed every four months, on average. But if there are three locks, with different keys, all three guards must fall to temptation at the same time. If the guards act independently, and a single guard falls to temptation on just one day a year, then the chance of all three falling to temptation on the same day is 1 in 365 to the power of 3, rather than roughly 3 in 365. A robbery is therefore to be expected roughly once in 365 cubed days: approximately once every 133,000 years. This seems an incredibly powerful result.
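
To make the arithmetic concrete, here is a minimal sketch of the calculation in Python. It is my illustration rather than Fulda’s, and it bakes in the toy assumptions above: each guard is independently tempted on exactly one day in 365, and a robbery happens only when every required keyholder is tempted on the same day.

# Toy model of the safe-and-guards arithmetic (assumptions as in the text).
P_TEMPTED = 1 / 365          # per-guard, per-day probability of temptation
DAYS_PER_YEAR = 365

def years_between_robberies(keyholders_needed):
    """Expected years between robberies when all `keyholders_needed`
    guards must be tempted on the same day."""
    p_per_day = P_TEMPTED ** keyholders_needed
    return 1 / (p_per_day * DAYS_PER_YEAR)

# One lock, a key for each of three guards: any one tempted guard suffices.
p_any = 1 - (1 - P_TEMPTED) ** 3          # roughly 3/365 per day
print(f"one lock: robbed about every {1 / (p_any * DAYS_PER_YEAR):.2f} years")
# -> about 0.33 years, i.e. once every four months

# Three locks, three different keys: all three must coincide.
print(f"three locks: robbed about every {years_between_robberies(3):,.0f} years")
# -> 133,225 years, since 365 cubed days is 365 squared years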

Of course, though, this is a very artificial example. Corruption in a group is not a matter of coincidence of temptation, but of influence and pressure. Still, the general point remains. There are situations in which you are as weak as your weakest point, and there are others where you are as strong as your strongest point. The trick of institutional design to resist corruption is to build structures that are analogous to a safe with three locks. This is sometimes put in terms of the need to divide tasks, so that no one individual has final sign-off, and corruption therefore requires collusion. If there is mutual influence, and a charismatic and influential potential rule-breaker, then groups bigger than three may well be needed, but cases will differ.
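
For illustration only, the same toy arithmetic shows how quickly the expected time between collusive robberies grows as more independent keyholders are required; the caveat, as just noted, is that mutual influence is precisely what destroys the independence this relies on.

# Scaling of the toy model with the number of keyholders required
# (independence assumed; real collusion is not independent).
for k in range(1, 6):
    years = 365 ** (k - 1)   # 365**k days between robberies, in years
    print(f"{k} keyholder(s): about one robbery every {years:,} years")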

Notice, also, that where systems are successfully designed to prevent rule-breaking, it is a waste of time and effort for an individual even to consider breaking the rules, other than as an idle daydream. The person determined to find a way to be corrupt will need to leave and join a different, more vulnerable, organisation, which is one way in which systems of integrity achieve stability. It is too much to say that systems of integrity will generate the virtue of integrity in individuals. But they may well encourage it, and will certainly encourage people to act as a virtuous person of integrity would do.

References

The discussion in the first paragraph is inspired by John McDowell, “Are Moral Requirements Hypothetical Imperatives?”, Proceedings of the Aristotelian Society, Supplementary Volume, Vol. 52 (1978), pp. 13-29.

The safe and lock example is taken from Joseph S. Fulda, “The Mathematical Pull of Temptation”, Mind, Vol. 101, No. 402 (1992), pp. 305-307.

For the example of dividing tasks, and the “optimal corruption” argument, see Robert Klitgaard, Controlling Corruption (Berkeley: University of California Press, 1988).

Jonathan Wolff is the Blavatnik Chair in Public Policy in association with Wolfson College. He was formerly Professor of Philosophy and Dean of Arts and Humanities at UCL. He writes a monthly column on higher education for The Guardian.