
How to spot signs of over-confidence in your firm's RBA

Peter Wilson, Herminius, Managing partner, London, 26 October 2018


When taking a risk-based approach to money laundering, compliance officers often make egregious errors. An understanding of the phenomenon of over-confidence ought to help them work out how often they are likely to be wrong and plan accordingly.

“A little learning is a dangerous thing;
Drink deep, or taste not the Pierian spring:
There shallow draughts intoxicate the brain,
And drinking largely sobers us again.”
Alexander Pope

When running workshops for expert clients, I often ask people to rank a selection of countries according to the risk of money being laundered and corruption taking place on their soil. They typically produce very different rankings from one another, even when they hail from the same firm. Many of their answers also differ significantly from the Basel AML Index or the Transparency International (TI) Corruption Perceptions Index. The differences are not merely a matter of a few positions up or down; one person's 'high-risk' country is often another person's (or index's) 'low-risk' country.
 
This should not be surprising. The Basel index fully acknowledges that we cannot know the actual amount of money laundering that takes place in a country. Instead, we can only guess the amount by looking at the cases that happen to come to light. Transparency International's index is, of course, based solely on perceptions; the clue is in the title. We inhabit a world of judgement and opinion, not fact, about the actual risk to be found in a country or a sector of finance. It is on these shaky foundations that a financial institution must build its "risk-based approach" to compliance. (The obligation for firms to take a risk-based approach to their AML/ATF procedures first appeared in the UK in the Money Laundering Regulations 2007.)

So how often are our judgements right? To be more subtle about it, are we over-confident or under-confident about the accuracy of those judgements? Suppose we make 100 decisions that a country or sector is low risk, or that a potential client can be 'on-boarded' safely, and we are right 90 times out of 100. Do we know that we have an error rate of 10%? Do we take steps to reduce that error rate when we are proved wrong (as we inevitably will be in some cases)? Or do we assume that our error rate is much lower, perhaps close to zero, so that once we have made a decision we can forget about it and move on?
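
To make the arithmetic concrete, here is a minimal sketch in Python. All of the figures are hypothetical; the point is simply that the gap between the error rate an officer assumes and the rate his decisions actually produce is a number that can be tracked.

```python
# A minimal sketch with hypothetical figures: 100 'low-risk' judgements,
# of which 10 later turn out to be wrong.
decisions = 100                            # low-risk calls made
wrong = 10                                 # calls later proved wrong (hypothetical)

actual_error_rate = wrong / decisions      # 10%
assumed_error_rate = 0.02                  # what the officer believes (hypothetical)

gap = actual_error_rate - assumed_error_rate
print(f"Actual error rate:   {actual_error_rate:.0%}")    # 10%
print(f"Assumed error rate:  {assumed_error_rate:.0%}")   # 2%
print(f"Over-confidence gap: {gap:.0%}")                  # 8 percentage points
```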

Psychologists and economists have wrestled with this question and the news is far from good. The relevant syndrome has become known as “expert over-confidence” (not under-confidence) and almost all studies, in the psychology laboratory and in the real world, show that people (especially experts) consistently over-estimate their rates of accuracy. In his famous book of 2011, ‘Thinking, Fast and Slow,’ the psychologist and Nobel laureate Daniel Kahneman describes over-confidence as “the most significant of the cognitive biases.”

Intriguingly, over-confidence increases as the task becomes more complex. On easy tasks, people are right quite often and their expectations tend to match their results. Indeed, at this level people are sometimes even under-confident. On more difficult tasks, however, people are typically reluctant to reduce their estimates of their own accuracy to account for the fact that the questions are growing more difficult and their errors are mounting up, so the element of over-confidence grows.

This is clearly a problem for firms that take a risk-based approach to money-laundering control. We are dealing with highly complex matters where most of the evidence is hidden and prevailing levels of risk can change rapidly in accordance with the actions of a few individuals. We base our guesses on little evidence and we often soon set these guesses in stone as parameters. If we mistakenly believe that our judgment is very likely to be true, we have no incentive to keep it under review or make contingency plans for times when we have been proven wrong.

The really effective compliance officer should therefore be sophisticated not only when judging the amount of risk he should associate with a particular country, sector or client, but also when estimating his own likely rate of error. To use the jargon, he needs to learn to be better 'calibrated.' For example, he should try to reduce the gap between his estimated rate of error and his actual rate, not just by making fewer actual errors but also by reducing his over-confidence.
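
To see what calibration means in practice, the short sketch below (using invented data) compares the confidence an officer states for each judgement with how often those judgements later prove correct. A positive gap means over-confidence.

```python
# An illustrative calibration check with invented data. Each entry pairs
# the confidence the officer stated with whether the call proved correct.
judgements = [
    (0.95, True), (0.90, True), (0.90, False), (0.85, True), (0.90, True),
    (0.95, False), (0.85, True), (0.90, True), (0.90, False), (0.90, True),
]

mean_confidence = sum(conf for conf, _ in judgements) / len(judgements)
accuracy = sum(correct for _, correct in judgements) / len(judgements)

print(f"Mean stated confidence: {mean_confidence:.0%}")              # 90%
print(f"Observed accuracy:      {accuracy:.0%}")                     # 70%
print(f"Over-confidence gap:    {mean_confidence - accuracy:+.0%}")  # +20%
```

A well-calibrated officer, tracked this way over many decisions, drives that gap towards zero.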

This is vital for the compliance officer. He only has a problem if he makes an error of judgement and then fails to spot it or offset it. He is not, strictly speaking, paid to care about the actual prevalence of money launderers in the world, as long as his firm is not taking them on as clients; what he does need to know is how often he is likely to encounter a money launderer, so that he can reduce the incidence of such encounters.

So what type of people are well-calibrated? A recent study in the Harvard Business Review shows that beginners tend to be pretty good: they make plenty of mistakes, but they know it. The danger comes with a little experience. In the Harvard experiment, confidence quickly shot up to an estimated accuracy rate of more than 70%, while the actual accuracy rate remained below 60%. After that, estimated accuracy tended to level off in the mid-70s, while actual accuracy increased slightly to the mid-60s, reducing (but not eliminating) the over-confidence gap. The experiment was admittedly an artificial study in a laboratory, but the authors cite evidence that these patterns recur in fields as diverse as financial literacy, flying and surgery (apparently the most dangerous surgeon is the one performing his eleventh operation).

Some of the few experts who are consistently well-calibrated are meteorologists and bridge players. This is telling. The explanation is that they are making a large number of similar judgments every day and are obtaining very quick and direct information that indicates whether they were right or wrong. Such a repetition of similar cases combined with feedback is really the only way of reducing over-confidence. Interestingly, it does a firm no good to give people a financial incentive to be better calibrated, or to lecture them on the evils of over-confidence (so you can stop reading now).

Compliance work, sad to say, rarely involves repetition and direct, fast feedback. Errors may only become apparent over years or decades. Although compliance officers have to do an awful lot of work, their cases are likely to vary substantially. Experience built up in one country, for example Denmark, may not be applicable in another, for example Estonia (to pick two countries at random). In this situation a compliance officer might think of himself as an expert while in fact being in the most dangerous situation of all, holding a small amount of knowledge and a mistaken belief that he is dealing with a situation he has been in before. Banks and investors that have performed well in their home territories and want to 'diversify' their operations elsewhere are likely to be dangerously over-confident. The cursory Googling and record checks associated with limited 'due diligence' may create an illusion of expertise, again placing the compliance officer in the dangerous zone of the semi-expert.

One answer to the problem is to consult a range of long-standing sector and country experts about the behaviour and track records of potential account holders. This is something that my company, Herminius, does for financial firms. People who have "lived and breathed" a country or sector for decades, who have made many judgments and have had time to see which of those judgements were correct, are better calibrated and better able to know how likely they are to be right.

When taking a risk-based approach to money laundering, compliance officers are often wrong. An understanding of the phenomenon of over-confidence ought to help them work out how often they are likely to be wrong and plan accordingly.

* Peter Wilson can be reached at pwilson@herminius.com or via www.herminius.com
