This note discusses the analysis of the knowledge and experience that you will have gained either directly or via consultation.
You should not, of course, simply accept all the ‘facts’ that are presented to you. Statistics can be particularly misleading.
- Aggregated statistics can look very different to the underlying figures. For instance, vehicle accident statistics generally include young and accident-prone drivers, as well as injuries to pedestrians and cyclists. Indeed, I understand that a middle-aged car driver in good weather may well be just as safe, over most long journeys in the UK, as if he or she were flying, which is a very safe form of transport.
- Those who do not respond to surveys may have very different views to those who do. Imagine a 30 to 10 split "in favour" in responses to a questionnaire. Does this mean that "75% believe that ..."? Not if the response rate was only 40 per cent and almost all of the 60 non-responders thought otherwise. Matthew Syed reported one striking example, set out after this list.
- Different organisations will record data in different ways. The classic example is in France where, if an elderly person is found dead without evidence of health problems, it is acceptable to attribute the death to ‘old age’, thus reducing the apparent incidence of heart attacks. But crime statistics can be similarly unreliable, as are many others.
- The fact that there have been no incidents does not mean that something is safe. It is possible that fewer children are now killed on our roads, not because they are inherently safer than decades ago, but rather because they are so dangerous that many children are not allowed near them.
- Death and injury rates can look very different when presented as a number (e.g. number of children killed in an incident) rather than as a proportion of the exposed population per annum.
- A report of deaths caused by, for example, air pollution might include a high proportion of those whose death was already imminent, rather than deaths from amongst an otherwise healthy population.
- Survival rates can be very misleading. Screening for cancer, for instance, often appears to generate a high survival rate (over 5 years, say) compared with the survival rate of those whose cancers are detected when symptoms become obvious. But this can be because the time of diagnosis is earlier, so it appears that patients live longer even if treatment is ineffective. Or it can be because the tests also identify slow-growing cancers that would never have proved fatal.
- Isolated statistics can give a misleading impression. For instance, the radioactivity of a beach near a nuclear plant may be higher than that of many other beaches, but is it perhaps lower than that of some beaches which are nowhere near such a plant?
- Above all, correlation does not prove causation!
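The survival-rate pitfall above is often called lead-time bias, and a minimal sketch makes it concrete. All ages here are invented for illustration; they are not real cancer statistics.

```python
# Lead-time bias: earlier diagnosis lengthens *apparent* survival even when
# the date of death is unchanged. All numbers are invented for illustration.

def five_year_survival(ages_at_diagnosis, ages_at_death):
    """Fraction of patients still alive five years after diagnosis."""
    survived = [death - diag >= 5
                for diag, death in zip(ages_at_diagnosis, ages_at_death)]
    return sum(survived) / len(survived)

# Ten hypothetical patients, all of whom die at age 70 whatever happens.
ages_at_death = [70] * 10

# Diagnosed at 67, when symptoms appear: three years of survival.
symptomatic_rate = five_year_survival([67] * 10, ages_at_death)

# Screening detects the same cancers at 62: eight years of survival,
# yet not one patient lives a day longer.
screened_rate = five_year_survival([62] * 10, ages_at_death)

print(symptomatic_rate, screened_rate)  # 0.0 1.0
```

Screening moves diagnosis five years earlier, so the five-year survival figure jumps from 0% to 100% even though every patient dies at exactly the same age.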
Scared Straight! was a documentary that attempted to steer young delinquents off the path of crime by taking them to adult prisons. The idea was that spending an hour with hardened jailbirds in a grimy room would scare them for ever away from a life of crime. And the statistics seemed great, as the 1978 Academy Award-winning film fronted by Peter Falk, of Columbo fame, testified. “80 to 90 per cent of the kids that they send to Rahway [State Prison] go straight after leaving this stage,” he said. “That is an amazing success story.”
But there was a problem. These statistics were based upon sending a questionnaire to parents about whether the behaviour of their children had improved. The “data” then measured the percentage of parents who said that it had improved compared with those who said that it had not. Can you see the flaw? Only the parents who replied to the questionnaire were included in the data. Those who didn’t respond — the majority — were entirely absent.
Consider how this would have distorted the result. It is possible that only the parents of children whose behaviour improved bothered to respond. Parents whose kids continued to behave badly may have thrown the questionnaire in the bin. This is called selection bias, a basic concept in statistical inference. When a controlled trial was conducted, it was revealed that Scared Straight!, far from deterring crime, made it far more likely. The “hidden data” was finally revealed. A programme that actively damaged young people was held in place for decades by flawed methodology.
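The selection-bias mechanism can be sketched numerically. The figures below are invented to mirror the 30 to 10 questionnaire example earlier in this note; they are not the actual Scared Straight! data.

```python
# Selection bias in questionnaire returns: invented figures, not the
# real Scared Straight! data.

population = 100            # parents sent the questionnaire
improved_total = 35         # children whose behaviour actually improved

# Suppose parents of improved children are far more likely to reply.
replies_improved = 30       # of the 35 'improved' parents, 30 reply
replies_not_improved = 10   # of the 65 others, only 10 bother

# Headline rate, computed from responses alone:
headline = replies_improved / (replies_improved + replies_not_improved)

# True rate across the whole population:
true_rate = improved_total / population

print(f"headline: {headline:.0%}, true: {true_rate:.0%}")
# headline: 75%, true: 35%
```

The headline "75% improved" rests entirely on who chose to reply; counting the whole population more than halves the figure.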
It is easy, too, to frighten people with ‘science’. 76 per cent of one group of adults, presented with a number of true facts about the chemical dihydrogen monoxide, concluded that Government should regulate its use. The other 24 per cent presumably knew that the chemical’s other name is ‘water’.
There are many scientific ‘facts’. But research scientists work by publishing their theories and conclusions so that others can test them. Slowly, over time, it becomes clear that what scientists are saying is very likely true, and should be acted upon. But you can always tell good scientists by the way in which they acknowledge any remaining uncertainties, make assumptions explicit, distinguish between what is true and what is speculative, and present options.
Scientists, medics and economists will often report the likelihood of ‘Type 1’ and ‘Type 2’ errors, otherwise known as false positives and false negatives. A test for cancer will, for instance, result in some people being told that they may have the disease when they do not (false positives), and others being told that they appear not to have the disease when they do (false negatives). I prefer to avoid such jargon, but there is a well-known cartoon that neatly explains the difference, if you need a way to remember it.
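A worked example shows why, when a disease is rare, false positives can swamp true positives even for an apparently accurate test. Every rate below is invented for illustration; none describes a real screening programme.

```python
# False positives vs false negatives for a screening test.
# All rates are invented for illustration.

prevalence = 0.01    # 1% of those screened actually have the disease
sensitivity = 0.90   # P(positive | disease); the misses are false negatives
specificity = 0.95   # P(negative | no disease); the alarms are false positives

population = 100_000
with_disease = population * prevalence           # 1,000 people
without_disease = population - with_disease      # 99,000 people

true_positives = with_disease * sensitivity      # 900 correctly flagged
false_negatives = with_disease - true_positives  # 100 wrongly reassured
false_positives = without_disease * (1 - specificity)  # 4,950 false alarms

# Of everyone who tests positive, most do NOT have the disease:
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_disease_given_positive:.1%}")  # 15.4%
```

With these assumed numbers, a positive result means only about a one-in-six chance of actually having the disease, because the 99,000 healthy people generate far more false alarms than the 1,000 patients generate true positives.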
By the way, do not be tempted, when faced with a hostile press or a one-sided lobby, to assemble your own dodgy statistics – or dodgy science – to fight them off. The inevitable result would be that those with whom you are trying to communicate would see you as prejudiced and/or adversarial, and you might also then fail to pay sufficient attention to perfectly reasonable arguments from ‘the other side’.
You should also beware of relying too heavily on cost benefit analyses and other complex models. They can illuminate difficult decisions, but they are no substitute for careful analysis. And remember that judges, in Judicial Review proceedings, will take no notice of black box models whose workings cannot be explained. As John Kay noted in his FT column, commenting on the decision to proceed with a third Heathrow runway:
Roskill made a pioneering and widely praised attempt to use cost benefit analysis to define the relevant issues. By the time the Airports Commission reported in 2015, this modelling exercise had morphed into a monster, a black box with trailing wires whose processes no one could understand, and which offered endless numbers but no insight. Such over-specified and convoluted models are used as rationalisations for decisions that have in reality been taken on quite different grounds.
The power of modern computing, far from facilitating good decision making, gets in the way. Consultants are dispatched to find supportive numbers. This happened with HS2, the proposed high-speed link to Birmingham, and for years a policy in search of a justification. Competing cost benefit analyses yield the recommendations their sponsors want to hear. We have policy based evidence, not evidence based policy.
These spurious impact assessments provide cover for the increasingly superficial basis on which policy is really made. ... The good policy is one that makes a good headline. We suffer from what King and Crewe describe as a “deficit of deliberation”.
Moving beyond science, models and statistics, you must remember that it is unfortunately in the nature of our society that most correspondents, and most of the people that we meet, will present a one-sided view of an issue. They will draw attention to all the facts and arguments which support their case while failing, either deliberately or through sheer conviction, to take account of inconvenient facts or opposing arguments. But as you gain experience, you will quickly learn to detect the pure advocate or bullshitter.
Take care, therefore, not to be too trusting and bear in mind the famous warning that ‘He would say that, wouldn’t he!’ No one who is applying for a grant will tell you that they will in fact go ahead even if they do not get it, and no businessperson will tell you that the principal purpose of their latest acquisition is to build market power. Similarly, most people are reluctant to admit their errors, and their reluctance will be in proportion to the seriousness of their error. Therefore, if you are questioning the propriety of someone’s behaviour, be cautious about attaching significant weight to the views of the person being questioned. Find out the facts and let them speak for themselves.
The same applies, but less strongly, to professional advisers. Lawyers, accountants and merchant bankers are employed by their clients to persuade you to do certain things. They will usually tell you the truth, but seldom the whole truth. They will also sometimes imply, and indeed believe, that their opinion (e.g. about the viability or prospects of a company) is a fact. If your instinct is to the contrary, then rely upon that instinct, at least to the extent of probing further.