Learning from errors could bring statistics into focus
The Soccer boffin's weekly dose of betting wisdom
Polls for UK elections and referendums come with a claim: typically, that 19 times out of 20 they will be accurate to within two per cent. Since 1979 the error has been greater than two per cent in eight contests out of 11. The average discrepancy was five per cent.
Nate Silver, a well-known polls analyst, thinks the real margin of error is much larger than the two per cent claimed. He reckons that 19 times out of 20 UK polls are accurate to within 15 per cent - that is, 15 per cent one way or the other.
Polls conducted in the five days after parliament voted for a June 8 general election gave the Conservatives a lead over Labour that averaged 20 per cent. So perhaps we can conclude that there was a good chance the Conservatives led Labour by somewhere between five and 35 per cent (20 + 15 = 35, and 20 – 15 = 5).
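That arithmetic can be sketched in a few lines. The function name and the default 15-point margin are taken from Silver's estimate above; everything else is illustrative:

```python
def poll_range(lead: float, margin: float = 15.0) -> tuple[float, float]:
    """Return the plausible (low, high) interval around a polled lead,
    given a symmetric margin of error in percentage points."""
    return (lead - margin, lead + margin)

# A 20-point polled lead with a 15-point margin of error:
low, high = poll_range(20.0)
print(low, high)  # 5.0 35.0
```

The same one-liner works for any claimed margin: pass `margin=2.0` to see how much narrower the pollsters' own claim is.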
You might say that is a wide range. I would agree with you - but I would add that I am glad to know.
Last week I wrote that most scientific research published in academic journals is wrong. Researchers usually claim that there is only a one in 20 chance (five per cent) that they could have got their results by accident. The real chance seems to be much higher - in all likelihood more than 50 per cent and potentially a lot more.
When you read a claim of a scientific breakthrough, what should you think? If you know nothing else, you should think: "That might be true, but it is probably not."
We should always want to know what data means. Often what it means is not what it says. Therapists learn to listen for what their clients are telling them, which is not the same as merely hearing the words they speak.
Assistant referees in the Premier League get 98 per cent of offside calls right. Why are they so good? Because they learn from the calls they get wrong. They watch them back on video and try to figure out why they made the mistake. Then they develop rules of thumb - what psychologists call heuristics.
In a Sky Sports documentary Jamie Carragher and Gary Neville spent a day with Premier League referees and assistants. They were given a flag and told to wave it when they thought a player on a practice pitch was offside. Mike Mullarkey, coach of assistant referees, gave this advice: "It's probably got to look a metre offside to know that it will then be level. So if it looks less than a metre on a quick crossover you can be certain it's gonna be onside."
That was a perfect illustration of how to learn from data - in this case the evidence of your eyes - and understand what it means. You look at a player and think he is a metre offside. Hours in front of a video screen have taught you that this means the attacker was probably level with the last defender, and therefore onside.
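Mullarkey's rule of thumb can be written down as a tiny heuristic. The one-metre threshold comes from his quote; the function name and return values are illustrative:

```python
def assistant_call(apparent_offside_metres: float) -> str:
    """Mullarkey's heuristic: an attacker must *look* about a metre
    offside before you can trust that he really is beyond the defender.
    Anything less probably means he was level - and level is onside."""
    return "flag" if apparent_offside_metres >= 1.0 else "no flag"

print(assistant_call(1.5))  # flag
print(assistant_call(0.5))  # no flag
```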
I try to do the same kind of thing with football statistics when evaluating bets. When I have a stat for a team I do not take it at face value. That is to say, I do not assume it will be reproduced in the next match. Instead I look through my records for other teams from the past who had similar stats, and then I look at what happened next.
All sorts of things might have happened, and probably for all sorts of reasons. What generally happened? In that way I try to develop an understanding of what the statistic usually means for what happens next.
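The lookup described above can be sketched as code. The records, field names, and tolerance here are illustrative inventions, not the author's actual data:

```python
def outcomes_for_similar(records, stat_name, value, tolerance):
    """From historical records, collect what happened next for teams
    whose stat at the time was within `tolerance` of the current value."""
    return [r["next_result"] for r in records
            if abs(r[stat_name] - value) <= tolerance]

# Illustrative history: a stat for each team, and their next result.
history = [
    {"shots_per_game": 14.0, "next_result": "W"},
    {"shots_per_game": 13.5, "next_result": "D"},
    {"shots_per_game": 8.0,  "next_result": "L"},
]
print(outcomes_for_similar(history, "shots_per_game", 14.2, 1.0))
# ['W', 'D']
```

Tallying that list over a large enough history gives a rough answer to "what does this statistic usually mean for what happens next?"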
Michael J Mauboussin is head of global financial strategies at Credit Suisse. In a book called The Success Equation he wrote: "Statistics are widely used in a range of fields. But rarely do the people who use statistics stop and ask how useful they really are."
The most important question about data is rarely asked: "What does it mean?" Often the answer should be: "Not what it says."
Money talks a little less loudly further down the football pyramid
Harry Redknapp was right, up to a point. He said: "It's not that difficult to put together a good team in the Championship, if you know what you're doing. You can go out and pick up players, you haven't got to spend fortunes."
He spoke when he was appointed Birmingham manager for the last three games of the season. He said he hoped to keep them up, then stay for next season and push for promotion.
There is always a relationship between pay and performance, but it becomes progressively looser as you move down the football pyramid. And there is a good reason why.
How many footballers are good enough to play for the Premier League winners? How many are good enough to play for the Sky Bet Championship winners? A lot more. Some will already be playing at a higher level in the Premier League, but others will not.
There are more ways of putting together a title-winning team in the Championship than in the Premier League. And there are more ways of assembling a promotion-winning squad in Sky Bet League One than in the Championship. And so on.
Money becomes less important, but not unimportant.
Redknapp won the playoffs in 2014 as manager of Queens Park Rangers, who, relatively speaking, had the highest wage bill of any Championship club in a decade: 15 per cent of the Championship total for that season. One club out of 24 accounted for more than one-seventh of the aggregate payroll.
That was a huge slice, even if it did include promotion bonuses. Much of it was a legacy from the previous season, before Redknapp arrived, when Rangers got bad value for money on signings in the Premier League.
Rank teams from one to 24 on league position and on wages. The average league position of Championship promotion-winners in seasons 2006-07 to 2015-16 was between two and three. The average wage rank of those promotion-winners was five. Without promotion bonuses it would have been lower, but perhaps only as low as nine.
Promotion winners tend to have higher than average payrolls. Money matters less, but it still matters.