Trust me, I’m an expert

We all fall for the “trust me, I’m an expert” line once in a while. That’s because the ‘expert’ knows things that we don’t, or at least claims to. ‘Experts’ often know how to use tools and techniques to analyse and present information or opinions that the rest of us do not, so we are at a disadvantage. The open scientific method relies on some peer review of an analysis and its conclusions by other ‘experts’, but the layperson is excluded from this process and simply has to trust them.

Most people are familiar with the aphorism “lies, damn lies and statistics”, attributed to Disraeli, Bagehot or Balfour (to name a few). In the last couple of days this chart by Reuters, using data from the Florida Department of Law Enforcement, has been flying around Twitter.

So what does this chart tell you? At first glance you might judge that gun deaths have fallen since the introduction of the ‘Stand your ground’ law in 2005; after all, the graph line makes a steep dip after this point. Now look again. The y-axis scale is one of those Bang & Olufsen things where ‘less is more’: put simply, the scale has been inverted, so larger numbers are at the bottom. Gun deaths actually increased by about 60% after the “Stand your ground” law was enacted. It’s almost beyond belief that anyone who wanted to provide a neutral portrayal of these gun death statistics would create a chart like this.
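The inverted-axis trick takes one line of code to reproduce. Here is a minimal sketch in matplotlib, using made-up yearly counts (not the actual FDLE figures), showing how flipping the y-axis makes a rising series read as a plunge while the underlying numbers are untouched:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Invented counts for illustration only -- NOT the real FDLE data.
years = [2003, 2004, 2005, 2006, 2007]
deaths = [520, 510, 530, 740, 810]  # rises sharply after 2005

fig, ax = plt.subplots()
ax.plot(years, deaths)
ax.invert_yaxis()  # larger numbers now sit at the BOTTOM of the scale,
                   # so the post-2005 increase looks like a steep "dip"

# The data itself is unchanged: the rise from 2005 to 2007 is still there.
pct_increase = (deaths[-1] - deaths[2]) / deaths[2] * 100
```

The axis direction is pure presentation; `pct_increase` comes out the same whichever way the chart is drawn, which is exactly why an inverted scale misleads the eye without technically falsifying anything.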

Business Insider published an article on the relationship between gun deaths and the ‘Stand your ground’ law which included a different presentation of the same data.

Chart reformatted by @PFedewa in style of Reuters original

This version gives a very different first impression.

Now you don’t have to be an expert statistician to see the trend in the results: it certainly appears that the spike in gun deaths is linked to the enactment of the legislation. But maybe it’s more complicated. Perhaps other variables had an influence, e.g. a big reduction in gun prices, a robbery at an armaments warehouse, or a surge in gang warfare (I’m not suggesting that any of these things actually happened), or maybe several factors combined to produce the spike. As data becomes more complex, you need not only competence in statistical techniques to analyse it but also a commitment to an open and impartial assessment of all of the data, as opposed to presenting limited data to validate a hypothesis that supports your (or your organisation’s) agenda.

The more complex the data, and the more specialised the knowledge needed to analyse it, the more most of us are inclined to trust the experts. We don’t really have a choice. However, when we trust an expert, perhaps we should ask who paid them, what agenda they had and who else has peer-reviewed their analysis.

I’m not a statistician (even though I did a stats module in my degree 40 years ago) and you are probably not reading this post for my observations on statistical analysis and presentation. You want some maps, don’t you?

There is always plenty of comment from geographers and cartographers (someone more ‘expert’ than me could explain the difference) about bad maps, atrocious cartography and outright misleading maps. My friend Ken Field has been quite forthright about some examples on his blog Cartonerd, particularly on the subject of normalising choropleths, and he suggested this article to illustrate a mapping equivalent of the Florida gun crime chart. Back in 2012 the BBC published an article about the geographic distribution of Britain’s medal winners; here is the original map:

Distribution of Olympic 2012 medal winners Source: BBC https://www.bbc.co.uk/news/magazine-19260777

Look at this and you might conclude that the most athletic part of the country was London, but The Beeb have made the common mistake of not normalising their data. In this case normalising by population would be the best way of presenting the data, i.e. comparing medals per head of the population of a town or county rather than absolute values. Ken reworked the data and produced these two maps, which tell a somewhat different story to the original BBC map.

Distribution of Olympic 2012 winners normalised by population vs absolute counts. Source: Ken Field https://cartonerd.blogspot.co.uk/2012/08/does-it-matter-if-map-is-wrong.html

Ooh look, it seems that the highest proportion of Olympic winners by population comes from north Wales and Scotland rather than London, although in absolute terms the largest number of medals was won by Londoners.
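The normalisation itself is simple arithmetic. Here is a minimal sketch, using invented place names and figures purely for illustration (not the actual 2012 medal data), showing how dividing medal counts by population can reverse the ranking that raw counts suggest:

```python
# Invented figures for illustration only -- not the real 2012 medal data.
places = {
    # name: (medals, population)
    "Big City":   (40, 8_000_000),
    "Small Town": (3,     30_000),
}

# Absolute counts: the big city "wins" easily.
by_count = max(places, key=lambda p: places[p][0])

# Normalised: medals per 100,000 residents tells the opposite story.
per_100k = {p: m / pop * 100_000 for p, (m, pop) in places.items()}
by_rate = max(per_100k, key=per_100k.get)
```

With these toy numbers the big city has 0.5 medals per 100,000 residents against the small town’s 10, so the “winner” depends entirely on whether the mapmaker normalised — which is the whole point of Ken’s reworked maps.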

I doubt that the BBC would claim to be cartographic experts. Unfortunately most people don’t know that. They rely upon and trust the BBC as a source of truth, and instinctively accept the maps on the BBC web site as correct. I would guess that most people see maps on the BBC web site, or in any other media or publication, and, lacking an understanding of how they are made and what might be wrong with them, assume that the maps are accurate and offer a valid interpretation of the underlying data.

So the next time someone presents you with a map that seems to ‘prove’ something or tells a story, ask a few questions of the author before you accept it. And if you can’t ask those questions, at least remember that the author might not be as expert as they claim to be.

When I asked Ken Field for an example to quote in this post, he kindly offered to make a map out of the Florida gun crime chart at the top of this post. Here it is:


Thanks to Kenneth Field

Sweet, thanks Ken.