When police.uk launched earlier this week I wanted to write about the debacle of it getting overloaded, the aggregation of the data, the approach to visualisation, the openness (or otherwise) of the data, misrepresentation and misunderstanding, the motivations behind it, some possible outcomes, and what this all might mean for open data. But it has been a busy week, and by now it seems that nearly everything that could be said has already been said.
Here is a selection of the blogs and news with a few supplementary thoughts from me.
The geofolk were understandably concerned about the poor presentation of the data and the misunderstandings that would be created due to the somewhat crude method of aggregation. Alex Singleton wrote
“…there appear to be some serious representational issues in this new mapping system which are not clearly documented and could be very misleading for the ill informed …
Outside of issues related to how you appropriately position a point for a very long road, if the street is going to be the aggregating unit for the data, then this should also be used for the visualization. For example, roads could have been variably colored for different rates of crime (rates not counts… this is another representation issue entirely!!)…
The problem with this website as it stands is that crimes are easily misinterpreted as happening at very specific locations…
These basic representational issues are typically covered in an undergraduate syllabus with a GIS component.”
Ken Field on the BCS blog added
“This sort of utterly inept mapping is precisely what enrages cartographers. Representing data effectively to report geography properly and communicate reality is what we do. It is also what any basic GIS course or cartography course would teach. The annoyance is borne of the frustration of seeing a site that will now be viewed widely and be completely misinterpreted because of erroneous mapping techniques…
The use of totals and a single point is poor to say the least. If the desire is to build in fuzziness to deal with the privacy issue then use rates or proportions; use areas instead of points; or use some form of road-based linear symbol that varies by thickness or colour.”
That just about sums up the concerns of many who have an understanding of the data and the techniques available. I know that several people within the policing community expressed concerns about the aggregation methodology being used and the potential consequences, but in the current climate of public-service bashing no one wanted to stick their head above the parapet.
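To make the rates-versus-counts point concrete, here is a tiny sketch with entirely invented street names and figures (nothing to do with the real police.uk data), showing how the street with the highest raw count need not be the one with the highest rate, and how a rate could drive a simple road-colouring scheme instead of a single pin of totals:

```python
# A made-up illustration of the "rates not counts" point: street names and
# figures are invented, and this is not the police.uk data model.

street_crimes = {        # crimes recorded in one month (hypothetical)
    "High Street": 42,
    "Lancelot Place": 35,
    "Acacia Avenue": 6,
}

street_population = {    # resident population per street (hypothetical)
    "High Street": 3500,
    "Lancelot Place": 250,
    "Acacia Avenue": 900,
}

def rate_per_1000(street: str) -> float:
    """Crimes per 1,000 residents, rather than a raw count."""
    return 1000 * street_crimes[street] / street_population[street]

def colour_for_rate(rate: float) -> str:
    """Crude banding for a road-coloured (linear symbol) map, instead of one pin."""
    if rate < 10:
        return "green"
    if rate < 50:
        return "amber"
    return "red"

for street, count in street_crimes.items():
    rate = rate_per_1000(street)
    print(f"{street}: count={count}, rate={rate:.1f}/1,000, colour={colour_for_rate(rate)}")
```

On these made-up figures the street with the most crimes by count is not the one with the highest rate, which is exactly the distinction a pin full of raw totals hides.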
Several people, including The Register and Freesteel, have commented on the alleged £300k spent with Rock Kitchen Harris, the advertising agency that built the site. Incidentally, did anyone spot the ITT for this site? I wonder who else bid and what kind of approach they would have taken, or was this a stitch-up?
£300k does sound like a lot of dosh for what should be a relatively simple backend: aggregating data from 40-odd police forces, which should arrive in a standard data model, and then representing it as clustered points on top of Google Maps. There are several companies out there that could have built this for a fraction of the cost, including one I am associated with. The Register points at an alternative site based on the data API called Crimesearch; apparently the developer knocked it up in a day, and to be honest it looks like it! It shows even less than the official site and the points aren’t even in the right place; it is very, very poor. If anything this shows why you do need to spend time and money presenting this data properly. Hopefully someone will do something better with this open data to demonstrate what could have been achieved.
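To give a sense of how simple the core aggregation really is, here is a rough sketch. It assumes each force supplies records carrying a category plus a pre-assigned street identifier and centre point, a simplified stand-in for whatever standard data model the forces actually use, and it is in no way the real police.uk pipeline:

```python
from collections import Counter, defaultdict
from dataclasses import dataclass

# A rough sketch of street-level aggregation: collapse individual records
# into one clustered point per street. Forces, categories, street ids and
# coordinates below are invented for illustration.

@dataclass
class CrimeRecord:
    force: str
    category: str        # e.g. "burglary", "anti-social-behaviour"
    street_id: str       # the aggregating unit
    street_point: tuple  # (lat, lng) of the street's nominated centre point

def aggregate_by_street(records):
    """Return one pin per street with a total and a per-category breakdown."""
    counts = defaultdict(Counter)
    points = {}
    for r in records:
        counts[r.street_id][r.category] += 1
        points[r.street_id] = r.street_point
    return [
        {"street_id": sid, "point": points[sid],
         "total": sum(cats.values()), "by_category": dict(cats)}
        for sid, cats in counts.items()
    ]

# Records from two forces collapse to one pin per street.
records = [
    CrimeRecord("Lancashire", "burglary", "st-001", (53.76, -2.70)),
    CrimeRecord("Lancashire", "anti-social-behaviour", "st-001", (53.76, -2.70)),
    CrimeRecord("Met", "burglary", "st-900", (51.50, -0.16)),
]
for pin in aggregate_by_street(records):
    print(pin)
```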
The other strand of comment has been about whether the publication of the aggregated crime data is a good thing or not. At a general level I think government has to be praised for making more data open to citizens and developers who may find innovative ways to use it. However, in this case I think it is reasonable to question the motivation behind the high-profile launch of the site (which led to it overloading and crashing) and the questionable aggregation and misrepresentation underpinning it.
Simon Jenkins in the Guardian wrote
“Someone must have gone up to the home secretary in the corridor and murmured she could get some good publicity by announcing “the meanest street in Britain”, even if it did turn out to be a residential backwater in Preston…
I note that May and her officials censored more delicate information, such as of white-collar crime. … To the Home Office, theft and antisocial behaviour are what the poor do to the rich, not the rich to the poor. The map is seriously rightwing…
The truth is that there is information, useless information and Whitehall statistics. The crime map displays the same bureaucratic syndrome as has blighted Britain’s schools and hospitals since the 1990s.”
I like the concept of a right wing map 🙂
Adrian Short commented
Perhaps apocryphally, Stalin said that it’s not who votes that counts but who counts the votes. Likewise, we should be hugely cautious about giving too much weight to official visualisations of data. As the policing minister Nick Herbert wrote today (my emphasis):
We live in the age of accountability and transparency. The public deserve to know what is happening on their streets, and they want action. By opening up this information, and allowing the public to elect Police and Crime Commissioners, we are giving people real power – and strengthening the fight against crime.
So what we’re looking at here isn’t a value-neutral scientific exercise in helping people to live their daily lives a little more easily, it’s an explicitly political attempt to shape the terms of a debate around the most fundamental changes in British policing in our lifetimes.
Transparency isn’t wrong. It’s absolutely vital to make a meaningful contribution to public debate, but we need to distinguish pseudo-transparency from the real thing.
I think they both make some good points.
League tables may not in themselves help to reduce crime, and deluging the police with conflicting requests to “put more resources into patrolling my neighbourhood” may not result in any overall net improvement (particularly with front-line resources coming under pressure while the Home Office spends money on sites like police.uk). However, there is a definite need for better publicly available information on crime, and a map can be a helpful way to present the data.
Small-area statistics normalised for population are a pretty good way of presenting a balanced view of crime, particularly if you can get some sense of how an area compares with others. Have a look at the Met Police site: not perfect by any means, but a better overview than a mass of aggregated pins. If someone could mash that up with the more granular aggregated “crime points” so people could drill in for more detail, maybe that would be a quick improvement.
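As a very rough sketch of that mash-up idea, assuming area-level population figures and the aggregated street pins were both to hand (all names and numbers here are invented):

```python
# A population-normalised rate per area for like-for-like comparison, with
# the granular street pins kept underneath for drill-down. Area names,
# populations and pins are invented for illustration.

areas = {
    "Ward A": {"population": 12000, "crimes_last_month": 84,
               "street_pins": [("st-001", 37), ("st-002", 29), ("st-003", 18)]},
    "Ward B": {"population": 4500, "crimes_last_month": 51,
               "street_pins": [("st-101", 33), ("st-102", 18)]},
}

def area_summary(name: str) -> dict:
    """Headline rate for comparison, with the street-level detail one click below."""
    a = areas[name]
    rate = 1000 * a["crimes_last_month"] / a["population"]
    return {"area": name,
            "rate_per_1000": round(rate, 1),
            "drill_down": a["street_pins"]}

for name in areas:
    print(area_summary(name))
```

On these invented numbers Ward A has more crimes in total but the lower rate per 1,000 residents, which is exactly the like-for-like comparison a mass of pins does not let you make.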
In the long term, the presentation of crime statistics needs to be thought through by people who understand the data and how to present it in ways that will not cause a flurry of misleading news about the most dangerous street in Preston or Plymouth or wherever. I wonder how long it will be before a minister, or a friend of one, finds their house price blighted because it is mistakenly labelled as a crime blackspot? I might be a bit upset if I lived in Lancelot Place in Knightsbridge, for example.
Postscript: If you want to see the granddaddy of crime maps, have a look at Chicago Crime Maps and note the clear statement of what they are doing in the About page.
2 thoughts on “Lies, damn lies and Crime Maps”
Some reasons why the site fell over when it launched (apart from the stupid publicity on the national news):
https://www.brunton-spall.co.uk/post/2011/02/04/failure-at-scale/
No-one seems too happy with these maps for a variety of reasons, but at least the underlying data is (quite rightly) available for anyone who wants to try a different approach – see for instance Placr’s take, which I spotted via Ed Parsons’ blog: https://apps.placr.co.uk/crime/
For me, it’s the £300k price tag that’s just staggering. The joy of ‘Web 2.0’ platforms is that they are collections of open and inexpensive software and services that make it *easy and cheap* to start new sites, services, products and companies (I can attest to all of this). If you’re going to use Google Maps to visualise data, you really need to take a leaf from the same company’s approach to building services: ‘release early, release often’, ‘fail well by listening, learning and iterating’ and (more controversially 🙂) ‘get the community to do lots of your work for you’.
My guess is that the budget for this is now spent, the service launched in a big (crash and) bang, and that’s it… delivered, done. It’s how software gets procured and delivered, right? I just wonder what the data API, released with a call for innovations and a small prize fund, might have achieved instead.