Posted: April 23rd, 2012 | Author: steven | Filed under: Open Data, Value | 2 Comments »
Open - Thanks to mag3737 http://www.flickr.com/photos/mag3737/
Last week the NAO published its Implementing Transparency report.
The objectives of transparency are summarised in the report as accountability, service improvement based on user choice and comparative data, and the much-sought but elusive economic growth from new products and services based on OpenData.
The Guardian, in an uncharacteristically scathing piece, summarised the report:
“Read between the lines of its [the NAO] report out today, Implementing Transparency, and you will see a government which has been chucking out tonnes of data, that no-one looks at and without a complete strategy. Oh and it’s cost an awful lot of money.”
Fair comment? I think so. It certainly seems that the rhetoric of transparency and accountability (possibly linked to the underlying ideology of reducing the size of government) has held sway over the bean counters in the Treasury. Of course, not all of the benefits of transparency can be put into a profit and loss account, but it does appear that common sense and value for money have not been important criteria in determining which data should be prioritised for release.
“As the scope of the transparency agenda has developed, the Cabinet Office has published examples of the benefits of public data initiatives to support the strategic case for transparency, for example on its Open for Business website, but has not yet systematically assessed the costs and benefits of the Government’s specific transparency initiatives.” (NAO)
The report raises concerns about the quality, completeness and suitability of the data that has been published to inform better choice and accountability. I have ranted extensively about the publication of crime statistics (here, here and here) and questioned whether accountability will work to the benefit of all groups in society.
“With regards to community accountability, the police crime map provides much more detailed recorded crime information than was previously available. However, additional information is still needed, for example on police activity and resourcing locally, for residents to hold neighbourhood police services to account more fully.” (NAO)
There has been little attempt to measure or assess the benefits of transparency apart from some limited statistics on website visits. We learn that data.gov.uk has had 1.75m visits since its launch in January 2010; unfortunately, over 80% of users “leave from either the home page or the data page on the website. This suggests that they are not accessing data during their visit …” One of the highlights quoted is the 47m visits to police.uk between February and November 2011, which sounds pretty impressive until the Guardian pours some cold water over this OpenData triumph:
“We were interested in that 47m figure for the crime maps site and tested it using Nielsen data. There is no guidance on what exactly constitutes ‘visits’ – is it page views or unique users? Our figures show that while the site did get a lot of visitors when it was initially launched in February last year – and had a brief peak during the England riots last year (ironically, the data on the site is all historical, so visitors looking for riot offences would have been disappointed), in December it appeared to only have 47,000 viewers, looking at 364,000 pages”
In the interest of balance, it should be noted that access to the Department for Education’s schools performance tool (note: the DfE’s own tool, not an application built by the private sector) increased by 84% in the period under review.
The report highlights three areas of risk – privacy, fraud and unintended consequences.
The privacy concerns have been extensively aired, and it seems that just removing names and addresses, or some simple aggregation, will not prevent data being cross-tabulated with other sources to deanonymise individuals. The government has commissioned an independent review into privacy and transparency; it will be interesting to see what response this review gets from the Cabinet Office, particularly with regard to the NHS data being offered to big pharmas.
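To illustrate the point, here is a toy sketch (entirely invented records, not any real dataset): even with names removed, a handful of quasi-identifiers can be enough to link an “anonymised” release back to individuals via another public source.

```python
# Toy illustration of deanonymisation by cross-tabulation.
# Both datasets are invented; real attacks work the same way but at scale.

anonymised_health = [
    {"postcode_district": "SW1A", "age": 47, "sex": "F", "condition": "asthma"},
    {"postcode_district": "M1",   "age": 32, "sex": "M", "condition": "diabetes"},
]

public_register = [
    {"name": "Jane Doe",   "postcode_district": "SW1A", "age": 47, "sex": "F"},
    {"name": "John Smith", "postcode_district": "M1",   "age": 32, "sex": "M"},
]

# Join on the quasi-identifiers that the "anonymisation" left behind.
for record in anonymised_health:
    for person in public_register:
        if all(record[k] == person[k] for k in ("postcode_district", "age", "sex")):
            print(f'{person["name"]} -> {record["condition"]}')
```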
Whilst the economic benefits of OpenData have been slow to materialise within the private sector, some enterprising villains have apparently already identified new opportunities to defraud the public sector using OpenData:
“… with increased transparency around contracts and payment details – fraud attempts to a value of £7 million directly related to transparency releases have been found in local government …” (NAO)
On the economic front, government is flaunting a new, improved and bigger estimate of £16bn for the economic contribution of public data. The source for this estimate is (once again) the opening up of geospatial data in Australia and New Zealand:
“A recent review of the literature on reusing public sector information put the value of public sector information in the European Union at around €140 billion a year. The author based the estimates on extrapolating from studies of the total economic impacts of geospatial information in Australia and New Zealand. Based on this review, the Government derived an estimate of £16 billion for the current total economic value of UK public sector information.” (NAO)
This sounds like “Pollock squared” to those of us of a sceptical bent, and the NAO goes on to point out an alternative, much lower, estimate from the OFT:
“The Office of Fair Trading produced an earlier study in 2006 on the commercial use of public sector information. They surveyed more than 400 public bodies to identify the income generated from public data release. They also commissioned research to estimate the economic value of UK public sector information. Based primarily on the survey results, the contractors estimated this value to be about £590 million in 2005.”
“Both studies have limitations. The Office of Fair Trading report notes that top-down approaches, such as that used in the EU-wide estimate, tend to overstate the economic value of public service information. This is because they do not factor in reasonable substitutes available if that information does not exist or is prohibitively expensive. Furthermore, the assumption of similar public sector information markets is crude given significant known differences between countries. However, the Office of Fair Trading report is likely to understate the economic value of public sector information.” (NAO)
So we don’t really have a good idea of the potential economic benefit. Fair enough: this is part of the “new economy”, we are sailing in uncharted waters, and we should have some faith that economic benefit will follow on from the release of OpenData. But surely that would suggest a phased approach to the programme of releasing OpenData, allowing government to assess the benefits from accountability and new businesses along the way and to keep some control over costs? data.gov.uk currently costs £2m p.a. to run, and there are also departmental costs to release data.
“The costs [per department] range from £53,000 to £500,000. These represent a lower bound for the cost of standard releases because they only capture staff costs and do not include, for example, costs of upgrading IT systems or payments to contractors.” (NAO)
No doubt my more technical friends will ridicule the higher-end costs to publish data, but inevitably there are staff costs involved in identifying, extracting and preparing data for publication. Anonymisation of personal data could prove even more costly, particularly if private sector consultants are needed.
The benefits sought from the transparency agenda, accountability and service improvement, are almost unarguable and are probably worth the £15m a year we may be spending on releasing data (£2m for data.gov.uk, plus 25-odd departments at £250k each, and perhaps the same again for local government and others). But let’s not be kidded by the unproven economic arguments of OpenData champions in the academic and startup communities into believing that any data requested should be released regardless of the cost or anticipated usage.
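For what it’s worth, that £15m is just a back-of-envelope sum; here it is as a sketch, using only the rough figures above:

```python
# Back-of-envelope estimate of annual OpenData publishing costs (all figures rough, in £m).
data_gov_uk = 2.0                    # running data.gov.uk
departments = 25 * 0.25              # ~25 departments at ~£250k each = 6.25
local_gov_and_others = departments   # "perhaps the same again"

total = data_gov_uk + departments + local_gov_and_others
print(f"~£{total:.1f}m a year")      # ~£14.5m, call it £15m
```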
If this were a school report (we would probably be at the end of the first year of secondary school), I can imagine the tone of the head of year’s summary:
“Francis has made a good start at the OpenData Academy. He is an enthusiastic pupil who has displayed a positive attitude to his subject but needs to pay more attention to measuring the usage and benefits of OpenData. In the coming year he should reinforce his enthusiasm for transparency and accountability with a sound economic framework that does not depend upon unsubstantiated forecasts of an economic miracle (advice that I also gave to his cousin George).”
Posted: April 4th, 2012 | Author: steven | Filed under: Open Data, Open Source, Open Standards, OSM | 3 Comments »
Open or Closed or a bit of both? Thanks to wiccked http://www.flickr.com/photos/wiccked/
Open Source Geo and Open Geospatial Consortium Standards have been active for over a decade, OpenStreetMap since 2004, and OpenData is the new kid on the block. But something seems to have shifted: you can barely go a day in the UK geoworld without stumbling on an event, an article, a vendor or a consultant talking about Open something. Why has Open become the badge that everyone wants to flaunt?
Not everyone who claims to be Open actually is, but with loads of different connotations to the Open badge there is plenty of room for interpretation and argument.
I am going to be exploring whether Open Source, Standards, Data and StreetMap have achieved critical mass, and how interdependent they are, at the AGI’s Northern Geoconference on the 3rd of May at the Manchester Museum, which sounds like a fun venue.
Hopefully I will stir up some debate about being Open. If you have any thoughts before the 3rd, share them with me here and, in the spirit of openness, I will give you a shout-out at the event.
Posted: April 4th, 2012 | Author: steven | Filed under: GIScussions | Comments Off
Thanks to Caro Wallis http://www.flickr.com/photos/carowallis1/
Those of you who have been avid followers of my blog will have noticed the paucity of posts for the last couple of months. I have been juggling wrapping up two big client projects (now completed) with some quite difficult family problems (as my cousin said, “at our age we are the meat in the sandwich”).
Normal geobabbling and ranting service will now resume with some staccato posts on topics that have been on the back burner for a while.
Posted: January 11th, 2012 | Author: steven | Filed under: Business, geo.me, Google, OSM, Value | 5 Comments »
Free, thanks to Brad Stabler http://www.flickr.com/photos/bstabler/
Over the holidays there was a flurry of excitement, particularly among OpenStreetMap fans, prompted by Ed Freyfogle’s announcement that Nestoria were switching from the Google Maps API to an OpenStreetMap tile service provided by MapQuest. The switch followed Google’s announcement in April of last year that they would be introducing volume limits on usage of the free Maps API:
“We are also introducing transaction limits on the number of maps that may be generated per day by commercial Maps API web sites or applications. These limits are documented in the Maps API FAQ and will take effect on October 1st 2011. Commercial sites that exceed these limits may be required to purchase additional transactions or a Maps API Premier license…
…Not for profit applications and applications deemed in the public interest (as determined by Google at its discretion) are not subject to these limits.” (My italics)
Nestoria have been a long-term supporter of OSM and, as Ed says, the introduction of the new charging regime was the trigger to move away from Google:
“Having always envisioned that we would someday move to OSM, this was the nudge that pushed us over the cliff.”
The new limit is 25,000 map sessions per day averaged over a quarter, which equates to roughly 9m sessions per year. Google expect 0.35% of the sites using the free Maps API to be impacted, which I reckon is between 1,200 and 1,400 sites. Big respect to the Nestoria guys for growing their business to this level, but perhaps a note of concern for the many passionate advocates of geo (myself included) on the economics of incorporating map services in applications where revenue per page is at the low end of the spectrum.
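For the curious, the arithmetic behind those numbers is below; note that the total-site figures are my inference from Google’s 0.35%, not anything Google has published:

```python
sessions_per_day = 25_000
print(f"{sessions_per_day * 365:,}")   # 9,125,000 - roughly 9m sessions a year

# Working back from Google's "0.35% of sites affected": my 1,200-1,400 reckoning
# implies something like 350k-400k active commercial Maps API sites in total.
for assumed_total_sites in (350_000, 400_000):
    print(f"{assumed_total_sites:,} sites -> {assumed_total_sites * 0.0035:.0f} affected")
```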
I don’t know why Google chose to introduce these limits; it could be about the costs of delivering the service (hardware, bandwidth/power, or data licences where Google is using third parties), or it could be a drive by the commercial team to increase sales of Enterprise licences. Probably a bit of both. It sounds like the commercial approach may have been a bit less than optimal, but no doubt Google will adjust their approach (and possibly their pricing) if they want to retain some of these large usage sites, or we may see the more aggressive introduction of context-sensitive advertising directly into the map, although we have heard little more about the trials of map ads in Australia.
OSM advocates have been understandably chuffed that Nestoria considered OpenStreetMap to be as good as or better than the data offered by Google and its providers; no doubt OSM has come of age as a viable royalty-free source of geodata. In my opinion, the focus on open vs traditional geodata ignores the broader scope of Google’s Maps API, which offers quite a bit more than just a map tile service – geocoding, directions, Streetview, local search and, perhaps most important, colossal scalability. What interests me about Nestoria’s decision is the choice of MapQuest rather than a self-hosted tile service based on OSM, which they explained:
“When we realized it was time for us to make the move we faced one big decision – should we use someone else’s OSM tiles or should we render and serve our own? We called in an expert to advise us. OSM expert, and former Nestoria blog interviewee, Andy Allan … Rendering has the advantage that you can make the map look exactly the way you want. When done well this can produce phenomenal results … but unfortunately it’s no small technical undertaking, especially when we’ve also got a property search engine to run.
We concluded the only viable path was for us to leave the rendering and serving to experts and use someone else’s OSM tileset… Luckily however several companies have stepped in to fill this gap – CloudMade has for several years offered an OSM tile layer for all to use. In 2010 MapQuest released a similar service. While we are longtime fans of CloudMade (we use their tiles on our Where Can I Live? service), for their global infrastructure and speed we decided we’d prefer to use MapQuest’s OSM tiles.
What’s in this for MapQuest? I imagine that the publicity and goodwill are worth quite a lot to them (but that was what many people thought was the motivation behind Google’s free API for the last six years). Presumably the cost and complexity that Nestoria identified are shared across a large user base for the tile service, but as uptake grows the costs of servers and bandwidth must become a factor in the economics of their free tile service. I wonder how long it will be before the bosses at AOL (ultimately the very commercially astute Arianna Huffington) start looking to monetise the usage of their infrastructure? Will it be “in map” advertising, advertising alongside the map, or is there some other model? At least if you have architected your service in the way that Nestoria have, you can quite easily switch from the MapQuest service to someone else’s if the terms change in the future.
I can imagine a scenario where a number of the 1,200-odd commercial sites switch to MapQuest or CloudMade, only to find that these guys, with much shallower pockets than Google, struggle to support their levels of usage and functional needs (routing, for example, requires quite a lot of processing) on a free model. Of course they can then start to roll their own map services with something like MapBox, or build their own tile service and host it in the cloud, but then they will start to incur the direct costs plus the overheads of supporting their service. The guys from MapBox promoted their service as an alternative on Twitter:
“With our new add-on packages, you can easily bump up your storage + bandwidth ds.io/A12Drf And it’s 12x cheaper than Google Maps”
When I queried this they quoted their cost per thousand extra map views at $0.32, compared to Google’s $4.00, but I think that ignores Google’s free service offering 2.25m maps per quarter before you hit the charge, and the need for quite a lot of technical skill and understanding to set up a tile service. That said, MapBox is an interesting open source based service that would appear to offer a low-cost solution to serving tiles, with the benefit that you can style those tiles to your own designs. I’d be interested to understand the economics of this kind of business model; presumably over time prices will get driven down pretty low but will never reach free. Look at web hosting: prices for a small site can be as low as £2/month and you can get quite a powerful server from Amazon for about £35/month, but as far as I know no one is offering large chunks of infrastructure and bandwidth for nothing.
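To put the quoted rates side by side, here is a rough sketch using only the per-thousand prices mentioned above; it ignores MapBox base-plan fees, Google’s exact session accounting and any volume discounts:

```python
# Rough annual cost comparison at a given traffic level, using the quoted rates only.
maps_per_day = 100_000
google_free_per_day = 25_000      # averaged over a quarter (~2.25m maps/quarter)

google_rate = 4.00                # $ per 1,000 maps over the free allowance (as quoted)
mapbox_rate = 0.32                # $ per 1,000 maps (as quoted; ignores base plan fees)

google_cost = max(0, maps_per_day - google_free_per_day) / 1000 * google_rate * 365
mapbox_cost = maps_per_day / 1000 * mapbox_rate * 365

print(f"Google: ${google_cost:,.0f}/year")   # $109,500
print(f"MapBox: ${mapbox_cost:,.0f}/year")   # $11,680
```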
So is this the beginning of the end of Free? Not quite yet, but maybe it’s time to recognise that a free ride may not last for ever and perhaps we need to think a bit about what price we are willing to pay for a map service. This comment on the Nestoria blog from another business affected by the change in Google’s usage terms summed it up for me:
“In our case, that meant paying $200,000-$400,000 a year for maps…
Google got their pricing off by at least an order of magnitude. I don’t know many companies making more than a couple of dollars per thousand views. Had they charged us $0.20 per 1,000, we would have gladly paid.
In the end, pricing is a trade-off between the value to the purchaser and the costs of production and delivery; even competitive pressure cannot reduce long-term prices to zero unless providers find a different revenue model. That could bring us back to “in map adverts”, and I doubt many large commercial users of maps would want to host adverts for their competitors in return for free or very low cost maps.
I wonder whether, within a couple of years, a modest fee of say $250-500 per million maps will become the going rate for a high-availability map service, with some additional costs for directions and other features. Time will tell.
Posted: January 9th, 2012 | Author: steven | Filed under: Crowd, Open Data, Usability, Value | Comments Off
Yup, I'm grumpy. Thx to Chris JD http://www.flickr.com/photos/chris_jd/
Last year I commented on the AA’s Streetwatch survey of potholes, questioning the methodology used.
I contacted the AA to see whether they would let me have a look at their data, to see whether there were any other conclusions that could be drawn from this exercise in Citizen Science. After several emails and some chasing, I eventually got a reply:
“This survey was never intended to be a scientific investigation of local road and footpath problems, their possible causes or reasons for improvement or deterioration. It was, as you allude to, a ‘big society’ approach to reviewing local issues around our AA Streetwatchers neighbourhoods. As you make clear there are a number of ways of carrying out such work but our approach was based on our AA Streetwatchers largely doing this themselves and us reporting their findings. The results merely provide a ‘snapshot’. Unlike the AA Streetwatch 1 survey this time we reported the observations by regions – hence the regional rather than postcode maps. This is only our third survey and our aim is to increase participation which we hope will produce a useful ‘dip-check’ of local conditions. For your information the numbers reported are just counts (with obvious error reports removed).”
Note: no offer to provide the data for alternative interpretation.
Fair play to the AA: they accept that this was not a “scientific investigation”. But in their original press releases the AA talks about a “deluge” and a “plague of potholes”, and the summary of the report gives an impression of precision which they cannot back up with details of sample selection, sample size or methodology. That seems to me to be claiming some kind of serious or scientific value for their “snapshot”, and I doubt they will put out a further press release to correct any misinterpretation that may have occurred.
So next time the AA gathers a tiny sample of subjective data on road condition, I hope they will be clear that it is not intended to be a scientific investigation and that the results are a snapshot which may not provide a balanced or meaningful assessment of road condition or the effectiveness of road maintenance processes relative to weather and budget. But then the exercise wouldn’t get much publicity or serve whatever agenda the AA may be seeking to promote.
Posted: January 6th, 2012 | Author: steven | Filed under: Derived Data, Local Government, Open Source, Value | 1 Comment »
I should have been looking where I was driving not at my satnav. Thanks to http://www.flickr.com/photos/endlisnis/
I was just thinking that it’s a while since I have been prompted to write a blog post, and then this morning the Today programme had a short feature on Norman Baker’s proposed Satnav Summit and I’m awake and frothing (an awful thought that you may not want to dwell on for too long).
Apparently, while the austerity agenda is savaging public services and job prospects across the private and public sectors look grim, the DfT thinks we need a “summit” to resolve problems with satnav systems.
“Norman Baker will host the Government’s first ‘Satnav Summit’ to thrash out solutions to end the misery caused when lorry and car drivers follow out of date directions from their satnavs.”
Now, when did the government get involved in the “misery” caused by drivers following out-of-date road atlases or A-Zs? Is anyone really “miserable”, or is this an attention-grabbing initiative prompted by hype and anecdote?
Of course the prevalence of satnav in cars and trucks does present an opportunity for smarter journey routing, and could help prevent the occasional truck getting stuck under a bridge or in a too-narrow village street. A good starting point would be for highways authorities to publish a simple dataset of bridge heights, and perhaps one of width restrictions; it would be relatively simple to conflate these with the map data used within satnav systems, and it would avoid the need for the TomToms and Nokias to try to capture and maintain this data. Ahh, but that would require yet more #opendata and would quite possibly stumble on the dreaded derived data issue the moment the data got into the hands of commercial navigation companies. Note to Mr Baker: you may want to discuss this with Mr Maude, Mr Cable and Mr Osborne prior to your summit.
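To show how simple the conflation could be in principle, here is a minimal sketch with a made-up bridge-heights dataset and route representation; real satnav data models are of course far richer:

```python
# Sketch: filter a candidate route against an open dataset of bridge heights.
# Both datasets here are invented for illustration.

bridge_heights = {            # road link id -> clearance in metres
    "A417/bridge_03": 3.8,
    "B4063/bridge_11": 4.6,
}

def route_ok(route_links, vehicle_height_m):
    """Reject a route if any link has a bridge lower than the vehicle."""
    return all(
        bridge_heights.get(link, float("inf")) > vehicle_height_m
        for link in route_links
    )

print(route_ok(["A417/bridge_03", "B4063/bridge_11"], 4.2))  # False - 3.8m bridge
print(route_ok(["B4063/bridge_11"], 4.2))                    # True
```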
The bigger challenge is how to manage the complexity of discouraging (or even restricting) drivers of cars and trucks from using routes through villages or residential areas to avoid traffic delays. Apparently Highways Authorities are going to have new powers to reclassify roads and DfT thinks this will enable them to divert traffic flows.
“[The DfT] are allowing local authorities to reclassify roads – ensuring A roads are placed where they want traffic to run and lowering the category of road in places they want traffic to avoid”
In London we are plagued by speed bumps and width restrictions, installed for safety reasons and to discourage traffic from using our residential streets as cut-throughs; if you are an ambulance or fire engine driver, or a local resident, you may think these physical measures are a mixed blessing. It will be interesting to see how they propose to distinguish between someone wanting to drive through a village or restricted area because they think it’s a time-saving route, and someone who wants to drive through that village as a tourist, possibly stopping to spend money with local businesses. I’m not sure how the satnav suppliers can be held responsible for people choosing to drive on routes that cause “misery” to local residents, or how they could implement any locally applied restrictions without preventing essential services and deliveries from making use of satnav systems.
Surely when authorities try to divert vehicles away from a route or area they are just shifting the problem to somewhere else? Won’t the people who live or drive on or near the rerouted ratrun be the new recipients of “misery”? Isn’t this a zero sum game?
I guess this will play well with rural communities but may not have much relevance to those of us living in the cities. Now what are the voting patterns of those who are suffering the “misery” of satnav misdirection?
Posted: November 28th, 2011 | Author: steven | Filed under: Crowd, Local Government, Open Data | 2 Comments »
On pothole patrol. Thanks to Amanda Slater http://www.flickr.com/photos/pikerslanefarm/
Potholes are one of those things that get everyone united in righteous indignation: “Harrumph, they should do something about this!” No one is going to suggest that potholes are desirable, but are we too eager to bash the local authorities who are tasked with spending our taxes to keep the roads in good condition?
The Telegraph and several other papers reported last week on the “plague of potholes” identified by the AA’s recent Streetwatch Survey. The report summarises the results of 1,000 informal surveys undertaken by volunteers walking their local areas and recording potholes, repaired potholes and potholes marked for repair (I’m not sure how you would identify a repaired pothole) along a 30-60 minute walk. The report includes a load of simple thematic maps which take a bit of time to understand, as the colouring of each map represents the ranking of that region for one of the survey questions. The survey found that the North East and Scotland are the “pothole plague” black spots of GB.
Edmund King provides several choice quotes in his introduction, including:
“Many councils have been swamped by the deluge of potholes, yet the evidence from the South West suggests the problem can be turned round. Although we are sympathetic with the plight that councils find themselves in austere times, the fact remains that we are seeing the legacy of a ‘Cinderella’ approach to road maintenance funding over many years”
Cllr Peter Box, Chairman of the Local Government Association’s Economy and Transport Board, responded:
“Parts of the country which have milder winters have less destruction wreaked on their roads by ice. Ranking geographical areas without taking this major factor into account displays a fundamental lack of understanding about road maintenance.”
Unfortunately the AA have not provided any detail as to how they calculated their rankings. Did they adjust their results for the number of survey participants in each area, the distances and road types they covered, the road miles in each region, highways budgets, the level of utility streetworks in the preceding years (a significant factor in potholes) or, perhaps most importantly, regional patterns of severe winter weather? We have a number of maps but none of the underlying data and methodology that would enable us to evaluate their conclusions.
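By way of illustration, here is a minimal sketch, with invented numbers, of the sort of adjustment that would make regional rankings more defensible: normalising raw pothole counts by survey effort before comparing regions.

```python
# Invented figures: raw pothole counts mean little without the survey effort behind them.
surveys = {
    # region: (potholes_recorded, miles_walked_by_volunteers)
    "North East": (500, 50),
    "South West": (800, 400),
}

for region, (potholes, miles) in surveys.items():
    print(f"{region}: {potholes} potholes raw, {potholes / miles:.1f} per mile surveyed")

# Raw counts rank the South West worse; per mile surveyed the North East is worse.
# The ranking flips once effort is accounted for - which is exactly the detail
# the AA has not published.
```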
I am not suggesting that harnessing concerned citizens to survey the state of our roads could not be a useful contribution to the discussion about road maintenance practices, policies and budgets. But this type of citizen science has the potential to be skewed and misunderstood unless there is an opportunity for other “concerned citizens” to review the data and question the conclusions drawn by the lobby group sponsoring the survey.
Sounds like we need some #OpenData from the Streetwatch Survey to mash up with some #OpenWeatherData, if the Met Office could oblige, and some road length data, which we can get from #OSOpenData(C).
Posted: November 18th, 2011 | Author: steven | Filed under: Open Data, Usability | 5 Comments »
Pinteresting, thanks to Dave77459 http://www.flickr.com/photos/dave77459/
This could become a habit and is certainly unlikely to win me many friends, but here goes anyway.
This morning the Guardian published a map of road accidents and deaths over the last ten years, produced by the clever folk at ito World, who have produced some of the most stunning visualisations of transport and OSM data that I have seen. So what do you think of this?
At first sight it is just a mass of dots which do not indicate any spatial pattern. If I zoom into my area I am presented with a complex array of symbols that indicate, for fatalities, the type of victim (cyclist, pedestrian, etc.) by colour, the age of the victim, the sex and adult/child status, and the year of the crash, plus further symbols for serious and minor injuries. Wow, that is a lot of information in one map!
I am struggling to understand any trends or patterns in the data even when zoomed in to my local area. I would like to be able to filter by year, perhaps view some trend information, perhaps filter the different categories, maybe understand whether my area is average, better than average or worse (rated against what I am not sure, but I imagine some transport benchmark exists), and even view some more info on each accident (assuming that is available in the opendata). The bottom line is that a mass of points, even when elegantly and cleverly symbolised, is not giving me any insight.
The Guardian have been great advocates for OpenData and have achieved some breakthroughs in opening up geodata; they have also been at the forefront of the new discipline of data-driven journalism. Now they need to demonstrate how OpenData can provide new insights into important issues like road safety. We need more than pointillism, or as I have said before, “Just because you can stick it on a map …”, although in this case there is certainly a lot of insight that could be derived from a more analytical product.
Posted: November 9th, 2011 | Author: steven | Filed under: Open Data, Usability, Value | 1 Comment »
Getting your flood mapping wrong can have some serious consequences. Thx for the pic to Cheltenham Borough Council (yes a UK council with a flickr page)
Usually maps that inspire, delight, offend, aggravate, mislead or seem pointless get a short mention on The Good, the Bad and the Ugly, but thanks to Mark Percival and Rollo Home, Flood Map gets a front-page spread.
I don’t know much about the techniques of flood modelling, but I have a feeling that it is a bit more complex than sucking up a free 90m-resolution data set and draping it over Google. However, you have to admire the author’s vision and ambition:
“Development of the Flood Map application is a try to help fight against the natural disaster like flood and there by a try to save as many lives as possible.”
On the disclaimer side, I think they have it well covered:
Before using this Flood Map application, please note that the application may have some or many bugs or inaccuracies because of various technical or non technical reasons. Also note that there might be some miss-alignment, so please consider +-100 meters tolerance to be on safer side.
An application that wants to help fight natural disasters and save lives warns that it may have many bugs or inaccuracies, so why bother? Could this be a task best left to experts?
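For anyone curious what “draping a free data set over Google” amounts to, here is a sketch of the naive “bathtub” approach an application like this probably relies on (made-up elevations; real flood modelling adds hydrology, flow connectivity, defences and much finer terrain):

```python
import numpy as np

# Tiny made-up elevation grid (metres) standing in for a 90m DEM tile.
elevation = np.array([
    [12.0,  9.5,  8.0],
    [10.0,  7.5,  6.0],
    [ 9.0,  6.5,  5.0],
])

water_level_m = 8.0

# "Bathtub" model: every cell below the water level floods, regardless of
# whether water could actually reach it. Hence the +/-100m-style caveats.
flooded = elevation < water_level_m
print(flooded)
```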
Posted: October 20th, 2011 | Author: steven | Filed under: Business, Derived Data, Local Government, Open Data, Ordnance Survey | Comments Off
Government is a big place, so it isn’t surprising that different departments can have conflicting agendas or not always be completely in the loop about what others are up to.
A couple of weeks ago an announcement came out of DCLG in which Eric Pickles hailed a new wave of council openness. A Code of Recommended Practice for council transparency is being published, and “ministers are minded to make it legally binding”:
“The code of practice calls on local authorities such as councils and fire and rescue services to shine a light on every part of their business, from employees’ salaries over £58,200 and details of all their contracts and tenders to details of grants to voluntary organisations, performance information and the locations of public land and building assets. It also establishes three key principles behind council transparency; timeliness, openness and mindfulness of local demand.”
I’m sure you don’t need me to point out the potential fun and games here as councils and OS discuss how open councils can be with data about their land and property holdings. No problem if you are an OS licensee under the PSMA, but not so easy for the rest of us, particularly the army of armchair auditors that Eric is relying upon to help highlight wastage in local government.
“Releasing this information to the public could provide a wealth of local knowledge and spark more improvements in the way services are delivered. Faster publication and easier access for the public and companies could open new possibilities for real-time analysis and response and opportunities for small businesses to enter new markets.”
Looks like we could have the mighty Mr Pickles and DCLG alongside Francis Maude and the Cabinet Office facing off against BIS and the new PDC.
This could be even more fun than Armando Iannucci’s hilarious film.