Posted: January 11th, 2012 | Author: steven | Filed under: Business, geo.me, Google, OSM, Value | 5 Comments »
Free, thanks to Brad Stabler http://www.flickr.com/photos/bstabler/
Over the holidays there was a flurry of excitement, particularly among OpenStreetMap fans, prompted by Ed Freyfogle’s announcement that Nestoria were switching from the Google Maps API to an OpenStreetMap tile service provided by MapQuest. The switch followed Google’s announcement in April of last year that they would be introducing volume limits on the usage of the free Maps API:
“We are also introducing transaction limits on the number of maps that may be generated per day by commercial Maps API web sites or applications. These limits are documented in the Maps API FAQ and will take effect on October 1st 2011. Commercial sites that exceed these limits may be required to purchase additional transactions or a Maps API Premier license…
…Not for profit applications and applications deemed in the public interest (as determined by Google at its discretion) are not subject to these limits.” (My italics)
Nestoria have been a long-term supporter of OSM and, as Ed says, the introduction of the new charging regime was the trigger to move away from Google:
“Having always envisioned that we would someday move to OSM, this was the nudge that pushed us over the cliff.”
The new limit is 25,000 map sessions per day averaged over a quarter, which equates to roughly 9m sessions per year. Google expect 0.35% of the sites using the free Maps API to be impacted, which I reckon is between 1,200 and 1,400 sites. Big respect to the Nestoria guys for growing their business to this level, but perhaps a note of concern for the many passionate advocates of geo (myself included) on the economics of incorporating map services in applications where revenue per page is at the low end of the spectrum.
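As a sanity check on those numbers (the 1,200–1,400 figure is my own inference from Google’s 0.35%, not anything Google have published), the back-of-envelope arithmetic works out like this:

```python
# Back-of-envelope check on Google's new free-tier limits.

daily_limit = 25_000                          # map sessions/day, averaged over a quarter
yearly_sessions = daily_limit * 365
print(f"{yearly_sessions:,} sessions/year")   # 9,125,000 -> roughly 9m

# Google say 0.35% of free Maps API sites will be affected; if that is
# 1,200-1,400 sites, the implied total number of sites using the API is:
affected_fraction = 0.0035
for affected in (1_200, 1_400):
    print(f"{affected} affected -> ~{affected / affected_fraction:,.0f} sites in total")
```

Which implies somewhere between about 340,000 and 400,000 sites on the free API in total.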
I don’t know why Google chose to introduce these limits. It could be about the costs of delivering the service (hardware, bandwidth/power or data licenses where Google is using 3rd parties) or it could be a drive by the commercial team to increase sales of Enterprise licenses. Probably a bit of both. It sounds like the commercial approach may have been a bit less than optimal, but no doubt Google will adjust their approach (and possibly their pricing) if they want to retain some of these large usage sites. Alternatively we may see the more aggressive introduction of context-sensitive advertising directly into the map, although we have heard little more about the trials of map ads in Australia.
OSM advocates have been understandably chuffed that Nestoria considered OpenStreetMap to be as good as or better than the data offered by Google and its providers; no doubt OSM has come of age as a viable royalty-free source of geodata. In my opinion the focus on open vs traditional geodata ignores the broader scope of Google’s Maps API, which offers quite a bit more than just a map tile service – geocoding, directions, Street View, local search and, perhaps most important, colossal scalability. What interests me about Nestoria’s decision is the choice of MapQuest rather than going with a self-hosted tile service based on OSM, which they explained:
“When we realized it was time for us to make the move we faced one big decision – should we use someone else’s OSM tiles or should we render and serve our own? We called in an expert to advise us. OSM expert, and former Nestoria blog interviewee, Andy Allan … Rendering has the advantage that you can make the map look exactly the way you want. When done well this can produce phenomenal results … but unfortunately it’s no small technical undertaking, especially when we’ve also got a property search engine to run.
We concluded the only viable path was for us to leave the rendering and serving to experts and use someone else’s OSM tileset… Luckily however several companies have stepped in to fill this gap – CloudMade has for several years offered an OSM tile layer for all to use. In 2010 MapQuest released a similar service. While we are longtime fans of CloudMade (we use their tiles on our Where Can I Live? service), for their global infrastructure and speed we decided we’d prefer to use MapQuest’s OSM tiles.
What’s in this for MapQuest? I imagine that the publicity and goodwill are worth quite a lot to them (but that was what many people thought was the motivation behind Google’s free API for the last 6 years). Presumably the cost and complexity that Nestoria identified are shared across a large user base for the tile service, but as uptake grows the costs of servers and bandwidth must become a factor in the economics of their free tile service. I wonder how long it will be before the bosses at AOL (ultimately the very commercially astute Arianna Huffington) start looking to monetise the usage of their infrastructure? Will it be “in map advertising”, advertising alongside the map, or is there some other model? At least if you have architected your service in the way that Nestoria have, you can quite easily switch from the MapQuest service to someone else’s if the terms change in the future.
I can imagine a scenario where a number of the 1,200-odd commercial sites switch to MapQuest or CloudMade, only to find that these guys, with much shallower pockets than Google, struggle to support their levels of usage and functional needs (routing requires quite a lot of processing, for example) on a free model. Of course they can then start to roll their own map services with something like MapBox, or build their own tile service and host it in the cloud, but then they will start to incur the direct costs plus the overheads of supporting their service. The guys from MapBox promoted their service as an alternative on twitter:
“With our new add-on packages, you can easily bump up your storage + bandwidth ds.io/A12Drf And it’s 12x cheaper than Google Maps”
When I queried this they quoted their cost per thousand extra map views at $0.32 compared to Google’s $4.00, but I think that ignores Google’s free service offering 2.25m maps per quarter before you hit the charge, and the need to have quite a lot of technical skill and understanding to set up a tile service. That said, MapBox is an interesting open source based service that would appear to offer a low cost solution to serving tiles, with the benefit that you can style those tiles to your own designs. I’d be interested to understand the economics of this kind of business model; presumably over time prices will get driven down pretty low but will never be free. Look at web hosting: prices for a small site can be as low as £2/month and you can get quite a powerful server from Amazon for about £35/month, but as far as I know no one is offering large chunks of infrastructure and bandwidth for nothing.
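To put the two price points side by side (figures as quoted above; this is marginal tile cost only and ignores MapBox’s base subscription and the engineering effort involved, so treat it as a sketch):

```python
# Marginal cost per quarter at the quoted rates (USD per 1,000 map views).
GOOGLE_PER_1000 = 4.00
MAPBOX_PER_1000 = 0.32
GOOGLE_FREE_PER_QUARTER = 2_250_000   # 25,000/day x ~90 days

def quarterly_cost(views, rate_per_1000, free_allowance=0):
    """Cost of `views` map views in a quarter at the given marginal rate."""
    billable = max(0, views - free_allowance)
    return billable / 1_000 * rate_per_1000

for views in (2_000_000, 5_000_000, 10_000_000):
    google = quarterly_cost(views, GOOGLE_PER_1000, GOOGLE_FREE_PER_QUARTER)
    mapbox = quarterly_cost(views, MAPBOX_PER_1000)
    print(f"{views:>10,} views: Google ${google:>8,.0f}  MapBox ${mapbox:>7,.0f}")
```

Below Google’s free allowance, staying with Google costs nothing on the marginal views; well above it, the quoted rates are roughly 12x apart ($4.00 vs $0.32 per thousand), which matches MapBox’s tweet.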
So is this the beginning of the end of Free? Not quite yet, but maybe it’s time to recognise that a free ride may not last for ever and perhaps we need to think a bit about what price we are willing to pay for a map service. This comment on the Nestoria blog from another business affected by the change in Google’s usage terms summed it up for me:
“In our case, that meant paying $200,000-$400,000 a year for maps…
Google got their pricing off by at least an order of magnitude. I don’t know many companies making more than a couple of dollars per thousand views. Had they charged us $0.20 per 1,000, we would have gladly paid.
In the end pricing is a trade-off between the value to the purchaser and the costs of production/delivery. Even competitive pressures cannot reduce long term costs to zero unless the providers find a different revenue model, which could bring us back to “in map adverts” – and I doubt many large commercial users of maps would want to be potentially hosting adverts for their competitors in return for free or very low cost maps.
I wonder whether within a couple of years a modest fee, say $250–500 per million maps, will become the going rate for a high availability map service, with some additional costs for directions and other features. Time will tell.
Posted: September 16th, 2011 | Author: steven | Filed under: Business, geo.me, Google, Open Source, Value | 2 Comments »
Rock and a hard place - thanks to Shemp 65 http://www.flickr.com/photos/shemp65/
On Tuesday I went to the Google Geospatial Summit at the Science Museum with the guys from geo.me who had a booth at the event.
It's a long way down there from the rear stalls, thanks to Stuart Grant of www.geo.me for the pic
The event was pretty plush with the background of the Science Museum and the auditorium was the IMax cinema which was impressive, particularly if you were sitting high up!
The main focus of the event was to launch Google Earth Builder in the UK. GEB follows on from Google Earth Enterprise to offer organisations with large volumes of imagery and vector data the opportunity to upload their data to the Google cloud and then manage secured or public access via the Maps API, the Earth desktop application or browser plugin, or the mobile versions of those tools.
At the moment the functionality seems to be limited to viewing data, but as a scalable distribution platform for corporate spatial data, delivered through highly familiar and intuitive interfaces, I can see how this would appeal to organisations with massive volumes of potential occasional users. I have been saying for a while that Google could pose a difficult challenge to the established GI tech companies because of the incursion of its free products into their market. Now Google are offering massive cloud infrastructure and scalability in a reasonably secure environment (the US National Geospatial-Intelligence Agency are apparently among the first customers). If the pricing is reasonable, GEB has to have the potential to eat into some of the lighter implementations of your favourite GIS vendor.
I was sharing some thoughts with a few folk at the event and on the twittersphere and one friend who works at a big GI company commented
“..rumours of the untimely demise of GI giants are greatly exaggerated.. ”
I agree. Buuuuut and it is a big but, Google’s push into the enterprise market must prompt some changes from the existing players – commercial models, ease of use, performance, scalability and innovation to name a few.
I was struck by the size and composition of the audience at the event. There were industry people from ESRI, Intergraph, Bing, PBBI MapInfo, Ordnance Survey and Blom, plus a load of consultants. On the customer side, there were delegates from local and central government plus a range of corporates, some of whom I remember as clients from my MapInfo days and others who we never got close to. The “Customer Success” panel (see pic above) featured Shell talking about the app that geo.me developed for them (got to get the plug in), Rightmove who have some pretty neat functionality in their app, the Space Reconnaissance Center of the UAE Armed Forces who already use GEB, and the Met talking about their public crime mapping application. Trevor Adams of the Met, in response to a question about usability, said (not verbatim)
“I can do this at home, why can’t I do it at work?”
Which for me sums up the challenge to some of the plugin dependent, GI centric applications that many of us have grown up with. Who after all uses ArcGIS or MapInfo Pro as their tool of choice at home?
I recall a conversation with the boss of a big GI tech vendor who while recognising the encroachment of new entrants into the consumer space and simple visualisation in the professional space maintained that his company’s strength was their dominance in the “heavy lifting”. I also recall several presentations from Ed Parsons in the past where he reassured the audience that they would still need traditional GI tools for the “heavy lifting”. Now “heavy lifting” may mean different things to different people but I lost count of the number of times that the Google presenters talked about “leaving the heavy lifting to Google” or “stand on our shoulders”. There is a definite change of tone from Google and based on the steely grins on the faces of some of the industry people attending, I think they recognised it.
When I got home I glanced at the feed from the FOSS4G conference. I have written quite a bit about the opportunity for Open Source recently and how adoption is growing within the public sector. Webmap servers and spatial databases are becoming commodities, the ecosystem around OSGeo is evolving and it will become increasingly difficult to make a good case for paying license fees to proprietary vendors for technology components that are robust, proven and free (and yes I know that open source is not free, neither is proprietary after you have paid the initial license fee). I haven’t used a desktop GIS for ages (fortunately) but I was massively impressed by the capabilities of QGIS which is not only free but also runs on my Mac which not much other GI software does. No way is this a replacement for MI Pro let alone ArcInfo but it certainly will satisfy users who underutilise the massive functionality of those products.
Open Source on the left and Google on the right (or vice versa depending on your politics) and you might think that our old favourites are between a rock and a hard place.
Posted: July 27th, 2011 | Author: steven | Filed under: geo.me, Location Social stuff | 1 Comment »
Thanks to agoasi http://www.flickr.com/photos/bartholl/
In his recent post on geotags in twitter Thierry Gregorius said
“If you use Twitter then you will have noticed that the service encourages people to geocode their tweets, that is, to record their physical location at the moment of tweeting. What particular purpose this may serve is another point altogether, but let’s not get into that.”
Well I do want to “get into that”
With so-called smartphones it is easy to add an x,y to almost everything: photos, tweets, checkins, blogposts. For many of us it is easier to leave geotagging on rather than dive into the deep settings to switch it on/off selectively – warning: extensive use of GPS may run down your battery.
I have been wondering: what is the point of geotagging tweets? Some of you may have seen some early thoughts on this from last year’s W3G Conference/Unconference and, nearly a year on, I have yet to find any convincing uses of geotagged tweets.
Several academics have looked into the topic and, to the best of my knowledge, no one has come up with any meaningful relationships between the content of a tweet and the location from which it is tweeted. It seems to me that the best we can get from geotagged tweets is the locations and times of activity of people with mobile devices who use twitter, which probably corresponds to a digi/social-media/techy demographic and is largely urban.
This demo from geo.me lets you search for any hashtag or term on twitter (select tweet mapping), and this one from the last election shows political boundaries so you can see if you can find any political trends in the tweets that your search returns. Maybe someone will spot a significant correlation.
My friends at GeoIQ have done some pretty awesome stuff using a sentiment analysis engine on the twitter feed from this year’s Oscars to visualise the content, see Sean Gorman’s blog post and the visualisation. Sean acknowledges some of the limitations of the analysis
A second challenge with location based sentiment analysis is how meaningful are the results. I think one of the things we miss are margin of error calculations for sentiment analysis. Once we’ve aggregated data we have a sample size for that geography that we can calculate a margin of error against.
This is the best that I have seen so far, but does it really provide much in the way of insight? I am not sure.
Anyway, for the moment the “locate my tweets” feature is switched off; it’s enough that I bombard you with my opinions without sharing where they originated. Just in case you had forgotten, I also still have some reservations about the privacy of broadcasting my location – show me my gain and I might change that view though.
Posted: July 1st, 2011 | Author: steven | Filed under: Crowd, geo.me, Google, OSM | 1 Comment »
A couple of weeks back I was invited to talk to the graduate class at the Arup University. Arup offer a great study program for their staff that leads to an MSc in GIS; this was the final part of the program, when the students come from around the world for a final week of workshops and talks. I thought I would be talking to about 10–15 students; in the event Ewan had publicised the talk within Arup and there were about 70 people in the room, plus 4 offices videoconferenced in from around the world. These were GI techy folk – in the language that we are not using any more, this was deep paleo territory – so it was good that I had prepared a bullet-free slide deck with some video to snazz it up.
3 years ago I was on their side of the fence, and rewatching this presentation made me think about how different life would have been if I had not had the resources and opportunity to leave MapInfo when I did. Most of the stuff I talked about will be well known by many readers, but 3 years ago it either didn’t exist or was barely heard of; certainly the majority of people working in a mainstream GI company were not thinking about this stuff and did not see it as impinging on their business or careers – I bet they do now.
The guys at Arup had really ace AV for my talk, so if you are interested, sit back and enjoy the show (the Q&A at the end wasn’t miked up very well but you can work out the questions from my answers). There are bigups in here for OpenStreetMap, itoWorld, Ushahidi, the Copenhagen Bicycle, Waze, Google (of course) and a littleup for geo.me
Thinking outside of the bounding box from steven feldman on Vimeo.
Isn’t Vimeo wonderful? It doesn’t even blink at a 45 minute video, while YouTube lets you upload the whole lot before telling you it is too long! #bloodyfail
Thanks to Ewan Peters at Arup for inviting me.
Posted: April 15th, 2011 | Author: steven | Filed under: geo.me, Google, Local Government, Open Data, Ordnance Survey | 1 Comment »
I could have inserted an "F" word here. Thanks to Cudmore http://www.flickr.com/photos/cudmore/
Last year Geo.me were in discussions with a potential partner in Local Government and the dreaded derived data question came up, something along the lines of “we’d love to work with you but …”
There still seems to be quite a lot of confusion about what data Local Government (or for that matter any public sector user of Ordnance Survey base maps) can or cannot publish using the Google Maps API. I thought that I would try to produce a simple outline of what people could and couldn’t do. Given that OS had published some new guidelines on derived data I foolishly thought it would be a fairly simple task.
After several iterations (shared with OS and Google) and some helpful input from the licensing and legal folk at OS I finally handed over a finished version to Geo.me who have published it here. I must stress that this is my view of how PSMA members can publish data using Google Maps, OS have not officially sanctioned this view although I don’t believe that they materially disagree with it (watch the comments fill up on this post). Hopefully my version is a little easier to read and understand than this FAQ on the PSMA web site.
It is disappointing that we still cannot freely publish all public sector corporate geodata on top of Google Maps as part of routine business activity in the public sector. It costs the public sector and taxpayers dearly in terms of usability, software license fees and infrastructure costs. Why does this continue to be a problem (particularly as OS are keen to point out that publishing on Bing Maps is OK – shame the API is less popular than Google’s and offers less usage for free)? Having spent over 3 months of wrangling, discussing etc, it seems to me that there is a lot of lawyer facing-off going on here and somehow common sense is being suspended. I really don’t think Google wants to appropriate any OS IPR, but OS lawyers remain stressed about their interpretation of Google’s T&Cs.
In their FAQ OS say
We have sought official clarification from Google on these points, and suggested alternative drafting that would resolve the issue from our perspective whilst, we hope, satisfying Google’s need to develop their service unencumbered for the benefit of their users. We understand that these proposals are receiving active consideration from their lawyers and we are hopeful that our recent positive engagement and experience with Google will result in mutually agreeable terms being adopted.
My response would be “Please get a move on”. There is business to be done, taxpayers’ money to be saved and better public-facing mapping experiences, all waiting on a full resolution.
In the meantime there are loads of local government data sets that can be published using either the free or Premier Maps APIs. Far be it from me to say to anyone interested in this topic “JFDI!”
Geo.me and Google are running an event for Local Government in June; you can register here. I imagine I will have a word or two to say on the subject.
Posted: May 20th, 2010 | Author: steven | Filed under: geo.me, Google, Open Data, OSM | Tags: geo.me | 5 Comments »
Who says you cannot mix football and geo? Certainly not Alun Jones and the team at GeoInformation Group. Today the guys behind UKMap hosted a day of geostuff at the Emirates – yes, that’s right, geobabble at the home of my beloved Arsenal. How good is that? I will tell you – it was exceedingly good, Mr Kipling.
It was very strange to come out of the Arsenal tube station (others please note: we are the only football club to have a station named after them – there is something about “place” in that, but not for today) at 8.30 in the morning and walk up to the ground with no one else around. Usually there are at least 50,000 others there.
There were nearly 200 people in attendance for a day of presentations from the varied sponsors (look at the GI website for more details). Big respect to GI Group for putting these events on all round the country for free and also to the sponsors/presenters for not going down the yawny sales push route. OK polite stuff over ……… now the georant!
This was a conference of largely public sector (aka Local Govt) users listening and long term industry people presenting. Just about every presenter had to get in several references to Google, at least one mention of Bing, a slide with the OpenStreetMap logo and a map screenshot, a picture of an iPhone, a mention of the power of the crowd (or was it the cloud?), a bit of StreetView and liberal use of the words open and free. I am sure you get my drift. Now you might think that is a good thing, and so would I, but I got the sense that most presenters were in the “oh shit, look what’s coming!” mode rather than having any ideas about how they were going to embrace these changes and create new value for their clients (or are they soon to be ex-clients?). Without being pejorative – oh heck, why not stir it up a bit – you don’t become cool just by quoting the names of the companies that could be eating your lunch in the next few years. Hence my reference to being blinded by the light (startled rabbits and headlights, in case you are still not with me).
(c) Michael J Hunter http://www.michaeljhunter.com/
Now you might think that was an uncharacteristically acid rant from me. Truth is I was pissed off. There I was to do my AGI Foresight summary – GI in 2015. I had savagely edited a 40 minute talk down to 20 minutes, and long before my slot came up, just before the end, every point that I had wanted to make and all of the opens, frees, GYMs etc had been flogged to death, with no chance to reshape the deck.
On the positive side my message that there is a perfect storm blowing through geo brought about by the conjunction of social, economic and technological change and that geobusinesses and geoprofessionals need to adapt or die had been well and truly hammered home.
So my summary of the challenges facing UK Geo was:
“So what are the key challenges for the UK GeoCommunity? The foresight report offers quite a long list, these are my personal choices:
- Discarding the location-specific baggage and entering the mainstream
- Building a skill set that enables us to provide context and understanding as geo goes more and more mainstream, we will need people who can answer the questions “so what does that mean then?” or “where in my business process can I improve outcomes through the application of location and how?”
- Finding ways of ensuring consumers can manage and understand the issues surrounding location privacy
- Finding business models that respond to the challenge of free
- New entrants will seize the opportunity to fuse the geoweb and social media – established incumbents will need to react or die;
- Developing the role of location information in socially significant applications, such as participatory democracy, mega city planning as well as consumer applications;
- And last but not least, finding ways to communicate with the 50% of end users who don’t understand maps”
And you can read through the whole slide deck and notes here
A great event GeoInformation, thanks very much for inviting me to speak. Got to catch up with a load of old friends and managed to show geo.me to quite a few people with very encouraging responses. Feels as if geo.me may be a right time and right place adventure, but more of that later.
The back channel was fairly quiet at this event – not a lot of geotwitterati in attendance – but there are a few particularly good tweets from Martin Daly, which you can scan through here.
Posted: May 4th, 2010 | Author: steven | Filed under: geo.me, Google, Ordnance Survey, OSM | Tags: geo.me | Comments Off
Over the last two years I have become convinced that there is a big opportunity to build a professional quality business on top of the APIs provided by Google, Bing and of course OpenStreetMap, to serve mainstream clients in public sector and commercial markets who want more than a quick mashup. Sort of neo stuff for some more traditional clients.
So amidst all of the networking, consulting, conferences and other activities I have been on the lookout for a geostartup that had a solid technical capability and a realistic business model. Yes, you can probably guess – I believe I have found just such an opportunity. I have joined geo.me Solutions as their chairman and an investor.
geo.me have a very neat platform that makes it simple to publish clients’ content on any of the major mapping services and leverage the unique capabilities of each service; I have dubbed it a geoContent Management System (we’ll have to see whether geoCMS catches on). They offer some very elegant interfaces and ways of visualising data, which are particularly interesting as all of this free and open data becomes available and the challenges of “derived data” diminish (hopefully).
It’s fun to be working with a geostartup and I cannot think of a better time to be launching a cloud based geo-business – exciting times ahead.
As a taster have a look at this UK General Election Map, which shows geotagged tweets with the hashtag #ge2010 or any of the other trending election tags which are near to you; it also has the latest OS OpenData constituency boundaries and some feeds from the Guardian’s Politics API. Not finished yet and probably never will be, but it will be interesting to see whether the twitterati are in any way indicative of the broader electorate as the results start to come in on Thursday night.