Free data a GFC casualty

The US government has been a proponent of free data for quite a while now, and over the years it has established a number of national programs to allow easy access to vast resources of public information. However, the annual budgets for e-government initiatives were slashed by 75% last month, putting in question the survival of programs like data.gov (the repository for publicly available data that was promised as a platform to power software and analysis created by and for the public). Comments from federal CIO Vivek Kundra indicate that data.gov will not be shut down but “…there will be no enhancements or other development to address needs for improvement”. So, although the policy of free data remains unchanged, the significant cost of delivering that policy may be its ultimate “undoing”.

Meanwhile, in Australia, progress towards opening up government data vaults has taken another step forward. Earlier this week Australia’s Information Commissioner, John McMillan, unveiled eight new principles for Federal agencies to adhere to when considering the publication of government data. These principles are:

  • Open access to information – a default position
  • Engaging the community
  • Effective information governance
  • Robust information asset management
  • Discoverable and useable information
  • Clear reuse rights
  • Appropriate charging for access [So, not entirely free access!]
  • Transparent enquiry and complaints processes

The Principles are not binding on agencies, and operate alongside legal requirements about information management that are spelt out in the FOI Act, Privacy Act 1988, Archives Act 1983 and other legislation and the general law.

Despite the launch of the data.gov.au portal, there is no federal program in Australia to facilitate access to public data on a large scale (i.e. in the US style), and the onus so far is on individual agencies to manage the dissemination of public information in their possession. State and Territory governments are pursuing their own initiatives. This “piecemeal approach”, although slower to implement, may prove to be a more sustainable model for enabling access to public data, considering the vulnerability of large-scale initiatives to the budgetary pressures of the government of the day in these uncertain times.

Speaking of OpenStreetMap

Last month, it was announced that OpenStreetMap would be getting its hands on Microsoft’s aerial imagery. (One way to make maps in OSM is to draw on top of aerial imagery. Yahoo’s imagery has been made available for that purpose, but it is incomplete and a bit dated.) The new Bing imagery is now available through Potlatch 2 — sooner than I expected. And Potlatch 2 is now available through the regular OpenStreetMap site: you have to hover over the “Edit” tab to select it from a pull-down menu.

I’ve spent the last two days playing with both — i.e., creating maps with Bing aerial imagery using Potlatch 2. While there are places where the Microsoft imagery does not offer any advantage over Yahoo’s, there are lots of places where Bing’s imagery is more recent and at higher resolution. And lots of places now have imagery worth tracing over that did not before. Locations that were basically unmapped in OSM now have one less excuse. (To be sure, many places with good imagery had no maps either, but that’s a different issue — volunteers, government data imports, whatever.)

Potlatch 2 is noticeably improved over the public alpha. There are definitely some areas in which it’s an order of magnitude better than the original Potlatch, but there are still some gaps — some tags I use frequently are now much harder to find, and line directions are invisible, so far as I can tell, on anything other than one-way streets (think rivers). I don’t think it’s any less stable, so on balance I prefer using it to its predecessor.