View Full Version : Ordnance Survey releases map data free

04-01-2010, 07:09 AM
You really would think this was an April Fools' Day joke, but apparently it really isn't.

Ordnance Survey is the UK mapping agency, originally part of the Ministry of Defence, which later went commercial, providing maps of the UK to unprecedented detail and accuracy. They have been under pressure for many years to release their data for free, as the US government agencies do, being paid for by the taxpayer. So far we subsidize this company, they get Crown copyright on all the maps, and then we have to buy that data again when we want it. But today that's changing.


Though the servers are down today due to excessive load, maintenance, and the fact that they are probably serving this from a ZX81 running Apache in BASIC... still, it looks like when the heat is off this could be incredible. Since Google can do it, I think their data is a lot less valuable now. There has also been a UK-wide GPS users' group collecting maps of every road, to the point that it is quite filled out. Oh, and our government has about six weeks left in office, and there is no point in giving the next one any revenue streams...

Postcodes are also going to be released for free, which is hardly surprising since a) people have been hammering on about them for years, as it's a business-critical database, and b) they managed to accidentally release all the code-to-coordinate mappings a few months back in any case, and they have been doing the torrents ever since.

So this might break open a huge quantity of dev work based on that data and is a really big deal.

04-01-2010, 03:04 PM
That's really really good news. Thanks for the heads up. Also, I'm guessing the military background is where the Ordnance in the name comes from? I'd never thought about that really. Interesting.

04-01-2010, 03:17 PM
Oh my bloody word! Redrobes....thank you for the info and have some rep....yum yum yum yum yum!

04-01-2010, 07:57 PM
I noticed that within a minuscule amount of time I can see Google links to mirror sites, but I won't believe all of this until I have some download on my disk. It still seems too good to be true. We're talking about the gov here, after all...

Ta for the rep, but I am just the messenger really. Whether we can do something useful with the data remains to be seen. I have previously written an app to extract contour lines from maps and try to turn them into DEMs, with limited success. It's obvious that the maps were printed with random noise on them to deter this exact purpose. Look at the MUCH improved print quality of maps from the 1970s compared with modern ones... not exactly in line with printing technology! I saw a web site once where a PhD guy was extolling his thesis and programs to do this with some images which had most convenient contour lines in them. I mailed him, asked a few pertinent questions, and got a slightly more honest account of its ability. Basically, in printed form it's very hard indeed to get a 3D DEM from arbitrary 1:50K or 1:25K images scanned off a map. But if they are handing the data out free, then maybe there's no reason to do any "selective availability" type stuff on the data any more... maybe. Well, we will see shortly, I guess.

Still seems too good to be true...

04-01-2010, 08:39 PM
Ahh, it seems the waters are indeed muddied... To be honest I don't think I understand the full implications of this yet, but AFAICT it looks like it's being opened up somewhat, but in some limited and unspecified licensing manner. Which, if you ask me, is pretty much par for the course with our gov. Anyway, you read it and see what you think - then post :)


Sounds like postcodes are not on the cards though, which gets a "meh" out of me, because Google can do them. It's being able to host or post OS maps that I want to think about. That, and deriving data from them to use and post. Personally I agree with the sentiment that there's more to be gained from tax on new products than the current 20m profits at the mo.

09-24-2013, 08:43 AM
I know this is a bit of a thread resurrection, but I had a link from some place saying that they got some data from the Ordnance Survey OpenData pages. Well, I've just been on there...

OS OpenData products | Business and government | Ordnance Survey (http://www.ordnancesurvey.co.uk/business-and-government/products/opendata-products.html)

...and it's most definitely live now. I think it might have been for some time, but I can say this: I have downloaded the whole UK 50m DEM in ASCII, so the data is open and available now. I am going to spend an hour or two writing the decoder for it and will post a pic or two later on.

But WOW - even after the download zip came through, I still didn't think it really was the right data, but it definitely is the whole UK at 50m spacing. Though you need to abide by the license, that license is quite open, including commercial use. It's a lot like CC with share-alike and attribution. Amazing work by the OS team to release this.

Edit: Oh yeah, I recall where I saw it - you can get the whole UK in Minecraft now. I thought for a while they might have had some kind of private deal to use the data, so I was pretty astounded when I realized it was open to all.

09-24-2013, 10:06 AM
Confirmed that it's the real deal. A DEM of the whole UK at 50m. Fantastic - I have been waiting for this for a long time!

I'd add pics, but the site still won't let me.

09-24-2013, 10:49 AM
Amazing, Thanks Redrobes!

This data should work with Fractal Terrains. I'll give it a go!

Ack it's in Shape, GML or ASCII Grid...no DEM file :(

09-24-2013, 11:52 AM
It's in ArcASCII, which is a well-established standard format. What formats do you have the capability of reading?

Edit: These are some pics of the data (top of Wales) which I need to image-host, so they may not be here in the future...



09-25-2013, 10:09 AM
I'm using Fractal Terrains Pro which can read .bin files only (it appears) - maybe Waldronate can give further details? Those pictures look fab.

09-25-2013, 10:18 AM
Ah OK, well .bin is a somewhat arbitrary binary file extension, so we need to know what binary format it can take. I was sure Wilbur or something could read Arc ASCII. I can generate .BT, .HF2, a RAW (which is also pretty arbitrary), or just do a greyscale, which loses a lot of info.
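To illustrate why a greyscale export "loses a lot of info": the heights get quantised to 256 grey levels, so across the UK's roughly 1344m of relief each grey step is over 5m, and nearby heights collapse to the same value. A tiny numpy sketch with illustrative numbers (the heights and range here are my own, not from the OS data):

```python
import numpy as np

heights = np.array([0.0, 2.75, 3.1, 1344.5])   # metres, illustrative samples
lo, hi = 0.0, 1344.5                           # assumed elevation range

# Quantise to 8-bit greyscale and back again.
grey = np.round((heights - lo) / (hi - lo) * 255).astype(np.uint8)
restored = grey / 255 * (hi - lo) + lo

step = (hi - lo) / 255                         # metres per grey level (~5.3m)
print(grey, restored, step)                    # 2.75m and 3.1m land on the same level
```

So anything that cares about sub-5m detail (like the below-sea-level fens) is gone after a round trip through greyscale, which is why formats like .BT or .HF2 that keep real height values are preferable.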

I have been coding up the script to run against all the tiles they supply, then render out the height maps, then shade them using the MEDem procedural shader to get a view of the whole UK. Here is a pic or two - click for improved size. There are some irregularities near sea level, and this has no bathymetry data in it, so it's hard to shade against that. It seems Norfolk is below sea level in this data - I was not sure that was the case, but maybe it is. It may also be that sea level is not a constant at 0 and it's using the Newlyn datum, which may mean that the sea is higher to the right of Britain - not sure at this point...

http://www.viewing.ltd.uk/Temp/CG/OSDEM/ColShade1_mid_TN.png (http://www.viewing.ltd.uk/Temp/CG/OSDEM/ColShade1_mid.png)

09-25-2013, 10:42 AM
That looks great! I can import .BIL files from the USGS into FT Pro. I've just downloaded the ASCII format one, but that looks like loads of separate folders containing zip files, which I don't think FT Pro can read, in my limited experience.

09-25-2013, 03:11 PM
Hi, I don't really understand all of this, but I would love to use some of this data to generate landscapes for Arden. Is this possible with Wilbur? (I just started learning it last weekend.)

This all is looking fantastic.

09-25-2013, 06:48 PM
I had one zip containing loads more zips under a data folder. The other zips have the normal OS two-letter code like SH etc., which are the 100km square patches that cover the UK (see https://en.wikipedia.org/wiki/Ordnance_Survey_National_Grid). So you can use 7-Zip to extract all the .asc files in one go with "7z e *.zip *.asc", and it runs through them all, extracting every .asc file. Then you have something like 25 x 10 x 10 .asc files of the uncompressed data in 10km x 10km tiles. Doing a quick google, it seems QGIS can open ArcASCII files OK. Maybe it will export to something FT can read. Each .asc file also contains its X & Y origin in the header, so maybe you can open many of them and build up a seamless expanse of DEM data.
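The ArcASCII tiles above are plain text: a short header (ncols, nrows, xllcorner, yllcorner, cellsize, and optionally nodata_value) followed by rows of height samples running north to south. A minimal numpy sketch of a reader (my own illustration, not the thread's actual decoder):

```python
import numpy as np

def read_asc(path):
    """Parse an Esri/Arc ASCII grid: key/value header lines, then height rows."""
    header, rows = {}, []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0][0].isalpha():        # header, e.g. "ncols 200"
                header[parts[0].lower()] = float(parts[1])
            elif parts:                                # data row, north to south
                rows.append([float(v) for v in parts])
    grid = np.array(rows)
    nodata = header.get("nodata_value")
    if nodata is not None:
        grid[grid == nodata] = np.nan                  # mask missing cells
    return header, grid
```

With xllcorner/yllcorner from each header you can place every 10km tile into one big array to build that seamless expanse.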

Can you use these DEMs with other maps like Arden - sure, we use various DEM height data for MEDem mountains, since it's hard to get any app to generate properly realistic mountains. Apps can do a semi-good job, but up close in Outerra they look terrible. There are loads of DEM sources you can use. Most people use the SRTM from NASA, which is public domain and covers almost the whole globe. Then you can pick out the Himalayas etc. for good mountains. How you shade and colour your maps based on the DEMs is up to you, though. So search on this site for SRTM and there will be lists of sources to get at them - I think for fantasy maps these are better. But if you're specifically after accurate UK DEM data, then up to this point it has been unavailable at an accuracy much better than the SRTM coverage of the UK.

09-25-2013, 07:03 PM
Hi Redrobes, I am completely new to this and barely understand what you are telling me. I don't even know where to start. I saw some Middle-earth renders a while ago, but, well, I need a starting point from which I can begin to understand this more.

09-25-2013, 07:22 PM
Monks is a better person to ask, as I run a lot of custom apps; though the main make run of MeDem for Outerra is generated on my apps, Monks models the whole place using off-the-shelf stuff. I would definitely look at Wilbur though, since a) it's free and b) Waldronate knows everything there is to know about it. I only use the HF2 format, since it was custom-designed by a group of us to handle large extended height fields, but Wilbur loads many types of formats and has a user interface, whereas mine are all command-line script type stuff, all batched up and put into makefiles - it drives Monks crazy that I don't have a UI. My stuff is so hard to use that occasionally I have to get the source code out and run it in debug mode just to see what the hell is going on and why it's not outputting the right stuff. So yeah, Wilbur & QGIS are good starts. You can find greyscale height maps of lots of stuff, and you can download my extremely basic height map viewer from my web site - see sig or here (http://www.viewing.ltd.uk/cgi-bin/viewingdale.pl?category=dragons_flight). I used that for the pics in post #9 above.

In a way, we're all trying to work out the best way to do 3D terrain modelling, since nobody has yet come up with a cast-iron, foolproof way to generate terrain realistic enough to fool the trained eye. So on MeDem, Monks mainly uses Global Mapper, but also Wilbur and loads of other tools too. We restrict ourselves to wanting correct hydrology too, which is also a tough call. We both think the right way to make such an app is to use drainage basins and catchment areas and sculpt whilst the water is flowing, but we need a bit of a jump in compute power - even with the GPU it's still a bit off that at the mo.

09-26-2013, 07:39 PM
Just done another with procedural veg calculation. This is about as far as I can take it with this data.

http://www.viewing.ltd.uk/temp/cg/osdem/ColShade2_TN.png (http://www.viewing.ltd.uk/temp/cg/osdem/colshade2_mid.png)

09-26-2013, 11:03 PM
You can also push the sets through GDAL to generate an HF2 file that Wilbur should be able to read. Wilbur and FT can both handle arbitrary uncompressed binary rasters (that is, rectangular blocks of byte/int16/int32/IEEE-754 float samples in big-endian/little-endian order). The 32-bit FT version is pretty much limited to about 1GB or smaller data files, though.
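A sketch of reading that kind of arbitrary uncompressed binary raster with numpy. The catch with raw files is that nothing in the file tells you the dimensions, sample type, or byte order, so you must supply them yourself (the dtype strings and dimensions below are illustrative):

```python
import numpy as np

def read_raw_raster(path, ncols, nrows, dtype=">i2"):
    """Read a headerless raster: dtype examples are '>i2' for big-endian
    int16 and '<f4' for little-endian IEEE-754 float32."""
    data = np.fromfile(path, dtype=dtype)
    if data.size != ncols * nrows:
        raise ValueError("file size does not match ncols x nrows")
    return data.reshape(nrows, ncols).astype(float)
```

If the result looks like noise or mirrored stripes, the usual suspects are the wrong byte order or swapped row/column counts.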

Looking at the data, it's pretty obvious that there are large sections that were made from digitized contours (unless the landscape truly is composed of facets a few hundred to a few thousand meters across). It looks a lot like the USGS United States data from the 1970s, which may not be surprising as the data set is from roughly the same era as indicated on the web site. I'm wondering if it really offers much beyond the SRTM 90m data set for the same area except for a historical perspective.

The easiest way to check for faceting in data is to set the light to 90 degrees elevation and set a large vertical exaggeration (100 or more) before doing hillshading. The artificial areas and lakes will usually stand out as much brighter than the areas with the noise usually associated with electronic altitude gathering. As a bit of amusement, a USGS DEM of an area near my own locale was constructed from two different years of electronic data. One of those years was a wet year and one was a drought year. The largest reservoir in the area was split across the two data sets, making a 20-foot slope in the middle of the lake where the automated patching happened.
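The facet check described above can be sketched in a few lines of numpy: with the light directly overhead, brightness is just the cosine of the local slope, so a planar facet renders as one uniform tone while noisy sensor data stays speckled. Synthetic data here, not the OS tiles:

```python
import numpy as np

def overhead_hillshade(dem, cellsize=50.0, z_exag=100.0):
    """Brightness = cos(slope) with the light at 90 degrees elevation,
    after applying a large vertical exaggeration."""
    dzdy, dzdx = np.gradient(dem * z_exag, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    return np.cos(slope)        # 1.0 = flat, falls toward 0 as slope steepens

# A tilted plane (a "facet") shades to one constant brightness everywhere;
# adding noise makes the brightness vary cell to cell.
y, x = np.mgrid[0:50, 0:50]
facet = 0.2 * x
noisy = facet + np.random.normal(0, 0.5, facet.shape)
print(overhead_hillshade(facet).std(), overhead_hillshade(noisy).std())
```

The exaggeration matters because at true scale the cosine barely changes over gentle slopes; multiplying heights by 100 or more spreads those small slope differences across the visible brightness range.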

09-27-2013, 07:04 AM
If Wilbur can read HF2 then thats cool, I could convert some data for Ravs.

I am not seeing any faceting in this data, though. I can't understand where you get the idea this is from a vector source. I think we're looking at different data, then. Mine all looks a lot like my pics in #9. The data is at 50m spacing, so it's nearly twice as good as the 90m USGS, and since SRTM was 1km, it's 20x as good as that. This is the best whole-UK DEM source I know of, though I am sure there is localized, much higher-res LIDAR data around somewhere.

09-27-2013, 09:55 AM
Ah Brilliant, thanks Joe and Redrobes!



09-27-2013, 05:51 PM
It's possible that we're looking at different data sets. I downloaded the "OS Terrain 50" "ASCII Grid and GML (Grid)" data from https://www.ordnancesurvey.co.uk/opendatadownload/products.html and (approximately) followed Getting started with OS Terrain 50 elevation data (http://www.landscape-laboratory.org/2013/06/19/getting-started-with-os-terrain-50-elevation-data/) to convert the data to a single HF2 file. The images below are from that data set (you'll need to view at full res to see the effects because they are usually ten pixels or so across and the thumbnails have been downsampled during post creation).

The first image is an example of facets in the data, usually caused by interpolation of vectors or overly sparse data samples.

The second shows some smooth areas and facets, along with the sort of noise that I would expect from electronic collection, just offshore in the south. Areas with rapidly-changing surfaces with respect to a radar or lidar will get this very high-frequency noise, because the detected maximum shifts from pulse to pulse in the sensor. Trees, for example, may give their maximum return from the upper canopy, mid-canopy, or even the ground, depending on the geometry and type of sensor.

The third shows a smooth set of mountains in between two other sets. It's quite possible that the terrain really does look like this, but I'm a little suspicious about radically different characteristics in a small span.

The fourth is the same as the previous terrain set, but shaded according to the facing angle of the terrain without regard to slope. Higher-frequency elements will show up as more broken areas in this map. The mountains in the center are very smooth, which is quite possible if they are nice rounded terrain in the midst of more broken terrain.

SRTM data is available at 1km and 90m worldwide (30m in the US, to help foreign powers with flying their terrain-following cruise missiles right to high-value targets). GLCF: Shuttle Radar Topography Mission (http://glcf.umd.edu/data/srtm/) has the basic description of the SRTM data. The original sets had holes and didn't like steep mountains or flat waterways, but they have been reprocessed a number of times in the last 13 years and merged with other high-quality data sets to give much cleaner results.

Amusingly enough, OS Terrain 50 | Business and government | Ordnance Survey (http://www.ordnancesurvey.co.uk/business-and-government/products/terrain-50.html) describes the data as "Vector".

Please note that I'm not saying the data has no value. I'm just not sure what it offers over the SRTM product and derivatives if the data quality isn't validated. The map is no doubt 50-meter quality in places, but I can't tell by looking at the data which areas are truly 50m and which are lower resolution.

The 1970s to 1980s comment turns out to be an error on my part. I must have twitched and scrolled down half a page, because the Land-Form PANORAMA(R) entry right below the OS Terrain 50 data has that notation.

09-27-2013, 06:08 PM
As a silly side rant, SRTM data was collected in such a manner that the entire world from about 60N to 54S should be reproducible at (at least) 1 arc-second (about 30 meter) resolution. Global Elevation Data (http://vterrain.org/Elevation/SRTM/) has a discussion of the beastie. It's purely politics limiting distribution of data that US taxpayers have already paid to gather. I recall discussions from around the time of collection saying that it was at the request of "not the US" that distribution of the higher-resolution data is limited.

09-28-2013, 06:38 AM
Yes, I see them in your pictures. I have been scanning around Britain, and I do see these kinds of effects in very flat areas like The Wash or the lowlands of East Anglia, especially where the land enters the sea, but on the whole I rarely see these polygonal facets. I don't see any in the hilly or mountainous regions at all. I reckon it's quite probable that they were interpolated from contour lines, so that where it's flat you have sparse sampling.

I converted mine to a single HF2 file per coordinate square, so I have 55 patches of 2000x2000. I'm still confused by the below-sea-level area in East Anglia too. Using a link here (http://www.osola.org.uk/elevations/index.htm), it says you can use the OS elevation data, and mouse clicks report 0m or 2m. But if I open the .asc file (specifically TL59.asc) I get this sort of thing: "-1.4 -1.4 -1.4 -1.3 -1.3 -1.2 -1.1 -1.1 -1.1 -1 -1 -0.9 -0.9 -0.8 -0.8 -0.7" etc., so it's in the original data for sure. I can only think there needs to be some kind of offset, but I can't think why.

09-28-2013, 07:41 AM
I have been searching for why the fen land seems to be below sea level and then chanced upon this:
Places in England below sea level - A natural history of Britain (http://iberianature.com/britainnature/tag/places-in-england-below-sea-level/)

So it's true - Peterborough is about 2.75m below sea level, which means my map is about accurate then.

As part of the searching, I came across the Ordnance Survey user guide to the 50m terrain data, and it says that the data has been rasterized from vector data, and that the vector data is not a contour or grid representation but a TIN format, so it has triangles of spot heights that may run along power lines and so on. So it's like contour data, but not quite the same. That clears that up. Why the hell it's so bad near the sea is a mystery, though I guess with the variable tide, sea level readings, erosion and so on, it's hard to get an accurate absolute level. The guide also mentions the problem I discovered in the data, where there are mismatches between tiles in the mean sea level, such that the sea across a whole tile is at one fixed height, but two tiles don't share the same value. I had to put some special code into my converter to handle that. It's also true that while my land height is accurate relative to mean sea level, the coastal shape is not accurate compared to the real sea because of this offset, since the sea shader is always a fixed height (1.5m above the Newlyn datum) in mine. So that clears a few things up for me. I think I can leave this alone for now, until I need to use it for something.
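One way the per-tile sea-level mismatch could be handled is to detect each tile's flat "sea" region (the dominant constant value) and pin it to the datum. This is my own toy illustration of that kind of fix-up, not the converter's actual code:

```python
import numpy as np

def snap_sea_level(tile, min_fraction=0.25):
    """If a single value covers >= min_fraction of the tile, treat it as
    that tile's flat sea and pin those cells to 0, leaving land untouched."""
    values, counts = np.unique(tile, return_counts=True)
    i = counts.argmax()
    out = tile.copy()
    if counts[i] >= min_fraction * tile.size:
        out[out == values[i]] = 0.0      # pin the sea cells to the datum
    return out

# Two neighbouring tiles whose seas sit at different constants:
tile_a = np.full((4, 4), -0.7); tile_a[0, 0] = 5.0   # sea at -0.7m
tile_b = np.full((4, 4),  0.3); tile_b[0, 0] = 5.0   # neighbour's sea at +0.3m
print(snap_sea_level(tile_a)[1, 1], snap_sea_level(tile_b)[1, 1])  # both seas at 0.0
```

The min_fraction threshold is a guess at "mostly sea"; a fully inland tile has no dominant constant value, so it passes through unchanged.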

09-28-2013, 08:23 AM
Just for a quick look back 200 years: A Map of Great Britain's Ordnance Survey from 1809 (Composite of Devon Sheets 20 to 27 and title) - David Rumsey Historical Map Collection (http://www.davidrumsey.com/luna/servlet/detail/RUMSEY~8~1~254826~5519507:-Composite-of--Devon-Sheets-20-to-2?sort=Pub_List_No_InitialSort)