Oct 22 2014

In March 2014 Ordnance Survey (OS) published an alpha release of the much anticipated Building Height Attribute (BHA) dataset, which is an enhancement to OS MasterMap Topography Layer. You can read all about it in their blog post. In this blog we’re going to show you how to integrate the BHA dataset with buildings in the OS MasterMap Topography Layer to create a heighted buildings dataset and visualise it in 3D. We used ArcGIS 10.2 and ArcGlobe to do this but other software could be used.

The first alpha release of BHA included buildings covering approximately 8,000km2 of the country. A second alpha release of BHA was published in July 2014 which covers around 10,000km2 of the major towns and cities in Great Britain. OS publish an interactive map which shows the extents of the areas covered by the alpha release, so you can check if your area of interest is included.

A note of caution: this is an alpha release of the data, and OS do not guarantee that BHA is error free or accurate. Additionally, the dataset is not subject to update and maintenance. However, in time OS intend to include BHA in OS MasterMap Topography Layer, so in future it will be supplied and maintained as part of the Topography Layer.

Attributes supplied by OS

BHA attributes

A number of attributes are provided for each building, as shown in the image:

  • ground level [AbsHMin]
  • the base of the roof [AbsH2]
  • highest part of the roof [AbsHMax]

Using these three values, two additional relative heights are calculated:

  • relative height from ground level to the highest part of the roof [RelHMax]
  • relative height from ground level to base of the roof [RelH2]

Data availability

OS publish the data as a single CSV file containing over 20 million records. This is a very large dataset and can cause data management problems in a desktop environment, so EDINA have split the dataset up using the OS 5km grid, allowing you to download the data in tiles for your study area. The data is available in CSV and KML formats. To use the data in GIS or CAD packages you should download the data in CSV format; KML is used to visualise the data in Google Earth.

OS 5km grid

The ‘Show Grid/Overlay’ menu on the right hand side in the Data Download application displays the OS 5km grid. This will draw a grid with each square containing the OS 5km tile reference, as shown in the image.

Please note: BHA data is not currently available for the whole country; you should consult the interactive map published by OS to see if data exists for your area of interest.

Using the data

OS provide an excellent Getting Started Guide which explains in detail the process of getting BHA data in to GIS for subsequent analysis. The main steps are described below but please refer to the Getting Started Guide for full details.

The data is supplied as CSV files. Each record in the file has a unique TOID which can be used to join the data to building features in OS MasterMap Topography Layer.

Getting started
  1. Download OS MasterMap Topography Layer data for your area of interest from Digimap using the OS Data Download application. Select the ‘File Geodatabase’ format for your data as this is a native ArcGIS format and doesn’t require any conversion.
  2. Download BHA data for your area of interest from Digimap using the OS Data Download application (BHA data is found in the ‘OS MasterMap’ group), selecting CSV format.
  3. Open the OS MasterMap Topography Layer data in ArcGIS.
Preparing BHA data for use

If your downloaded BHA data is made up of more than one CSV file, we recommend merging them all into a single CSV file first to make subsequent processing easier and quicker. Use a text editor such as Notepad or TextPad rather than Excel, as Excel can change the formatting of numbers which contain leading zeros.
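If you have many tiles to merge, a short script is another safe option. The snippet below is only a sketch: the folder name, output file name and the has_header setting are assumptions you should adjust to match your own download.

    import glob

    # Illustrative names only - point these at your own downloaded BHA tiles.
    input_files = sorted(glob.glob("bha_tiles/*.csv"))
    output_file = "bha_merged.csv"
    has_header = False  # set to True if your tile files start with a header row

    with open(output_file, "w", newline="") as out:
        for i, path in enumerate(input_files):
            with open(path, newline="") as src:
                if has_header and i > 0:
                    next(src)        # skip the repeated header after the first file
                out.writelines(src)  # rows are copied verbatim, so leading zeros
                                     # in the TOIDs are never reformatted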

Each object in MasterMap Topography Layer has a unique identifier called a Topographic Identifier, or TOID for short. TOIDs supplied by Ordnance Survey take the format of a 13 or 16 digit number prefixed with ‘osgb’, e.g. ‘osgb1000039581300’ or ‘osgb1000002489201973’. ArcGIS automatically strips off the ‘osgb’ prefix and adds three leading zeros to any TOID that has only 13 digits to make them all 16 characters long. In order to make it easier to join BHA data to building features in MasterMap Topography Layer, the BHA files supplied by EDINA have two TOID values (a small sketch of this reformatting follows the list below):

  • os_topo_toid_digimap is the TOID formatted to match TOIDs in ArcGIS
  • os_topo_toid is the original TOID as supplied by Ordnance Survey (this should be used in other GIS packages such as QGIS which do not modify the TOIDs in MasterMap Topography Layer)
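To illustrate how the two values relate, the ArcGIS-style TOID can be derived from the original roughly as follows. This is a sketch only; the BHA files supplied by EDINA already contain both columns, so you should not normally need to do this yourself.

    def toid_to_arcgis_style(toid: str) -> str:
        """Mimic ArcGIS's handling of an OS TOID: drop the 'osgb' prefix and
        left-pad 13-digit identifiers with zeros so all values are 16 characters."""
        digits = toid[4:] if toid.lower().startswith("osgb") else toid
        return digits.zfill(16)

    # The two example TOIDs from the paragraph above:
    print(toid_to_arcgis_style("osgb1000039581300"))     # -> 0001000039581300
    print(toid_to_arcgis_style("osgb1000002489201973"))  # -> 1000002489201973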

Before BHA data can be loaded into ArcGIS it is necessary to create a small text file (called schema.ini) that specifies the data type of each field so that ArcGIS handles it correctly. Specifically, the schema.ini file is used to ensure that ArcGIS treats the two TOID values as text rather than numbers. The steps required are detailed below:

  1. Create a new file called schema.ini in the same folder as the BHA csv file you wish to import.
  2. Open the file in a text editor such as Notepad or TextPad.
  3. Copy and paste the following text into the file:
    [YourBHAFileName.csv]
    Col1=os_topo_toid_digimap Text
    Col2=OS_TOPO_TOID Text
    Col4=BHA_ProcessDate DateTime
    Col5=TileRef Text
    Col6=AbsHMin Double
    Col7=AbsH2 Double
    Col8=AbsHMax Double
    Col9=RelH2 Double
    Col10=RelHMax Double
    Col11=BHA_Conf Long
  4. The first line, in square brackets, refers to the name of the csv file you wish to import. You should modify this filename so that it references your BHA csv file.
  5. Save your changes to the file. Ensure it is called schema.ini and is saved in the same folder as the csv file you wish to import (a short script that automates steps 1–5 is sketched after this list).
  6. Add your BHA csv file to ArcGIS through the Add Data function; this will add the data as a table in the map document.
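If you have several CSV files to import, the schema.ini can also be generated with a short script. This is a sketch only: the csv path is a placeholder for your own file, and the column list simply mirrors the definitions given in step 3 above.

    import os

    # Placeholder path - change this to your own BHA csv file.
    csv_path = r"C:\data\bha\my_bha_tile.csv"

    schema_lines = [
        "[{}]".format(os.path.basename(csv_path)),  # section name must match the csv filename
        "Col1=os_topo_toid_digimap Text",
        "Col2=OS_TOPO_TOID Text",
        "Col4=BHA_ProcessDate DateTime",
        "Col5=TileRef Text",
        "Col6=AbsHMin Double",
        "Col7=AbsH2 Double",
        "Col8=AbsHMax Double",
        "Col9=RelH2 Double",
        "Col10=RelHMax Double",
        "Col11=BHA_Conf Long",
    ]

    # schema.ini must sit in the same folder as the csv it describes
    schema_path = os.path.join(os.path.dirname(csv_path), "schema.ini")
    with open(schema_path, "w") as f:
        f.write("\n".join(schema_lines) + "\n")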
Creating a heighted buildings dataset

ArcGIS Join window

In order to create a new heighted buildings dataset from the building features in OS MasterMap Topography Layer and the BHA data, we use the GIS ‘join’ function. A join links these two datasets together through a common unique identifier (the TOID), resulting in a set of buildings with height values stored as additional attributes.

  1. Right click on the Topographic Area layer in the table of contents > Joins and Relates > Join. This will bring up the Join Data window which can be completed as shown. Remember to join to the TOID in the csv file that is formatted to match the TOIDs displayed in ArcGIS (os_topo_toid_digimap).
    Tip: to create a dataset which just includes the heighted buildings select ‘Keep only matching records’.
  2. Having joined the datasets together, we can then export the result as a new Feature Class in our File Geodatabase for subsequent use and analysis. This is done by right clicking on the Topographic Area layer in the table of contents > Data > Export Data…; give your new dataset a suitable name and select your existing File Geodatabase as the destination. A scripted sketch of both of these steps is shown after this list.
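If you prefer to script the join rather than use the dialogues, the two steps above can be approximated in arcpy along these lines. Treat it as a sketch: the feature class name, the TOID field name and the paths are assumptions that may differ in your own download.

    import arcpy

    # Illustrative paths - adjust to your own File Geodatabase and merged BHA csv.
    arcpy.env.workspace = r"C:\data\mastermap.gdb"
    bha_table = r"C:\data\bha\bha_merged.csv"   # schema.ini should sit alongside this file

    # Step 1: join the BHA records to the building features on the ArcGIS-formatted TOID.
    # KEEP_COMMON keeps only matching records, i.e. just the heighted buildings.
    arcpy.MakeFeatureLayer_management("TopographicArea", "buildings_lyr")
    arcpy.AddJoin_management("buildings_lyr", "TOID", bha_table,
                             "os_topo_toid_digimap", "KEEP_COMMON")

    # Step 2: export the joined result as a new feature class for later analysis.
    arcpy.CopyFeatures_management("buildings_lyr", "HeightedBuildings")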
Visualising the result in ArcGlobe

So far we have downloaded data from OS MasterMap Topography Layer and BHA data for the same area and joined the two together to create a new dataset containing just the building features which now include the various height attributes published by OS. Now the fun begins!

We can easily visualise the heighted buildings dataset in 3D using ArcGlobe or ArcScene. The following steps describe how to import the data in to ArcGlobe.

  1. Download the OS Terrain 50 DTM for your area of interest from Digimap using the OS Data Download application. This will be used as the base (ground) heights for the area to provide a more accurate terrain model than is available by default in ArcGlobe.
  2. Open ArcGlobe and add in the DTM. You will be asked if you wish to use the DTM as an ‘image source’ or an ‘elevation source’. You should select the ‘elevation source’ option:

ArcGlobe add DEM window

  3. The Geographic Coordinate Systems Warning dialog will appear as OS MasterMap Topography Layer data is in a different coordinate system (British National Grid) from that used by ArcGlobe (WGS 84):

ArcGlobe Geographic Coordinate Systems Warning

  4. You should specify the transformation used to ensure that the data is accurately positioned on the globe. Using the Transformations… button you should specify the ‘OSGB_1936_To_WGS_1984_Petroleum’ transformation:

ArcGlobe Geographic Coordinate System Transformation

  5. Adding your heighted building dataset from your File Geodatabase is achieved through the Add Data button. Once added you may need to zoom to the layer to view it: right click on the layer in the table of contents > Zoom To Layer.
  6. By default the data is not extruded vertically so appears flat on the earth’s surface. To visualise the buildings in 3D, right click on the layer in the table of contents, select Properties and then click on the Globe Extrusion tab.
  7. Select the ‘Extrude features in layer’ checkbox and then in the ‘extrusion value or expression’ box enter the following:
[relh2] * 1.5

ArcGlobe layer properties

This will extrude the buildings using the RelH2 attribute with a vertical exaggeration of 1.5 times (i.e. buildings will be shown at 1.5 times their actual height). We found that RelH2 (the relative height from ground level to the base of the roof) provides a more useful visualisation than RelHMax (the relative height from ground level to the highest part of the roof), which can lead to some overly tall-looking buildings where they include towers that extend significantly beyond the height of the rest of the roof.

The end result

The image below shows an area of Edinburgh including Edinburgh Castle, with Arthur’s Seat in the background. Aerial imagery from ArcGlobe is draped over OS Terrain 50 data for the region with heighted buildings drawn on top. Using the tools in ArcGlobe it is easy to explore the landscape, navigating across the surface and examining the relationships between buildings in the built environment.

BHA data in ArcGlobe

Further information

OS published Release Notes for the alpha releases of BHA. Additional information can be found in Annexe D of the OS MasterMap Topography Layer User Guide and Annexe E of the OS MasterMap Topography Layer Technical Specification.


Sep 25 2014

Even though we update our mapping data throughout the year we still make a large number of changes over the summer.

Along with the general updates to MasterMap and a few other products, the biggest change over the summer has been to swap the 1:10 000 Scale Raster maps for VectorMap Local Raster maps in Digimap Roam. Ordnance Survey have retired the 1:10 000 data, so we have removed it from the online maps, though you do still have a couple of options if you are a fan of the product:

  1. You can view the VectorMap Local data styled to look like the 1:10 000 maps at two zoom levels in Roam; remember that on the Map Content tab you can remove the contours for a very close representation of the mapping.
  2. We still make the 1:10 000 data available in Digimap’s Data Download interface, and will continue to do so for at least this academic year.

The tables below detail all of the latest changes we have made, showing the currency dates of all Ordnance Survey datasets in Digimap.

Digimap Roam

Ordnance Survey product | Data Currency | Last updated in Roam
OS MasterMap® Topography Layer | May 2014 | July 2014
OS VectorMap® Local | July 2014 | September 2014
OS VectorMap® Local Raster | July 2014 | September 2014
1:25 000 Scale Colour Raster | March 2014 | March 2014
1:50 000 Scale Colour Raster | June 2014 | July 2014
Meridian™ 2 | January 2014 | March 2014
OS Terrain™ 50 | July 2013 | October 2013
OS VectorMap® District Raster | September 2014 | September 2014
Strategi® | January 2014 | March 2014
1:250 000 Scale Colour Raster | June 2014 | September 2014
MiniScale® | January 2014 | February 2014

Data Download

Ordnance Survey product | Data Currency | Last updated in Data Download
OS MasterMap® Integrated Transport Network™ (ITN) Layer | May 2014 | September 2014
OS MasterMap® Topography Layer | May 2013 | September 2014
OS MasterMap® Topography Layer Raster | June 2013 | October 2013
Backdrop Mapping
* 1:10 000 Scale Raster | June 2013 | October 2013
OS Street View® | April 2014 | April 2014
1:25 000 Scale Colour Raster | March 2014 | March 2014
OS VectorMap® Local Raster | July 2014 | September 2014
OS VectorMap® District Raster | September 2014 | September 2014
1:50 000 Scale Colour Raster | June 2014 | July 2014
1:250 000 Scale Colour Raster | June 2014 | June 2014
MiniScale® | January 2014 | January 2014
Land and Height Data
* Land-Form PROFILE® | November 2009 | September 2011
* Land-Form PROFILE® DTM | November 2009 | September 2011
OS Terrain™ 50 | July 2014 | September 2014
OS Terrain™ 50 DTM | July 2014 | September 2014
* Land-Form PANORAMA® | November 1993 | September 2011
* Land-Form PANORAMA® DTM | June 2006 | September 2011
Vector Data
OS VectorMap® Local | July 2014 | September 2014
OS VectorMap® District | September 2014 | September 2014
Meridian™ 2 | January 2014 | February 2014
Strategi® | January 2014 | February 2014
Boundary and Location Data
Boundary-Line™ | May 2014 | May 2014
Code-Point® | August 2014 | September 2014
Code-Point® Open | August 2014 | September 2014
Code-Point® with polygons | April 2014 | July 2014
OS Locator™ | May 2014 | May 2014
1:50 000 Scale Gazetteer | June 2014 | June 2014

* Land-Form PANORAMA®, PROFILE® and 1:10 000 Scale Raster products have been withdrawn by Ordnance Survey and are no longer updated.

Other Updates

Remember there have also been substantial changes made to the Geology and Marine data holdings in recent months.

Please also note that there are a lot more formats available than before, so you may not have to convert the data after you have downloaded it.

It is now easier to see which formats are available for the data product you are downloading and to change it if you need to; this blog post shows you how.

Please let us know if you have any questions about the new data or any other changes and additions to the Digimap service.

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302


Sep 24 2014

We have updated all the new style Download interfaces in Digimap to make it easier to change the options for the data you have selected. When you have added some data to your basket you can now see more clearly what can be changed; the version (date and style), the format and the layers.

New Download Basket

Where these options are highlighted in blue, you can click them to get a drop-down menu with the alternatives.

New Download basket with open menu

Make the changes you need, add a name, then click on the Request Download button to order your data.


We have also made a slight change to the Product Information pages for some of the data products. Where the data is available as tiles you can now click a Show Grid button to see the tile outlines on the map. When you make a selection on the map you will get all the grid tiles that are partially or entirely within your orange selection area.

How to View a Grid in Data Download

This feature is particularly useful for data products like the 1:10,000 and 1:25,000 scale geology data and the Marine HydroView Charts, where the data doesn’t form a continuous coverage and your selected area may contain no tiles. When you click the Show Grid button it will also automatically open the menu on the right of the map, which allows you to change the grid shown or switch it off.

Remember, you open the Product Information panels by clicking on the blue Info links in the list of data products to the left of the map.

Please let us know if you have any questions about these changes or anything else:

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302
Aug 06 2014

We have now added detailed 1:10,000 and 1:25,000 geological mapping data and 1:625,000 hydrogeological data to Geology Download. The mapping data is the most detailed geological mapping available from the British Geological Survey (BGS), consisting of four polygon layers:

  • bedrock geology (‘solid’);
  • superficial deposits (‘drift’ or Quaternary)
  • mass movement (mostly landslide)
  • artificial (or man-made ground)

…and one linear features layer, including:

  • thin beds (such as coal seams and fossil bands)
  • faults
  • mineral veins
  • some landforms

The products cover around 30% of the UK, most of which is provided by the 1:10,000 scale data (approximately 28% of the UK). As the coverage is not complete, we have added overlays to Geology Download to guide you to the areas where it can be found. The BGS have more map tiles in production for this dataset and we will add them to the service as they become available to us. The coverage of the data can be seen in the Download interface below as the dark blue areas of Great Britain.

Geology Download showing 1:10,000 scale data availability

The 1:25,000 data has a much smaller coverage, but with little overlap with the 1:10,000 it makes a good companion dataset to fill in the gaps. The 1:25,000 data also fills the gaps in the 1:50,000 data in Wales; again, the coverage can be seen in the interface below as the dark blue areas.

Geology Download showing 1:25,000 scale data availability

Grids / Overlays for Geology Download

To access the coverage maps, click on the double arrow to open the Show Grid / Overlay panel on the right side of the map; you can then add the grids to the map to see where data is available. We highly recommend you do this before selecting your area for download, as it is not possible to add data to your basket if you choose a product with no coverage for that area. If you find that the Add to Basket button remains grey after you have selected your products, it probably means that one or more of them do not have coverage for the area you have chosen. If you click on the grey button an error message will tell you which products are unavailable; either remove these from the order or switch on the overlay and choose a different location that has the data.

Geology Download products in July 2014

In addition to the 1:10,000 and 1:25,000 scale mapping datasets we have also added the BGS OpenData 1:625,000 scale digital hydrogeological data to Geology Download. This data can be used to indicate the aquifer potential of an area in generalised terms. Areas are categorised into one of three groups:

  • those in which intergranular flow in the saturated zone is dominant
  • those in which flow is controlled by fissures or discontinuities
  • less permeable formations including aquifers concealed at depth beneath covering layers

The 1:625,000 scale data may be used as a guide to the aquifers at a regional or national level, but should not be relied on for local information.

The addition of these new datasets brings the total number of different BGS products in Geology Download to 14.  We hope to add the detailed mapping datasets to Geology Roam later in the year as well as some of the other types of data as overlays or basemaps.

If you have any questions about the new datasets or any requests for geology data we don’t yet have then please get in touch:

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302
Jul 29 2014

Marine Digimap has updated datasets in its brand new Marine Download facility.  The new interface provides a single place to get both Hydrospatial data and charts. The Hydrospatial data is now the newer Hydrospatial One dataset from SeaZone which also provides access to much higher resolution gridded bathymetry data, one arc second instead of six.  The Charted Raster dataset has been replaced with Hydroview Charts; these are still the same scanned Admiralty Charts though they are much more recent and include some small scale charts covering whole oceans (still not for navigation though!).

Version of Marine Downloader Launched in July 2014

Marine Download overlays list

Marine Download works in the same way as Data Download, Environment Download and Geology Download; you select your area, pick your products and download the data. The main difference is with the datasets themselves, as these don’t all form a continuous coverage of UK coastal waters. We have therefore included overlays that show where the tiles of each dataset can be found. These are very useful when choosing your data; just open the “Show Grid / Overlay” menu on the right of the map and pick the overlay for the data you are selecting.

Remember there can be a lot of overlap between the Hydroview Charts, and also large areas with no data; you will get all the charts that are present for the area and scale band that you choose. If there is no data for the area you have selected then you will not be able to add anything to the basket; if you try, you will get an error message informing you of the products with no coverage.

For a full description on how to use the new interface, have a look at the help page: How to use Marine Download

List of Hydroview Charts in Marine Download

The data in Marine Download is broken into three categories: Hydrospatial, Hydroview Charts and Bathymetry. In the Hydrospatial category the same themes as before can be found, while the Hydroview Charts are broken up into categories depending on their scale, including those extra charts with wider ocean or global extents.

The old interfaces for downloading data, Hydrospatial Download and Charted Raster Download, will be withdrawn in the coming weeks; if you have teaching materials that include them then please update these as soon as possible.


If you have any questions about the new download or the withdrawal of the old interfaces then please get in touch:

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302
Jul 15 2014

We are getting a few reports from users experiencing issues when printing PDF maps using the Chrome web browser. Unfortunately, the map watermark and any semi-transparent annotations are being printed as solid filled features. We have discovered that this occurs when the PDF is viewed and printed from within the Chrome browser, using Chrome’s built-in PDF plugin.

We have investigated this problem and discovered that the bug is with Chrome and occurs with all PDFs with semi-transparent layers, not just those created by Digimap. The problem only happens on Windows computers with versions of Chrome released since the end of April (Chrome 34.0.1847 onwards).

Digimap users with Chrome should see a warning when they open the Print… window, informing them about the problem.

Print interface showing chrome alert

As the alert message suggests you can disable the plug-in to prevent it from opening PDFs or you can use the following workaround.
Printing from Chrome: alternative method

  • Create your printable map as normal; it will appear at the bottom of your web browser.
  • Don’t click the file name to open the PDF; instead, click the little arrow and choose ‘Open with system viewer’.
  • Providing it is installed on your machine, the map will open in Adobe Reader, which will print the semi-transparent layers correctly.

We hope that the issue will be resolved by Google, who develop the Chrome browser; it has been reported to them. Should you wish to disable the plug-in entirely then you can follow the instructions with the image below. You can also download the map as a file to your computer and then open it directly with Adobe Reader; this is usually the best option anyway as it allows you to print the PDF multiple times or store it if you don’t want to print it just yet.

Disabling Chrome PDF plugin

  • Type chrome://plugins into the address bar in Chrome; this opens a list of all the plug-ins you have installed for Chrome.
  • Scroll down to the Chrome PDF Viewer on the Plug-ins page and click on the blue “Disable” link.
  • You will no longer have the option to view PDFs in the web browser.
  • Use this page to reactivate the plug-in if you want to reverse the change in the future.


If you need any assistance with opening your PDF map, please contact the EDINA helpdesk:

  • Email: edina@ed.ac.uk
  • Phone: 0131 650 3302
Jun 26 2014

Geoforum 2014 Registration

EDINA’s Geoforum 2014 was a great success, with an audience of around 80 delegates and EDINA staff all enjoying an informative and entertaining programme. The aim of the event was to engage with users and support staff and highlight new features in EDINA’s core Jisc funded services.

Other than this summary there are several other ways to find out what happened at this event.

Keynote Address

Peter Gibbs is well known for his BBC and Met Office weather forecasts, and in his presentation he showed us the work being done by the Met Office and Environment Agency in combining their data and resources to predict not just where there is going to be rain but where the rain is likely to cause flooding.

The slides for Peter’s talk can be viewed here:

Flood Forecasting – Peter Gibbs

Peter has a keen interest in improving public levels of understanding of science through better communication and this was very evident in his presentation. He managed to explain the complexities of how weather data is collected and used for forecasting in a very accessible way, leaving the audience with a much better understanding of how our weather, and in particular flooding, is predicted. We also got to see just how much data the Met Office use: up to 70 recordings at different heights for grid cells up to 1km².

Peter Gibbs at Geoforum 2014

A key message implied in Peter’s talk was the critical importance of being able to create usable and appropriate information for your audience from the large amounts of complex and varied data available. This is something that, as scientists, we should all aspire to.

Data Usage

The next presentation came from a recent MSc in GIS graduate, Darius Bazazi. The presentation followed on from the keynote nicely as it showed how various datasets, some from Digimap, some from other sources, were combined to improve the science behind planning natural measures to control flooding. The slides for this presentation are available here:

Using EDINA Datasets in a Hydrology Project – Darius Bazazi

The main issue tackled by Darius was to improve a key variable that decides whether an area is suitable for natural flood management, PROPWET (the proportion of time soils are wet). Darius used datasets such as the Centre for Ecology and Hydrology’s Landcover data to try and improve this value, bringing in vegetation cover and land use as factors that could influence it.

Darius’ talk showed how the availability of data from a wide range of sources is key to research such as his. A large amount of researchers’ time, from graduate level upwards, can be taken up by finding and requesting or ordering data; Digimap and other geoservices from EDINA exist largely to reduce or eliminate the wait for data.


Carol Blackwood finished off the morning’s presentations with a talk about the improvements EDINA are making to the support of their geoservices. The slides for this presentation can be found here:

Geoservices Support – Carol Blackwood

Carol first told us about the work under way to replace the registration system for the Digimap collections. The main piece of good news is that there will no longer be a wait between registering and accessing the service. The new system will use the familiar email confirmation link to allow access once the registration form has been filled in.  This will ensure that users have speedier access and also that their email is correct for collecting their downloaded data.

Carol also highlighted a few other things we are doing to support our users and also to improve the information available for site reps to promote Digimap. Firstly, there is the new live chat system that allows people to have a text chat with the user support team. The chat window can be accessed from any of the help pages or the resource centre, where it appears as a pop-up.

A number of other resources were also highlighted in the presentation.

Finally, Carol talked about the training offered by the Geosupport team; if you are interested in attending or hosting a training course then please get in touch with us: edina@ed.ac.uk. We can also arrange to do some bespoke training workshops or webinars for certain subject areas.

Lunch

Geoforum 2014 Lunch

During the lunch break we were treated to some delicious food and got to see some interesting data and software from the exhibitors.

Most importantly we were given plenty of time to speak to others at the event, sharing ideas and thoughts on the presentations and demonstrations we had seen.

Service Updates

Guy McGarva brought us up-to-date with all the latest additions and enhancements made to Digimap over the past year, and then went on to highlight the improvements we will be introducing next. The slides for this presentation can be found here:

Digimap Update – Guy McGarva

Guy showed how we are now using the results from the impact surveys and other contact with users to guide and prioritise the developments to the service. He also highlighted the huge amount of work that has gone on in the past year to the back-end architecture of the service that will make it much more reliable, maintainable and flexible going forward.

In the near future you can expect to see a new Download interface for Marine, after which will follow a similar upgrade to Historic Download including a better basket for all the downloaders, where you will be able to select data formats and dates more easily.

Have a look through the slides to see more of the upcoming improvements to the Digimap Service.


The last part of the day was a welcome trip outside in the sunshine to see how Fieldtrip GB could be used for Citizen Science projects. The slides for this presentation can be found here:

Citizen Science in your Pocket – Addy Pope

Geoforum 2014 Fieldtrip

Addy Pope led the group through an exercise where a form was designed to collect data about graffiti, which was then deployed to the group’s smartphones and tablets. Everybody then took their devices outside and began collecting data such as photos and the type of graffiti found. The devices take care of logging the location, though the app does allow you to modify this if it isn’t quite right.

Fieldtrip GB had some glowing praise from the delegates; everyone seemed to find it intuitive and easy to use. A very good way to crowd-source data without an expensive outlay.

Final Summary

A very big thank you to all those who attended Geoforum, we at EDINA feel it was a tremendously successful event.

The day’s presentations and exhibitions told a story, starting with the importance of geospatial data, especially to bodies of such strategic importance as the Met Office. We heard how research carried out at academic institutions was adding to the accuracy of the data and the models that use the data, and how important access to the data was in order to carry out this research.

The exhibitors showed us the next generation of data along with some of the software to process it. And finally, the presenters from EDINA showed what we are doing to improve the accessibility to the data; not just by making it available but by providing the support and training to realise its full potential.

Jisc continue to provide funding for services such as Digimap, GoGeo and Unlock, which form the backbone of geospatial services available to the academic community in Great Britain. The Geoforum allows funders and service providers to engage with the user community, which in turn helps focus service improvements.

A final thank you to everyone who helped make the event happen, the EDINA staff who helped organise, present and pack away all the equipment; the exhibitors; Bluesky Catering; and the staff at the Informatics Forum.

Jun 19 2014

GeoForum 2014 (#geoforum2014) takes place at the Edinburgh University Informatics Forum from 10am until 4.15pm today. Throughout the event we will be liveblogging so, whether you are able to join us or not, we suggest you bookmark this post and take a look later in the morning for notes from Peter Gibbs’ keynote. Keep an eye on the post throughout the day as it will be updated after every session. We also welcome your comments (below) whether during or after the event.

As this is a liveblog there may be occasional spelling or typing errors. We also welcome additional links, resources, or any corrections to the text if you spot them. 

Welcome – Helen Aiton, EDINA

I’m Helen Aiton, User Support Team Manager at EDINA and it is my pleasure to welcome you to the EDINA Geoforum 2014. We hope you will have a very enjoyable and interesting day. We have lovely weather today for our citizen science excursion. We also want to thank our data providers who are here today, and who you will have an opportunity to speak to during lunch and during our breaks. With that I will hand over to Tom Armitage to introduce our keynote speaker.

Tom: Our Keynote Speaker is Peter Gibbs, BBC Broadcast Meteorologist. We booked a weather forecaster and they brought the lovely weather with them! We’ll do that again! Peter will be talking today about flooding.

Keynote Speaker: Peter Gibbs, BBC Broadcast Meteorologist

When Tom first asked me to give this talk he mentioned that hydrology data has been added to Digimap and I thought, what would be appropriate to that? Well I thought something that shows several databases and data sources being brought together would be appropriate, so…

Let me take you back to 2007, summertime in England and Wales. There was severe rain, several people lost their lives, 55,000 homes were flooded. And the government came in for a bit of stick so they set up an inquiry. This was completed within a year. The Pitt Review recommended that the Environment Agency and the Met Office should work together. Previously there had been a bit of a dichotomy. At the Met Office we predicted the weather; once it had happened we washed our hands of it. And it worked the other way too… it wasn’t exactly joined up. So the recommendation was that this joint grouping should issue warnings together and to do so at a lower threshold – to allow for better preparedness, which might mean false alarms but also better safety and more time to plan.

So the Flood Forecasting Centre was set up in 2009 and it’s a new set of experts basically. Meteorologists have been trained in hydrology, hydrologists have been trained in meteorology and that team works together and operates 24/7.

So, we predict the weather… but weather is a messy thing. This is difficult science. Let’s look at a shot of data from 4th February 2009 based on radar data. In weather terms a block of major storms appears here. It’s not very big… but it happens to be on a hill between Exeter and St Austell. And on that hill the rain fell on snow. That caused traffic chaos. But only because it was both intense rainfall, in that spot, at that moment, at that temperature… that complexity is the stuff weather forecasters have to deal with all the time. And we are increasingly asked to not just predict the weather but also predict the impact, and that can be really difficult to do.

So, how do you actually do a weather forecast? Well you start with what’s happening now. Observations are key to setting up a good weather forecast. Surface observations, taken by people on the ground, are an important part of this. Coverage around the world varies… on land in Western Europe it is pretty good, Africa less so, high altitudes less so, and over the oceans there are also big gaps. So you can see already that it’s an incomplete picture.

Radar is good, provides ground truths, very useful but tends to only be over land areas and radar is expensive. We also have a network of upper air measurements via weather balloons… but that network is even more sparse. It’s sending up a packet of electronics once or twice a day that you never get back! And we do also get observations from aircraft. Almost every commercial flight gathers windspeed, direction, temperature, and send that back. It does cover the Ocean but still more Northern than Southern hemisphere. About 130,000 observations a day though so very useful.

Weather satellites cover a great deal of the gaps now though; several geostationary satellites are the real workhorses for weather data here. The location/proximity to the equator varies, but we get great imagery that provides useful parameters on a gridded basis. You can get a profile all the way down the atmosphere – not as good as balloons – but still very good, especially for sparsely populated areas. And weather forecasters are increasingly data hungry…

Now all that data means there is an awful lot of number crunching to be done. The Met Office has been at the forefront of this, with more to do though. But we have a “Unified Model” allowing us to do a full forecast of the globe. That’s important to do for forecasting. We can take that model down to lower areas – we can nest models – and it lets us generate weather forecasts at lots of levels from that model. Now the model data has to all be gridded. The gridding is getting smaller and smaller as more sophisticated computer modelling lets us reach greater and greater detail. Not long ago our grid resolution was 40km. So for the UK, that’s not that many grid squares. A high resolution 4km model, which we use now, is far more accurate and far more like the mapping you’d see in your atlas.

The model we use for the weather forecasting and for broadcasting is the 4k model. It’s 1.5 x 1.5 km squares across the UK. But you come across a problem. Because weather is global, you have to have stuff happening at the edges, to feed coming weather into your model… weird things happening if you don’t do that and it won’t be that useful. So it’s kind of blended… we have lower resolution at the edges, then more detail at the centre. But we are talking about a three dimensional atmosphere. That’s the horizontal grid area. We have 38 vertical grid levels above that. You can see how hefty the computing power needed is for this sort of forecasting – bringing in that level of modelling detail along with thermodynamic calculations.

So, that model generates a model predicting weather… but you have to be careful in how you interpret the signal from the noise – an increasing challenge as data detail becomes more and more high resolution. There is always a learning process whenever we change to a new data model, and a new data resolution.

It’s not just weather that goes into this model. Other things have to be brought in. Soil type data has to be brought in. Sandy soil heats quickly in the day, cools quickly at night, transmits heat. A more clay soil will be slower to heat or cool, for instance. Soil moisture content will also make a big difference. More moist soil slows heating and provides more moisture for clouds. Vegetation type matters a lot, especially at the mid latitudes where we are. And vegetation type changes hugely over the year. And that coverage hugely changes solar radiation and temperature. Sea surface temperature is very important for tropical cyclones. Sea-ice thickness matters a lot – more white ice and there will be more reflection of solar energy. Thickness changes warmth of ocean as well as reflectiveness. Sea surface currents also matter – for transporting warm or cool currents. Some of this data is based on observations, some is based on average annual variation.

There is one big enemy of weather forecasters… and that’s chaos. In theory if you had a perfect representation of the atmosphere and a perfect model then your forecast could be perfect. Now that perfect representation and model cannot be totally perfect, always lots of variation. So, to deal with variation we use something called “ensemble forecast” – making many different forecasts and comparing them effectively.

So, let’s take a one day model – a “spaghetti chart” – this shows changes building up… and when we do that ensemble forecasting – and we compare these on a “postage stamp” print out – we can start to see the variance across the forecasts. We can talk in terms of probability… so here we could see northwesterly winds, chilly… Sometimes these all look totally different; then the weather is really changeable and unpredictable and that’s useful to know. If they are all very similar we can have a lot of confidence in the prediction. That confidence rating on the forecast is a really useful thing. So here, if we look at a “whisker diagram”, we can again see this idea of comparison and confidence in the predictions.

So how does that forecasting translate to the work at the Flood Forecasting Centre? Well we take that forecast and put it into a hydrology model with grid to grid modelling. There are four types of flooding: fluvial flooding – rivers overflowing; pluvial flooding – rapid downpours that drainage can’t cope with, dips in roads, “flash flooding”; coastal flooding; and groundwater flooding – which tends to be a delayed reaction, and we saw a lot of that in the winter flooding we’ve just had… where water moves through the system and causes belated effects such as underground flooding.

So, for instance, Cumbria 2009. We take the weather model. Data comes into the grid to grid model… each grid point takes water falling out of the sky. The model has relief, river flow, soil moisture, etc. data. And so it tracks where water falling will land, where it will go next etc… water starts to fall and flow into river systems, heading off down out to sea… this is shown in cumecs (cubic metres per second). It’s reasonably high resolution data. That’s then converted into return period flows (QT grids). This is “above 2 yr flood”, “above 100 yr flood” etc. And from that we then get a flood guidance statement. But that’s where human intervention becomes so important, interpreting that guidance statement, and understanding impact and infrastructure.

So in terms of assessing the flood risk for Lowestoft as an example. This uses ensemble forecasting again. We see at 5 days here some uncertainty. At the 4 day forecast that variation is more varied. And then 2 days ahead all ensemble forecasts come close together – something big is on the way. The big uncertainty there was how the flood defences would hold up. Defences had been built after the last major flood in 1953 but not tested. They did hold. Everyone wondered what all the fuss was about!

Just to finish off… what the Flood Forecasting Centre is moving towards is much more focussed assessment of output. So when my son graduated in Durham there was huge rainfall and flooding… A Victorian wall fell in that rain. The ultimate aim is to be able to run predictions that go to that sort of level – this road, this area, will be at risk. In shorter term forecasts it might be about getting an hour or so warning with radar help. But we are moving in that direction.


Q: You talked about how 1km accuracy is still some way off. What holds that back? Number crunching? Data availability?

A: Largely number crunching. The Met Office just had another super computer upgrade but it becomes harder to justify the substantial expense without this sort of clear impact of data. But there is also the issue of how accurate it gets at that level of data. So we are at a sort of holding point? Some say higher and higher resolution is better, some say more ensemble predicting is better.

Q: When you put up the grid for the UK with 4×4 at the outside, why not take more data from the Atlantic, as so much of our weather comes from there? And is there thought about variable grids – as there are such variable impacts?

A: Well we do have edges of edges – at the edges of the 4×4 is the global pattern. But you can only run so much computing power across that grid. It’s all about competing factors. You have to have the best possible solution. On the other question, part of the ensemble effort is about identifying the crucial areas in a forecast – those that impact the weather downstream – and you can focus on those to get better quality forecasts for the UK. People are looking at this.

Q: I’m impressed by the increased accuracy of the weather. Is it about data accuracy, better models?

A: All of the factors I talked about: observations; modelling; data; and also the research that goes into our forecast models. And it is so much better than it was. I’ve been weather forecasting for 30 years. When I started 24 hours ahead was good, now 5 days ahead is pretty standard.

Q: How important is climate change in those flooding models?

A: Angharad is probably the person to answer that – we can talk more later – but obviously the theory is that a warmer atmosphere means more moisture in the atmosphere, more rainfall and more flooding.

And after a lovely break, featuring caramel wafers…

Using EDINA Datasets in a hydrology project: Darius Bazazi, Hydrological Assessment of Natural Flood Management

I work as a spatial data analyst at GeoPlace in London. GeoPlace is owned by the Ordnance Survey and we provide definitive national address and street gazetteers (datasets) – essentially we do what EDINA do for the Higher Education sector, but for the public sector. So the AddressBase features point data for every address in Great Britain – with full postal address and a unique property reference number – allowing tracking across the lifecycle of a property.

So, a quick example of how address data is used in flood management… we have zones that show high, less high and low risk. This allows, for instance, emergency services to target resources in flood events. And also the property data lets us estimate more specific risk. Why does this matter? Well, when you don’t have good data… Thames Water was fined £86 million for overestimating flood risk, for instance.

So, today I’ll be talking about hydrology and talking mainly about maps. I know we have a room full of experts here so I hope you’ll pick me up on anything I get wrong! So, I used three EDINA datasets in the project I’m talking about – Land Cover Maps 2007, Digital Terrain Model (DTM), and some OS? mapping. I really enjoyed using the EDINA Digimap and the GUI there.

The project was looking at soil moisture and to calculate a regression-based equation for PROPWET (soil moisture). Currently engineers use a constant value for catchment PROPWET (Flood Estimation Handbook) so we wanted to test and develop a regression based model. And GIS is a good tool for that because we can bring in so many variables etc.

Natural Flood Management involves using nature to mitigate flooding – it is currently difficult to analyse that. But this equation sort of feeds into that, looking at how we can vary different ground conditions in the modelling. For instance, NFM is used at Eddleston Water in the Scottish Borders, a 69 square km catchment that is part of the River Tweed catchment. So, PROPWET measures the proportion of time that soil is wet. The value is currently taken from the Flood Estimation Handbook. But that Handbook is quite old and gives quite static measures of moisture.

Looking at the river network in Scotland, and the river network monitoring stations, we can see how data is gathered and what its coverage is. Using a Halcrow analysis we can see variation of data across Scotland. It’s pretty good, but we wanted to broaden that analysis across Scotland. So, the Land Cover Map 2007 was really interesting to work with. I obtained vector and 1km raster. My desktop GIS couldn’t handle the scale of the vector mapping so I had to use the raster mapping (which is based on the vector). This covered Scotland across river catchments. LCM2007 is a parcel-based classification of 23 land cover types.

So I took that raster data. I generated some catchments. You can get them from the Centre for Hydrology and Meteorology but we thought it might be interesting to generate our own catchments. So for this we took the Digital Terrain Model and made some transformations to produce catchment delineations. I wanted to get areas to within 10% accuracy, but only got to 17%. So I decided to use the Centre for Hydrology and Meteorology shape files instead. And then I did some geoprocessing, using the ArcGIS Python model builder to iterate zonal stats maps, and looking at potential evapotranspiration models, Leaf Area Index (vegetation cover) and surface resistance.

The result was an equation that is appropriate to Scotland but, had this not been a university MSc project it would be nice to expand the model and cover more of the UK.

The OS have a new dedicated water product, a new Water Layer (beta) which has been created with the Environment Agency, SEPA, INSPIRE. And that will have huge potential I think.


Q: How did you choose the size and location of the catchments? How do they choose their catchments?

A: I did use the Centre for Hydrology and Meteorology catchment areas in the end. I’m not sure how they select their catchments – an interesting question though.

Digimap Support: Carol Blackwood, EDINA

I will be talking both about what we have happening at the moment and what developments are coming up.

Firstly, we have a new Digimap Registration System coming soon. Currently users come in, they need to register before accessing the data. We ask you what you want to use, who you are, and we give you a lot of legal terms, we ask for lots of information and ask you what collections you want access to, you see a summary and submit that… then you have to wait. About 24 hours but up to 48 hours. Currently we have to eyeball every single registration, manually approve it, we run a script overnight, and then you get access.

So, for services these days that’s rather below expectations… we are all used to giving information up and then getting access after a few minutes. And we know some students don’t remember to register until the last minute. Their expectations are high and for fast registration. And our new licence has enabled a few changes… so we have been working on a new registration system.

As we began to redesign the process we have also been using “user stories” – pathways that users will take as they encounter our system. We had some requirements too. We took the user stories and some of our requirements to work out our priorities… what MUST we do, what SHOULD we do, what COULD we do… ?

So, the new system will be a lot simpler. A new user will come in and login via Shibboleth credentials. You will register with your user details. You will get an email to validate your email address. And once that is validated you will be able to activate access to collections – choosing selections, agreeing to terms, and defining the purpose you want the data for. And then you will have access. This process should be 10 minutes ish, not 24 or 48 hours. This should much better meet user expectations, including those who have registered a wee bit late for their deadlines!

When we roll out the system we won’t have any change for existing users – no need to register again. We will renew activations after 12 months – a new change but you won’t need all the initial data entry again. It will be very simple, no interruptions. And we are adding a new registration for Historic Collection users – that’s so we understand who is using that service, so that we can support them better. Again it will be quick and painless.

Clearly this is a big change for us. We are anticipating it around early September 2014. To ensure a smooth transition we will provide help pages, how to guides, videos, webinars, and anything else that would be helpful – let us know what they might be? We can provide text for handbooks for instance. Just let us know. But this will be lovely. It will be simple, easy to use, and much quicker. And it benefits us, as at peak times our help desk has to focus heavily on registration at the moment; they will be freed up at those peak times for other support activity.

The other part of my talk is about some recent updates. The first of these is our new Live Chat function – you can find this from our Help Centre, and you’ll be familiar with this sort of functionality from other websites, but basically it lets you ask live help questions via text chat. So come and chat with us, and let your users know they can ask us questions this way.

We have also been creating a whole new series of online videos – for specific tasks and tools – as well as number of new service demos. Both of these sit within our help centre. The demos help explain the interface and how to get started – can be really useful to promote or support users in getting started with Digimap.

We have also been creating some information for you to take and use on your own local pages: descriptions of all the collections, logos for Digimap, sample images, textual information. This is always the definitive source of up to date information on the services – you can always come here to get text and resources for your own use in webpages, documents, etc. So please do have a look and make use of this.

And we still have our traditional in-person training courses – in Derby, Oban, Bradford, Swansea, Southampton, London, Newcastle. If you’d like to host one just let us know and we’ll be happy to work with you, to let you know our requirements etc. We have also been out doing Digimap workshops tailored for history students; Digimap and GIS workshop for earth and environmental science MSc students, etc. If there is something tailored that you would like, just let us know and we will be happy to help.

We have also been out and about running workshops: British Cartographic Society Map Curators workshop; Enhancing Field Learning; Research Data Management for geo data, etc. And our colleague Tony Mathys has been running metadata workshops and giving a paper at GEOMED 2013. If you are interested in metadata, Tony is your man, just get in touch.

We are also running an increasing number of webinars, run by our colleague Viv Carr. These are very flexible sessions, usually 1-2 hours, and often easier to fit into a working day. We have a series of these on Digimap Collections, Roam, Historic Digimap, etc. And we also take requests, so just get in touch!

And, if you want to find out where we will be do take a look at the new Events page on our blog, keep an eye on the EDINA Events page – where training sessions and webinars are advertised – and keep an eye on GoGeo for more on Tony’s Metadata workshops.

And… a final note on what’s coming shortly… our engineers are currently working on Site Rep Stats for all collections, as well as the next run of annual calculation of data values used by your institution – and there will be more information about that.

So that was a very quick round up of what’s happening. Basically, if there is anything we can do to help just get in touch!


Comment: Just to reiterate something you said. The tailored session on GIS for history students that you did for my masters students was just brilliant. I would recommend that. It was great for my students to see their subject from that different geospatial perspective, really useful for their understanding. And there are a lot of really good resources there for those wanting to carry on that learning.

And now time for lunch including Service demonstrations; Project demonstrations; Meet the map providers; Share your Vox Pops on Digimap.


During the next sessions the group will be splitting into two groups. One (those with green dots on their badges) will start with the Citizen science in your pocket: Excursion, the other will start with the Digimap Update session. This blog will, however, be sticking with the Digimap Update session throughout the afternoon.


Digimap Update – Guy McGarva

I will be covering what we’ve done in the last year, what we’ve got coming up and what we will be doing in the future.

We have been listening to your feedback, through surveys, support requests, discussions – including chats at training events, events like today, etc. and these feed into our Quality Improvement Plan.

Since the last GeoForum we have launched a new collection, Environment Digimap. This is data from CEH and includes Land Cover data for 1990, 2000 and 2007. We have made improvements to all of the Roam mapping clients within Digimap. Roughly a year ago we highlighted that these were coming; these have now been implemented. There is a new interface for all Roams, so they are now cleaner and more consistent. They are all essentially the same, with minor changes for different collections. This makes updating and maintenance easier but also makes them more consistent for use and exploration. Whether coincidence or not, we have had greatly higher usage over the last year. We had well over 1 million screen maps generated in the last year – we must be doing something right! We have also seen general usage on the rise, downloads too.

And we have also made significant changes to the print interface in all Roams, based on feedback we’ve had from you, from users. Part of that is the layout preview – seeing the area you are printing – so no more wasted time creating PDFs of the wrong area. We’ve seen PDF generation go down despite usage going up; we think that’s because users are getting the right map first time much more often. We now have variable scale printing in all Roams except Historic. You can choose the exact scale you want for your map and can print at any size from A4 to A0, whether PDF, JPEG or PNG. And you can now print overlapping maps in Ancient Roam. Some fairly significant improvements there.

We have also improved search. It doesn’t bring much visible change, but we now include street names and partial postcodes in searches – providing more options for exploring the data.

And finally… we now have printable legends. You can create a printable PDF legend in all Roams except Historic. And Geology legends are now dynamic – i.e. they only show features present on the map. We might, in future, extend that concept of dynamic legends to other areas of Digimap. The printable legends come as PDFs, so you can add them to your map, etc. So that’s what we’ve added in the Roam clients.

Over the year we are trying to migrate all of Digimap to the new Data Download clients, based on the OS Download client. We started by launching a new Geology Download, which went live in August 2013. And we have added new data: Northern Ireland Land Cover data from CEH and more Geology data from BGS, including flooding and permeability. Whenever we make updates to data or to interfaces we post these on our blog – any big change will appear there.

One of the most significant changes relates to the fact that, for a long time, OS MasterMap data was only available in GML format. That’s how we get it from Ordnance Survey, but it’s not easy to use in GIS programs – it always required some sort of processing or conversion. For some users that was almost impossible; we had a lot of questions from CAD users about this, and generally this format issue was particularly difficult on a Mac. So, we now have a way to convert data to DWG on the fly. This has already been implemented and makes life a lot easier, particularly for CAD users, as most CAD systems will support DWG files. Feedback on this has been really good; it lets people use data right away rather than having to make their own conversions.
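
For anyone wanting to try this kind of conversion themselves, here is a minimal sketch using the open-source GDAL/OGR library rather than the FME-based workflow described here – note that GDAL’s free driver writes DXF rather than DWG, and the file names are hypothetical.

```python
# A minimal sketch (not the on-the-fly FME pipeline described above):
# convert an OS MasterMap GML tile into DXF for CAD use with the GDAL/OGR
# Python bindings. GDAL's DXF driver is write-capable, whereas DWG output
# normally needs additional non-free libraries. File names are hypothetical.
from osgeo import gdal

gdal.UseExceptions()

gdal.VectorTranslate(
    "mastermap_tile.dxf",   # hypothetical output for a CAD package
    "mastermap_tile.gml",   # hypothetical downloaded MasterMap GML tile
    format="DXF",
)
```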

That’s what has happened. We are now working on providing MasterMap as File Geodatabase – that will make it easier to load into ArcGIS and QGIS – and we are using FME for the on-the-fly conversion. We won’t be doing shapefiles, as the shapefiles generated from this data are too large to process, hence the use of File Geodatabase. That will be coming soon. Ordnance Survey also have data on building heights (an alpha version). This is a very new thing, an alpha product and a first release, but we want to make it available to users. We will be taking that MasterMap building heights data and providing it, initially at least, as KMLs – and you can see a demo of that in our demo area today. That’s buildings across the UK, and that data should be really interesting, particularly for architects. Eventually that data will just become part of MasterMap.

And we are continuing to add new basemaps in Roam. These are variants of a particular scale or a different dataset, giving you the option of different basemaps at different scales. So VectorMap District raster and vector versions will be available as alternatives to 50K. And we will be replacing OS 10K Raster with VML (VectorMap Local) Raster – that’s because OS are withdrawing 10K Raster later this year. And we will hopefully also be adding three versions of Miniscale too.

We have new improvements coming to Marine, a fairly major update in fact. We will be getting new High Res Bathymetry data for Marine – now 30m resolution rather than 180m, so that’s a really massive difference. And we will be getting updates to the Hydrospatial and Raster Chart data. And there will be a new Data Download Client – in the same style as the other Digimap Collections.

We have had a project with the National Library of Scotland, who digitised new metadata for all the historic maps we hold. That’s resulting in improved metadata for Historic data in England and Wales, and improved dates for publication and survey (Scotland already had this). And, again, we will have the new Historic Download client.

Still to come…

We are working on some updates to the Data Download basket – to make it easier and clearer to select different formats and dates, which isn’t very easy to do right now. When we get a new dataset we don’t just delete the old data… you can go back and select last year’s data, or data from two years ago – you can get earlier versions of the data as well.

We have some small changes to make to the Digimap home page. Because we have rationalised our clients we don’t need so many expanding menus etc., so we can make it much more tablet friendly. And, as mentioned earlier, we will also have Site Rep Stats for all collections – so as a site rep you will be able to see data on how Digimap is being used in/by your institution. We would love your feedback on any of these aspects, and we will continue to inform you of changes like this through the Digimap blog.

And one thing we hadn’t really mentioned yet…

We have a new service coming later this year: Digimap for Colleges. This will be based on the Digimap for Schools functionality – which means simpler, somewhat stripped-down functionality, and a Roam-style mapping client. This bridges the gap between schools and university, and is particularly useful for those doing A levels, for instance. It means FE institutions will be able to use Digimap without needing Shibboleth support – and FE institutions will be able to choose between Digimap for Colleges and Digimap for HE. So that will be coming pretty soon.

We are also working on other geospatial services. We are working with the UK Data Service Census Support, which will become “Open” shortly, meaning no login will be required to download most of the data. Some datasets will still require a login, but most will become open. That covers all Census data, Scottish Census data, etc.

We have also been making changes and enhancements to other geo services. Unlock has been expanded to include historic English place names, data generated through the DEEP project, with improvements made to the API to allow you to search for these old place names. And GoGeo will gain language conversion support, support for the Gemini 2.2 metadata standard, updated GeoDoc resources, and a mobile version of the site.

In addition to our services we are also working on a lot of different geo projects at the moment. COBWEB is a major EU-funded project with partners across Europe and involves citizen science in Biosphere Reserves. We have the Palimpsest and Trading Consequences text mining related projects, and the Spatial Memories project. A lot of these are based around, or include development of, mobile apps.

In terms of our longer-term aims, we have been doing a lot of work to integrate our geo services together more – including Unlock, FieldTrip GB, etc. We have been making more use of available data feeds – many suppliers provide WMS now and we are trying to think about how we can use that most effectively. We want to make user workflows simpler and more complete, to provide solutions to common problems to do with the data life cycle, to improve the utility and value of the services, and to improve our support resources and capabilities.

So, that’s where we’ve been, where we are going. What do you want to see next? We would welcome your comments, ideas, any wish list items about data, functionality or support that you or your users may want. Let us know now or just get in touch after today…


Q: Are there any enhancements planned for the usability of Roam – zoom, pan, selection boxes? Are you moving to better internet-based software for that, or will you be sticking with what you have now?

A – Tim: We have a project to look at this. We are planning to make it tablet friendly as that is clearly needed.

A – Guy: It’s the data too. We use the OpenLayers 2 toolkit; there is an OpenLayers 3 version coming soon, so we can see what becomes possible then.

Q: You said you can select dates for data download. One significant issue in Digimap is the lack of earlier maps.

A – Tim: We have LandLine from 1995, and MasterMap from 2006/7. We hope to make that and earlier data available. We’ve been doing a lot of work to make that happen, but it is more of a licensing and usability challenge than a technical one.

Q: Will KMLs of buildings be textured?

A – Guy: No. Right now we have the OS building heights as a CSV of points, and we will be extruding that data to create the buildings KML, and maybe other formats. Once the building height information is out of alpha, OS are saying that it will become part of MasterMap as an attribute. In the meantime we’ll supply that data as the CSV but also as KMLs.
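
For anyone curious what that extrusion looks like in practice, the sketch below writes a single extruded building to KML using the standard extrude element and a relativeToGround altitude mode. The footprint coordinates, height value and TOID here are hypothetical; the real workflow joins the BHA heights to MasterMap footprints by TOID.

```python
# A minimal sketch, assuming a building footprint (lon/lat ring) and a
# relative height (e.g. RelH2 from the BHA CSV) are already joined by TOID.
# Writes one extruded polygon to a KML file. All values are illustrative only.
footprint = [(-3.1885, 55.9533), (-3.1880, 55.9533),
             (-3.1880, 55.9536), (-3.1885, 55.9536), (-3.1885, 55.9533)]
rel_height = 12.5  # metres above ground, hypothetical RelH2 value

# Each coordinate carries the height as its altitude; extrude=1 drops walls
# down to the ground, giving a simple 3D block in Google Earth.
coords = " ".join(f"{lon},{lat},{rel_height}" for lon, lat in footprint)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>example building (hypothetical TOID)</name>
    <Polygon>
      <extrude>1</extrude>
      <altitudeMode>relativeToGround</altitudeMode>
      <outerBoundaryIs><LinearRing>
        <coordinates>{coords}</coordinates>
      </LinearRing></outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>"""

with open("building.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```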

Q: Anything else happening in OpenStream?

A – Guy: Only to update the data. But we also want to look at what levels of WMS we can provide. We are certainly thinking about how that sort of data and technology can be used more widely.
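
For context, a WMS feed such as OpenStream is consumed through standard GetMap requests; the sketch below shows the general shape of such a request in Python. The endpoint URL, layer name and bounding box are placeholders, not real OpenStream details.

```python
# A minimal sketch of a standard WMS 1.3.0 GetMap request using the
# requests library. The endpoint URL, layer name and bounding box are
# hypothetical placeholders for illustration only.
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "example_layer",              # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:27700",                    # British National Grid
    "BBOX": "325000,670000,330000,675000",  # minx,miny,maxx,maxy in metres
    "WIDTH": "800",
    "HEIGHT": "800",
    "FORMAT": "image/png",
}

resp = requests.get("https://example.org/wms", params=params, timeout=30)
resp.raise_for_status()
with open("map.png", "wb") as f:
    f.write(resp.content)
```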

Tom: Do give us feedback on anything you’d like to see added – an email at any time is fine.

Q: Are there plans for other Ordnance Survey data sets to be added?

A – Guy: We are still working with OS on the new licence. We are still waiting to see what OS can make available and what we can get it for. If there are particular datasets you would like to see added do let us know and we can look into it. But if something isn’t available yet, you can approach OS yourself – letting OS know you want some data will help encourage them to think about adding it to the data available in Digimap.

Q: I have been looking at OpenStreetMap – any plans to provide, say, Shapefile downloads from OpenStreetMap?

A – Guy: We are always looking at that type of thing. If there is demand that could be really interesting, particularly for non UK mapping.

Q: For some of the Marine stuff, coverage across the North Sea would be really helpful…

A – Guy: The new data we have is better in terms of extent, especially the charts.

Q: With the enhancements being made to Roam, when you can print to A0, what’s happening with Carto? It used to be that Carto and Classic were very different. You used to be able to customise the maps in Carto.

A – Guy: You can do that in Roam.

A – Tom: But you can’t merge products as easily… though many of the vector map products integrate more of the products you used to have to combine. And for things like contours we are adding those as an overlay to make them easier to print.

A – Guy: We are trying to ensure functionality of Carto is in Roam, but easier to use. And Roam has a lot of additional functionality, particularly around Legends, that neither product would have had before.

And now we will have a break. Look out for tweets from the FieldTrip GB Excursion (#geoforum2014) and join us on the blog for the second round of Q&A from the next Digimap Update session, later this afternoon.

Q&A – Second Session

Q: Have you considered European mapping?

A – Tom: It’s something that we are being asked for more and more, particularly as we have more non geographers using Digimap. Let us know if you have people come and ask about this. Other countries have good mapping agencies – Denmark for instance.

A – Guy: We are certainly getting queries for data in Germany, France, etc. And it might not be about us providing the data but being able to let the user know where the data is, how to access it, if it’s available for free.

Comment: There is a European dataset that could help.

A – Guy: This can be tricky as often it’s multiple datasets with different licences. But if you know of suitable data, do let us know. When we got marine data here we did find out about other marine resources and we’ve made links to those available – we are happy to consider adding data, but we can always link to data held elsewhere too. I didn’t mention ShareGeo, but that allows us to share ad hoc datasets… sometimes we create datasets and share them there, as we know they are tricky to obtain from other places.

Comment: I went to OS for data for my MSc – wanted data at about five levels and the 0.5 m level.

A – Guy: If you do find data sets we don’t have then do let us know – and the more people who go to the data supplier requesting a data set, the more easily we can show demand there from the sector.

Comment: I looked through ShareGeo for some data – and found some really useful stuff there. I’d recommend looking through that but also sharing your own data there.

A – Guy: We do create a lot of the datasets there, in response to queries, when we find useful data, etc. But we would love to get ShareGeo used more, to get it out into the mainstream. It would be great to get a WMS feed generated from the data there too – shapefiles are useful but…

Q: Is there any chance of renegotiating for the imagery?

A – Guy: No prospect at the moment. We may try again. It is OS imagery. But there are other sources that can be looked at. And the SPOT imagery that will become available later this year should be openly available.

Comment – Angharad Stone, CEH: We’d like to get the environmental Lidar and imagery data available – not full coverage but good quality. In the meantime you can use it free for academic use but you’d need to come to us (CEH) directly.

Closing Remarks – Tom Armitage

Firstly we would like to thank all of our speakers, all of the suppliers for providing great demos. Thank you for some fantastic questions from all of you and all of your continued support for Digimap. There will probably be a Geoforum next year, possibly further south, so keep an eye on the blog for more information as always. And all of today’s presentations will be available via the Digimap blog and possibly elsewhere on the EDINA website too.

And, with that, we close the event. Thanks for following along on the blog!

Jun 062014

Jisc Collections have announced the pricing structure for the Environment Digimap service for the 2014-15 academic year. Environment Digimap will be the cheapest of the Digimap collections, with prices ranging from £190 to £500 depending on the Jisc band of your higher education institution. There will be no charge at all for further education institutions. Details of the pricing can be found in the Jisc Collections catalogue entry:

Jisc Collections: Environment Digimap

Environment Digimap users have already logged in nearly 7000 times to produce over 90,000 screen maps and download over 3500 tiles of data to use in their research. The service currently supplies the three Land Cover datasets from 1990, 2000, and 2007 from the Centre for Ecology and Hydrology.

While the service provides access to a very important dataset for environmental research, we hope to continue to develop the service in line with the demands of its users. In response to feedback in the current Environment Digimap survey we will be introducing a map information tool. We have also already added Northern Irish data in response to requests from users, and hope to make the detailed 2007 Land Cover vector data available to those who subscribe to both the Environment and Ordnance Survey Digimap collections.

If you have any questions or feedback about the service then please let us know:

Email: edina@ed.ac.uk

Tel: 0131 650 3302

Survey: https://www.survey.ed.ac.uk/environment2014

For information on subscribing or renewing your access to Environment Digimap then please go to the relevant page in the Jisc Collections Catalogue:

Jisc Collections: Environment Digimap

