# Creating a Colouring London database from scratch

## Data loading
The scripts in this directory are used to extract, transform and load (ETL) the core datasets for Colouring London:
- Building geometries, sourced from Ordnance Survey MasterMap (Topography Layer)
- Unique Property Reference Numbers (UPRNs), sourced from Ordnance Survey AddressBase
## Prerequisites
You should already have set up PostgreSQL and created a database. If you haven't already, set the environment variables that psql uses to connect:

```bash
export PGPASSWORD=<pgpassword>
export PGUSER=<username>
export PGHOST=localhost
export PGDATABASE=<colouringlondondb>
```
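Any Python tooling that connects to the same database can reuse these variables. As a minimal sketch (the helper name and default values are illustrative, not from this repository):

```python
import os

def libpq_conninfo() -> str:
    """Assemble a libpq-style connection string from the same PG*
    environment variables that psql reads (defaults are examples only)."""
    host = os.environ.get("PGHOST", "localhost")
    user = os.environ.get("PGUSER", "postgres")
    dbname = os.environ.get("PGDATABASE", "colouringlondondb")
    return f"host={host} user={user} dbname={dbname}"
```

Note that `PGPASSWORD` is deliberately left out of the string; libpq-based clients pick it up from the environment.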
Create the core database tables:

```bash
psql < ../migrations/001.core.up.sql
```
There is some performance benefit to creating indexes after bulk loading data. Otherwise, it's fine to run all the migrations at this point and skip the index creation steps below.
Install GNU parallel, which is used to speed up bulk data loading.
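GNU parallel fans per-file work out across CPU cores. For illustration only, the same fan-out pattern in Python (the file names and load step are placeholders, not what the scripts actually run):

```python
from concurrent.futures import ThreadPoolExecutor

def load_file(path: str) -> str:
    # Placeholder for a per-file load step (the real scripts invoke psql).
    return f"loaded {path}"

files = ["tile_a.gml", "tile_b.gml", "tile_c.gml"]

# Run the load step concurrently; map() preserves input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(load_file, files))
```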
## Process and load Ordnance Survey data
Before running any of these scripts, you will need the OS data for your area of interest. AddressBase and MasterMap are available directly from Ordnance Survey. The alternative setup below uses OpenStreetMap.
The scripts should be run in the following order:

```bash
# extract both datasets
./extract_addressbase.sh ./addressbase_dir
./extract_mastermap.sh ./mastermap_dir
# filter mastermap ('building' polygons and any others referenced by addressbase)
./filter_transform_mastermap_for_loading.sh ./addressbase_dir ./mastermap_dir
# load all building outlines
./load_geometries.sh ./mastermap_dir
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# add UPRNs where they match
./load_uprns.sh ./addressbase_dir
# index building records
psql < ../migrations/003.index-buildings.up.sql
```
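The UPRN step links address points to building outlines by spatial containment; in the database this is done with spatial queries, but the underlying idea can be sketched with a pure-Python point-in-polygon test (ray casting; the outline and point coordinates below are made up):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: cast a ray from (x, y) to the right and count
    how many polygon edges it crosses; an odd count means 'inside'."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal line at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A square building outline and two candidate address points (made-up values)
outline = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
assert point_in_polygon(5.0, 5.0, outline)       # inside: UPRN matches
assert not point_in_polygon(15.0, 5.0, outline)  # outside: no match
```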
## Alternative, using OpenStreetMap
This uses the osmnx Python package to get OpenStreetMap data. You will need Python and osmnx installed to run `get_test_polygons.py`.

To help test the Colouring London application, `get_test_polygons.py` will attempt to save a small (1.5km²) extract from OpenStreetMap in a format suitable for loading into the database.
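For a sense of scale, a 1.5km² extract is roughly a 1.22km × 1.22km square. A rough sketch of turning that area into a latitude/longitude bounding box around a centre point (the helper and centre coordinates are illustrative; `get_test_polygons.py` handles this itself):

```python
import math

def bbox_around(lat, lon, area_km2):
    """Approximate (south, west, north, east) bounds for a square of the
    given area centred at (lat, lon). Assumes ~111.32 km per degree of
    latitude; longitude degrees shrink with cos(latitude)."""
    half_side_km = math.sqrt(area_km2) / 2
    dlat = half_side_km / 111.32
    dlon = half_side_km / (111.32 * math.cos(math.radians(lat)))
    return (lat - dlat, lon - dlon, lat + dlat, lon + dlon)

# Illustrative centre point in central London
south, west, north, east = bbox_around(51.5074, -0.1278, 1.5)
```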
In this case, run:

```bash
# download test data
python get_test_polygons.py
# load all building outlines
./load_geometries.sh ./
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# index building records
psql < ../migrations/003.index-buildings.up.sql
```
## Finally

Run the remaining migrations in `../migrations` to create the rest of the database structure.