Data loading

The scripts in this directory are used to extract, transform and load (ETL) the core datasets for Colouring London:

  1. Building geometries, sourced from Ordnance Survey MasterMap (Topography Layer)
  2. Unique Property Reference Numbers (UPRNs), sourced from Ordnance Survey AddressBase

Prerequisites

Install PostgreSQL and create a colouringlondon database, with a database user that can connect to it. The PostgreSQL documentation covers installation and getting started.
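
For example, assuming an installation where the postgres system user administers the cluster (the colouring user name is only an example, any name will do):

sudo -u postgres createuser --pwprompt colouring
sudo -u postgres createdb --owner colouring colouringlondon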

Install the PostGIS extension.
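
On Debian or Ubuntu, for example, this may be as simple as the following (package names vary by platform and PostgreSQL version):

sudo apt-get install postgis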

Connect to the colouringlondon database and add the PostGIS, pgcrypto and pg_trgm extensions:

create extension postgis;
create extension pgcrypto;
create extension pg_trgm;

Create the core database tables:

psql < ../migrations/001.core.up.sql
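
psql reads the standard libpq environment variables, so one way to avoid repeating connection flags in the commands below is to export the settings once (the values here are examples; check how each script connects before relying on this):

export PGHOST=localhost
export PGDATABASE=colouringlondon
export PGUSER=colouring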

There is some performance benefit to creating indexes after bulk loading data. Otherwise, it's fine to run all the migrations at this point and skip the index creation steps below.

Install GNU parallel, which is used to speed up bulk data loading.
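
For example, on Debian or Ubuntu:

sudo apt-get install parallel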

Process and load Ordnance Survey data

Before running any of these scripts, you will need the OS data for your area of interest. AddressBase and MasterMap are available directly from Ordnance Survey. The alternative setup below uses OpenStreetMap.

The scripts should be run in the following order:

# extract both datasets
./extract_addressbase.sh ./addressbase_dir
./extract_mastermap.sh ./mastermap_dir
# filter mastermap ('building' polygons and any others referenced by addressbase)
./filter_transform_mastermap_for_loading.sh ./addressbase_dir ./mastermap_dir
# load all building outlines
./load_geometries.sh ./mastermap_dir
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# add UPRNs where they match
./load_uprns.sh ./addressbase_dir
# index building records
psql < ../migrations/003.index-buildings.up.sql
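
These steps are also collected in run_all.sh in this directory; review and adapt it before running it against your own data.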

Alternative, using OpenStreetMap

This uses the osmnx Python package to get OpenStreetMap data. You will need Python and osmnx installed to run get_test_polygons.py.
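
A minimal sketch, assuming a Python 3 environment and that the requirements.txt in this directory pins the tested osmnx version:

pip install -r requirements.txt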

To help test the Colouring London application, get_test_polygons.py will attempt to save a small (1.5km²) extract from OpenStreetMap in a format suitable for loading into the database.

In this case, run:

# download test data
python get_test_polygons.py
# load all building outlines
./load_geometries.sh ./
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# index building records
psql < ../migrations/003.index-buildings.up.sql

Finally

Run the remaining migrations in ../migrations to create the rest of the database structure.
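
The migration files are numbered and shell globs expand in sorted order, so one sketch of running them (note this glob also matches the migrations already applied above, which can be skipped or left to fail harmlessly) is:

for migration in ../migrations/*.up.sql; do
    psql < "$migration"
done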