# Creating a Colouring London database from scratch
## Data downloading

The scripts in this directory are used to extract, transform and load (ETL) the core datasets
for Colouring London:
1. Building geometries, sourced from Ordnance Survey (OS) MasterMap (Topography Layer)
1. Unique Property Reference Numbers (UPRNs), sourced from Ordnance Survey AddressBase

To get the required datasets, you'll need to complete the following steps:
1. Sign up for the Ordnance Survey [Data Exploration License](https://www.ordnancesurvey.co.uk/business-government/licensing-agreements/data-exploration-sign-up). You should receive an e-mail with a link to log in to the platform (this could take up to a week).
2. Navigate to https://orders.ordnancesurvey.co.uk/orders and click the ✏️ Order button. From here you should be able to click another button to add a product.
3. Drop a rectangle or polygon over London and make the following selections, clicking the "Add to basket" button for each:
![](screenshot/MasterMap.png)
<p></p>
![](screenshot/AddressBase.png)
4. You should then be able to check out your basket and download the files.
## Prerequisites

You should already have set up PostgreSQL and created a database. Make sure to create environment variables to use `psql` if you haven't already:

```bash
export PGPASSWORD=<pgpassword>
export PGUSER=<username>
export PGHOST=localhost
export PGDATABASE=<colouringlondondb>
```
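As a quick sanity check before running the scripts below, you can verify that all four variables are set. This is a minimal sketch; the function name `check_pg_env` is illustrative, not part of the repository:

```bash
# Fail fast if any of the PostgreSQL connection variables is missing.
check_pg_env() {
  local var
  for var in PGPASSWORD PGUSER PGHOST PGDATABASE; do
    if [ -z "${!var:-}" ]; then
      echo "missing: $var" >&2
      return 1
    fi
  done
  echo "PostgreSQL environment OK"
}
```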
Create the core database tables:
```bash
cd ~/colouring-london
psql < migrations/001.core.up.sql
```
There is some performance benefit to creating indexes after bulk loading data.
Otherwise, it's fine to run all the migrations at this point and skip the index
creation steps below.
Install GNU Parallel, which is used to speed up bulk data loading.
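On Ubuntu, for example, GNU Parallel is available from the standard repositories:

```bash
sudo apt-get update
sudo apt-get install -y parallel
```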
## Make data available to Ubuntu
If you didn't download the OS files to the Ubuntu machine where you are setting up your Colouring London application, you will need to make them available there. If you are using VirtualBox, you can share a host folder containing the files with the VM (e.g. [see these instructions for Mac](https://medium.com/macoclock/share-folder-between-macos-and-ubuntu-4ce84fb5c1ad)).
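For example, assuming the Guest Additions are installed and you created a shared folder named `osdata` (the folder name and mount point here are illustrative), it can be mounted inside the VM with:

```bash
# Mount the VirtualBox shared folder "osdata" at /mnt/osdata,
# owned by the current user so the ETL scripts can read the files.
sudo mkdir -p /mnt/osdata
sudo mount -t vboxsf -o uid=$(id -u),gid=$(id -g) osdata /mnt/osdata
```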
## Process and load Ordnance Survey data

Before running any of these scripts, you will need the OS data for your area of
interest. AddressBase and MasterMap are available directly from [Ordnance
Survey](https://www.ordnancesurvey.co.uk/). The alternative setup below uses
OpenStreetMap.
The scripts should be run in the following order:
```bash
# extract both datasets
./extract_addressbase.sh ./addressbase_dir
./extract_mastermap.sh ./mastermap_dir
# filter MasterMap ('building' polygons and any others referenced by AddressBase)
./filter_transform_mastermap_for_loading.sh ./addressbase_dir ./mastermap_dir
# load all building outlines
./load_geometries.sh ./mastermap_dir
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# add UPRNs where they match
python load_uprns.py ./addressbase_dir
# index building records
psql < ../migrations/003.index-buildings.up.sql
```
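Once loading has finished, a quick row count can confirm that data reached the database. This is a sketch: the table names `geometries` and `buildings` are assumed to match those created by the core migration above.

```bash
# Sanity check: both counts should be non-zero after a successful load.
psql -c "SELECT count(*) FROM geometries;"
psql -c "SELECT count(*) FROM buildings;"
```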
## Alternative, using OpenStreetMap
This uses the [osmnx](https://github.com/gboeing/osmnx) Python package to get OpenStreetMap data. You will need Python and osmnx installed to run `get_test_polygons.py`.
To help test the Colouring London application, `get_test_polygons.py` will attempt to save a
small (1.5km²) extract from OpenStreetMap to a format suitable for loading to the database.
In this case, run:
```bash
# download test data
python get_test_polygons.py
# load all building outlines
./load_geometries.sh ./
# index geometries (should be faster after loading)
psql < ../migrations/002.index-geometries.up.sql
# create a building record per outline
./create_building_records.sh
# index building records
psql < ../migrations/003.index-buildings.up.sql
```
## Finally
Run the remaining migrations in `../migrations` to create the rest of the database structure.
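A sketch of applying the remaining forward migrations in order, assuming they follow the `NNN.name.up.sql` naming used above (the numeric prefixes sort lexically, and migrations already applied above may need to be skipped):

```bash
cd ~/colouring-london
for migration in migrations/*.up.sql; do
  echo "applying $migration"
  psql < "$migration"
done
```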
# Updating the Colouring London database with new OS data