# Extract, transform and load

The scripts in this directory are used to extract, transform and load (ETL) the core datasets for Colouring London. This README acts as a guide for setting up the Colouring London database with these datasets and updating it.

# Contents

- :arrow_down: [Downloading Ordnance Survey data](#arrow_down-downloading-ordnance-survey-data)
- :penguin: [Making data available to Ubuntu](#penguin-making-data-available-to-ubuntu)
- :new_moon: [Creating a Colouring London database from scratch](#new_moon-creating-a-colouring-london-database-from-scratch)
- :full_moon: [Updating the Colouring London database with new OS data](#full_moon-updating-the-colouring-london-database-with-new-os-data)

# :arrow_down: Downloading Ordnance Survey data

The building geometries are sourced from Ordnance Survey (OS) MasterMap (Topography Layer). We also make use of OS Open TOID data, which provides access to a generalised location for those geometries.

## Downloading MasterMap data

1. Sign up for the Ordnance Survey [Data Exploration License](https://www.ordnancesurvey.co.uk/business-government/licensing-agreements/data-exploration-sign-up). You should receive an e-mail with a link to log in to the platform (this could take up to a week).
2. Navigate to https://orders.ordnancesurvey.co.uk/orders and click the button for: ✏️ Order. From here you should be able to click another button to add a product.
3. Drop a rectangle or polygon over London and make the following selections, clicking the "Add to basket" button for each:

![](screenshot/MasterMap.png)
<p></p>

4. You should then be able to check out your basket and download the files. Note: there may be multiple `.zip` files to download for MasterMap due to the size of the dataset.
5. Unzip the MasterMap `.zip` files and move all the `.gz` files from each to a single folder in a convenient location. We will use this folder in later steps (see the sketch below).
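
A minimal sketch of this consolidation step, assuming the zips were downloaded to `~/Downloads` and using a hypothetical target folder `~/os_data/mastermap`:

```bash
# Gather the MasterMap .gz files from each downloaded zip into a single folder
mkdir -p ~/os_data/mastermap
for z in ~/Downloads/mastermap*.zip; do
    unzip -o "$z" -d /tmp/mastermap_unzip
done
find /tmp/mastermap_unzip -name '*.gz' -exec mv {} ~/os_data/mastermap/ \;
```
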
## Downloading OS Open TOID data

1. Navigate to the download page at https://osdatahub.os.uk/downloads/open/OpenTOID
2. Select the area of the map you require location data for (e.g. the squares covering London) and download the data in CSV format:

![](screenshot/OpenTOID.png)

3. Unzip the `.zip` file(s) to get the CSV files and move them to a single folder in a convenient location. We will use this folder in later steps.

# :penguin: Making data available to Ubuntu

Before creating or updating a Colouring London database, you'll need to make sure the downloaded OS files are available to the Ubuntu machine where the database is hosted. If you are using VirtualBox, you could share the folder(s) containing the OS files with the VM (e.g. [see these instructions for Mac](https://medium.com/macoclock/share-folder-between-macos-and-ubuntu-4ce84fb5c1ad)).
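
For illustration only, assuming the VirtualBox Guest Additions are installed in the VM and that a shared folder named `os_data` (a hypothetical name) has been configured in the VirtualBox settings, the share could be mounted inside Ubuntu like this:

```bash
# Mount the hypothetical "os_data" shared folder at /mnt/os_data
sudo mkdir -p /mnt/os_data
sudo mount -t vboxsf -o uid=$(id -u),gid=$(id -g) os_data /mnt/os_data
```
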
# :new_moon: Creating a Colouring London database from scratch

## Prerequisites

You should already have set up PostgreSQL and created a database in an Ubuntu environment. Make sure to create environment variables to use `psql` if you haven't already:

```bash
export PGPASSWORD=<pgpassword>
export PGUSER=<username>
export PGHOST=localhost
export PGDATABASE=<colouringlondondb>
```
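
To confirm that these settings point at the intended database, a quick smoke test (not part of the ETL scripts) is to run:

```bash
# Should print the database name and PostgreSQL version if the connection details are correct
psql -c "SELECT current_database(), version();"
```
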
Create the core database tables:

```bash
cd ~/colouring-london
psql < migrations/001.core.up.sql
```

There is some performance benefit to creating indexes after bulk loading data. Otherwise, it's fine to run all the migrations at this point and skip the index creation steps below.
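
If you prefer to run everything now, the same loop used at the end of this section can be run here (a sketch, assuming the repository lives at `~/colouring-london`):

```bash
# Run every up migration in order, including the index-creation ones
ls ~/colouring-london/migrations/*.up.sql 2>/dev/null | while read -r migration; do psql < "$migration"; done
```
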
You should already have installed GNU parallel, which is used to speed up loading bulk data.
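
If it is not installed yet, it is available from the standard Ubuntu repositories (a sketch, assuming `apt` is used):

```bash
# Install GNU parallel on Ubuntu/Debian
sudo apt-get update
sudo apt-get install -y parallel
```
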
## Processing and loading Ordnance Survey data

Move into the `etl` directory and set execute permission on all scripts.

```bash
cd ~/colouring-london/etl
chmod +x *.sh
```

Extract the MasterMap data (this step could take a while).

```bash
sudo ./extract_mastermap.sh /path/to/mastermap_dir
```

Filter MasterMap 'building' polygons.

```bash
sudo ./filter_transform_mastermap_for_loading.sh /path/to/mastermap_dir
```

Load all building outlines. Note: you should ensure that `mastermap_dir` has permissions that will allow the Linux `find` command to work without using sudo.
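
For example, read and execute permissions for other users could be granted like this (a sketch; adapt the path and permissions to your own requirements):

```bash
# Allow non-root users (and therefore `find`) to read and traverse the directory
chmod -R o+rX /path/to/mastermap_dir
```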

```bash
./load_geometries.sh /path/to/mastermap_dir
```

Index geometries.

```bash
psql < ../migrations/002.index-geometries.up.sql
```

<!-- TODO: Drop outside limit. -->

<!-- ```bash
./drop_outside_limit.sh /path/to/boundary_file
``` -->

Create a building record per outline.

```bash
./create_building_records.sh
```

Run the remaining migrations in `../migrations` to create the rest of the database structure.

```bash
ls ~/colouring-london/migrations/*.up.sql 2>/dev/null | while read -r migration; do psql < "$migration"; done;
```

# :full_moon: Updating the Colouring London database with new OS data

In the Ubuntu environment where the database exists, set up environment variables to make the following steps simpler.

```bash
export PGPASSWORD=<pgpassword>
export PGUSER=<username>
export PGHOST=localhost
export PGDATABASE=<colouringlondondb>
```

Move into the `etl` directory and set execute permission on all scripts.

```bash
cd ~/colouring-london/etl
chmod +x *.sh
```

Extract the new MasterMap data (this step could take a while).

```bash
sudo ./extract_mastermap.sh /path/to/mastermap_dir
```

Filter MasterMap 'building' polygons.

```bash
sudo ./filter_transform_mastermap_for_loading.sh /path/to/mastermap_dir
```

Load all new building outlines. This step will only add geometries that are not already present (based on the `TOID`). Note: you should ensure that `mastermap_dir` has permissions that will allow the Linux `find` command to work without using sudo.

```bash
./load_geometries.sh /path/to/mastermap_dir
```

Create a virtual environment for Python in the `etl` folder of your repository. In the following example we have named the virtual environment *colouringlondon*, but it can have any name.

```bash
python3 -m venv colouringlondon
```

Activate the virtual environment so we can install Python packages into it.

```bash
source colouringlondon/bin/activate
```

Upgrade the pip package manager and install related packaging tools.

```bash
pip install --upgrade pip
pip install --upgrade setuptools wheel
```

Install the required Python packages.

```bash
pip install -r requirements.txt
```

Convert the OS Open TOID data from OSGB36 eastings and northings to WGS84 longitude and latitude coordinates.

<details>
<summary>
WARNING
</summary>
EC note: When testing this in my development setup, due to memory constraints I actually ended up running this Python step on my Mac directly, rather than in my Ubuntu VirtualBox environment.
</details><p></p>

```bash
python convert_opentoid_bng_latlon.py /path/to/opentoids_dir
```

Assign latitude and longitude to buildings with the converted OS Open TOID data.

```bash
./load_coordinates.sh /path/to/opentoids_dir
```

Create a building record for each new geometry (TOID) that doesn't yet have a linked building. Where a new geometry sits at the same location as a pre-existing one (based on the latitude/longitude coordinates), no new building record is created; instead the old geometry is marked as valid _up to_ the current date, the new geometry is added as valid _from_ today, and both remain linked to the existing building record.

```bash
./update_building_records.sh
```

Mark TOIDs not present in the update as demolished.

<!-- TODO: need the open_toids table for this -->

```bash
./mark_demolitions.sh
```