Framework project: Digital Elevation Models (DEMs)
Process chain for Copernicus DEM
Introduction
Processing a global DEM all the way from downloading tiles to creating coherent landscape indices and metrics is a typical task for which Karttur’s GeoImagine Framework was built. Once the DEM is imported and organized within the Framework, it is easy to test different algorithms and visualizations. If you have access to ground data you can also apply the Framework for comparing the results of different algorithms and parameters against that data, and thus select the most appropriate landscape model.
Prerequisites
You must have installed Karttur’s GeoImagine Framework. You also need a registered project that covers the geographic regions of the DEM you want to process. The latter can, however, be done as part of the project itself.
Overview
This post summarises the process chain for the Copernicus DEM in Karttur’s GeoImagine Framework. Each step in the process chain is explained in more detail in linked posts. The Copernicus DEM at 90 m spatial resolution is used as an example.
A note on hydrological corrections
Before running the DEM processing you need to decide how you want to manage hydrological errors in your DEM. If there are no errors you do not need to worry, but usually there are at least both pits and flat areas. Both of these problems primarily affect the extraction of river basins. Different algorithms handle these problems differently. The simplest solution is to fill up all pits completely, with or without tilting or “burning in” the flow across flat surfaces. This, however, replaces the normally a-priori known relative DEM accuracy with an unknown error that might, or might not, have spatial or contextual biases. If the DEM is going to be used solely for flow routing, this is perhaps the best method.
If independent vector datasets on stream networks and channels are available, you can also burn a drainage pattern into the DEM. The geometric accuracy must match that of the DEM, and the vector network contiguity must be correct. For wide rivers and lakes, burning a channel also forces a preferential flow line, with subsequent errors when calculating e.g. flooding and evapotranspiration.
Another alternative is to fill up only minor errors and errors that can be logically identified. The latter category includes depressed channels and negative elevations along shorelines - and this is how it is done in this manual. But then you must also apply a flow routing algorithm that can pass depressions on the fly. The GRASS GIS module r.watershed does just that, and r.watershed is used for the DEM modelling in this post. To fill up all depressions, you can instead use the GRASS GIS module r.terraflow, or any of the DEM filling algorithms of SAGA GIS. If you apply hydrological DEM corrections, you must also decide which DEM version (corrected or non-corrected) to use for extracting other indices, like slope and curvature.
Process chain
The process chain is available at Karttur’s GitHub account and is also found under the Hide/show button below. The process chain is an ordinary text file linking to a series of json files. Empty rows and rows starting with a “#” are ignored. When called from Karttur’s GeoImagine Framework, the processes of each json file are run sequentially. In the process chain below you can click on each json file name to see its complete content. The following sections also explain the processes of each json command file in more detail.
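The chain format is simple enough to sketch in a few lines of code (a minimal illustration, not the Framework’s own parser; the second file name in the example is hypothetical):

```python
def read_process_chain(text):
    """Return the json command files listed in a process chain,
    skipping empty rows and rows starting with '#'."""
    json_files = []
    for row in text.splitlines():
        row = row.strip()
        if not row or row.startswith('#'):
            continue  # blank rows and comment rows are ignored
        json_files.append(row)
    return json_files

chain = """
# DEM derivatives (the second file name below is hypothetical)
0303_NumpyDemTiles_CopDEM-90m.json
0304_GrassOnetoManyTiles_CopDEM-90m.json
"""
print(read_process_chain(chain))
```

Each returned file name is then opened in turn and its processes executed in order.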
To produce symbolised and labelled maps using the Framework, you have to define some layout features and parameters. All map layout is done using unsigned byte range maps with values ranging from 0 to 255. Values 251 to 255 are reserved for specific purposes and are not allowed for representing map values. The Framework layout processes include:
The CreateScaling process converts any input raster into an unsigned byte (0-255) range file. The byte file is then used for assigning a color ramp (defined in the process AddRasterPalette, see next section).
Scaling is thus not defined on the fly; instead each map composition must be associated with a predefined scaling. Once defined, this scaling cannot be changed. You can, however, change the color ramping by setting another raster palette.
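As an illustration of how a predefined scaling maps data to the byte range, here is a minimal sketch (this is not the CreateScaling implementation; the linear min-max stretch and the 0-250 data range are assumptions):

```python
def scale_to_byte(values, vmin, vmax):
    """Linearly rescale values to the 0-250 range; 251-255 are
    reserved in the Framework layout and never used for data."""
    scaled = []
    for v in values:
        v = min(max(v, vmin), vmax)  # clip to the scaling range
        b = round((v - vmin) / (vmax - vmin) * 250)
        scaled.append(int(b))
    return scaled

# Elevations scaled for a 0-2500 m range:
print(scale_to_byte([-100, 0, 1000, 3000], vmin=0, vmax=2500))
# → [0, 0, 100, 250]
```

Because the scaling is fixed per composition, the same byte file can later be symbolised with any registered palette.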
In contrast to scaling, a palette can be freely set for any map composition (but always using the same scaling). The palette to use for any map layout is defined when calling the process that generates the layout. Thus the palette must exist beforehand, created by using the process AddRasterPalette.
With the scaling set and a palette defined, you can define a legend - that is how the combined scaling and palette will be presented in graphical layouts.
Once created, you can export a legend. All legends are exported as png, csv and pdf. Exporting a legend requires that you have installed the application Inkscape, as outlined in the post Python virtual environment. The exported legends are stored under a directory legends on the destination disk/volume. The image representing the raster palette is under a subfolder images (saved as a png image), and the legends with text added are under a subfolder layouts, saved as three different versions:
Data specific processing requires that your user has a predefined project that includes a predefined tract, where the tract must be linked to a default region. All spatial processing that you run from that user and with that project will then operate on the system tiles linked to the parent default region of your project tract. The whole processing is set up in this manner to minimise double processing.
Unless you have done so before, you must set up the project and tract to use for processing the Copernicus DEM data. The example is for processing the DEM for the entire basins draining into Arctic waters. Thus the example assigns the default region northlandease2n as the parent region. Both the user specific project and its region are called karttur-northlandease2n.
Search, download and unzip
Search and download differ for almost all datasets. Details for searching and downloading Copernicus DEM (and other DEMs) are in the post Digital Elevation Models (DEMs). Unzipping is only required if the retrieved data is in a compressed zip format.
The search for the Copernicus DEM 90 m product is done using the special process SearchCopernicusProducts. The search requires that you have installed wget; see the post on Python virtual environment, which also covers installing wget.
Downloading the Copernicus DEM 90 m version is also defined with a customized process, DownloadCopernicus. This process reads the html file above (copernicusDEM90.html), extracts the url tag (<a>), looks for a local copy of the DEM tile, and if none is found (or overwrite is set to true) downloads the tile. The download goes to a predefined path on the destination disk and cannot be changed. If you want to store the data somewhere else you have to move it manually, but the system will then lose track of it. The original tiles are not registered in the database. It is thus recommended that you leave the original data in the predefined location (for the data used here the default path is /DISKPATH/DOWNLOADS/CopernicusDem90).
Downloaded data that is retrieved as zip files can be unzipped with the command UnZipRawData. This is a generic process, and you need to give the full path to both the source (zip) and target (expanded) datasets. The disk (volume) paths are as usual given through the parameters srcpath and dstpath, but additionally you must also give the directories on the disks for both srcdir and dstdir under parameters. If srcdir is a file, the script assumes that it is a list of the files to unzip; if srcdir is a directory, the script will explode all zip files in that directory.
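The unzip logic can be sketched as follows (a simplified stand-in for UnZipRawData, not the Framework code itself; it only illustrates the file-versus-directory behaviour of srcdir):

```python
import os
import zipfile

def unzip_raw_data(srcpath, dstpath):
    """Expand zip archives; if srcpath is a file it is read as a list
    of zip files, if it is a directory all zips in it are expanded."""
    if os.path.isfile(srcpath):
        with open(srcpath) as f:
            zips = [row.strip() for row in f if row.strip()]
    else:
        zips = [os.path.join(srcpath, name) for name in os.listdir(srcpath)
                if name.endswith('.zip')]
    os.makedirs(dstpath, exist_ok=True)
    for z in zips:
        with zipfile.ZipFile(z) as zf:
            zf.extractall(dstpath)
    return zips
```

The real process additionally resolves srcdir and dstdir relative to the srcpath and dstpath volumes.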
For the example here I created a file listing all the zip files to explode (CopernicusDem90_rawtiles.csv). In MacOS or Linux you create the csv list by piping the ls command to a file, e.g. `ls *.zip > CopernicusDem90_rawtiles.csv`:
The retrieved data is not yet organized as a dataset in the Framework database. A tiled dataset, like CopDEM, must be prepared as a virtual full coverage dataset prior to importing (organization) and then retiling into one, or more, projection system(s) that are used for the actual spatial data processing in the Framework. For the CopDEM dataset, the sequence of processes thus becomes:
identification of raw tile bounding boxes (optional),
The process BBoxTiledRawData creates bounding boxes for each raw tile. In the example used here, each tile to include is listed in a csv file, created in a similar way as the list of zip files in the previous section. Running this process is optional, and intended for allowing a visual inspection of the raw data source.
The process generates a single vector data source with all the included tiles.
As clearly seen in the figure above, the raw tiles are of arbitrary sizes with geographical coverage restricted to land. The next steps in the process chain convert these more arbitrary tiles to regular grids fitted to the projection system where you want to use the DEM.
Downloaded data that are delivered as arbitrary tiles need to be mosaicked together before importing into the Framework. (SMAP, MODIS and Sentinel data do not need mosaicking of raw tiles as these datasets are either downloaded in predefined tiling systems or as complete global datasets.) The mosaicking must be done as a virtual (.vrt) dataset. To actually create a true global mosaic at e.g. 90 m, as in this example, would take a long time and also occupy unnecessarily much disk space. The raw tile mosaic will represent the raw source data imported (organized, in the terminology of the Framework) into the Framework database. The process MosaicAncillary does not register the imported dataset into the database; that is done in the next step.
With the Copernicus data downloaded, unzipped and mosaicked, the data can be organized for (or imported to) the Framework. For the Copernicus DEM data, the process OrganizeAncillary as defined in the json object below only imports the virtual mosaic linked to the original (downloaded and unzipped) tiles.
Once you have run the json command, you can check the Framework database, and you should find that the ancillary schema contains a layer with the compid dem_copdem. Apart from being defined as an individual layer, the dataset is also registered as a product (table: compprod) and a composition (table: compdef):
SELECT * FROM ancillary.layer WHERE compid = 'dem_copdem';
SELECT * FROM ancillary.compprod WHERE compid = 'dem_copdem';
SELECT * FROM ancillary.compdef WHERE compid = 'dem_copdem';
All basic processing in the Framework is done using predefined tiling systems. The process TileAncillaryRegion constructs tiles while also reprojecting the source (ancillary) data to the projection system defined under userproject: system. For the CopDEM data in this example, the data are reprojected and tiled for the ease2n (EASE GRID 2.0 North) projection system.
Expanded virtual mosaic tiles
Spatial processing applying seamless tiles causes edge effects for kernel-based and regional processes. To overcome that, the Framework can use virtually expanded tiles as input in some processes. The process for creating virtually expanded tiles is MosaicAdjacentTiles.
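The principle of an expanded tile is plain window arithmetic, sketched below (the tile size and overlap figures are arbitrary assumptions, not the Framework’s actual settings):

```python
def expanded_window(col0, row0, cols, rows, overlap):
    """Return the pixel window of a tile expanded by `overlap` cells
    on every side, so kernel processes have data beyond the tile edge."""
    return (col0 - overlap, row0 - overlap,
            cols + 2 * overlap, rows + 2 * overlap)

# A 1000 x 1000 cell tile expanded with a 10 cell overlap:
print(expanded_window(5000, 3000, 1000, 1000, overlap=10))
# → (4990, 2990, 1020, 1020)
```

After processing, only the core (non-expanded) window is written back, so adjacent tiles remain seamless.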
The first hydrological correction is the filling up of all single cell pits and the flattening of all single cell peaks adjacent to streams. The Framework process to apply is GrassDemFillDirTiles, which is a wrapper for the GRASS GIS module r.fill.dir.
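What filling single cell pits amounts to can be illustrated with a minimal sketch (this is not the r.fill.dir algorithm, which also resolves flow directions; it only shows the pit criterion):

```python
def fill_single_cell_pits(dem):
    """Raise every interior cell that is lower than all eight of its
    neighbours to the minimum neighbour elevation (single cell pits)."""
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            neighbours = [dem[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr, dc) != (0, 0)]
            if dem[r][c] < min(neighbours):
                filled[r][c] = min(neighbours)
    return filled

# The centre cell (1) is a single cell pit and is raised to 4:
dem = [[5, 5, 5],
       [5, 1, 5],
       [5, 5, 4]]
print(fill_single_cell_pits(dem)[1][1])
# → 4
```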
Adjacent tile mosaic of the single cell corrected DEM
Create expanded virtual tiles of the single cell corrected DEM tiles for use in the next step. See the post on Mosaics for avoiding edge effects for details.
The second hydrological correction is the filling up of larger pits in streams. The Framework process is GrassDemHydroDemTiles, which is a wrapper for the GRASS GIS addon module r.hydrodem.
Create expanded virtual tiles of the hydrologically corrected DEM tiles for use in the next step. See the post on Mosaics for avoiding edge effects for details.
The river mouth and shoreline correction processing fills up all negative pits adjacent to the sea. No special process is used, instead the generic Framework wrapper for GRASS GIS is used by defining a sequence of GRASS commands under the process GrassOnetoManyTiles.
GRASS script
The above process generates a script file that loops over all the tiles belonging to the defined region. An example is shown under the Hide/show button below, the commented lines (starting with “#”) have been added for explanation.
# Change mapset to the present tile
g.mapset -c mapset=x04y07
# Import the original tile (always done when using virtual mosaics)
r.in.gdal input=/Volumes/Ancillary/ease2n/ESA/tiles/dem/x04y07/0/dem_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif output=originalTile --overwrite
# Import the virtual mosaic DEM
r.in.gdal -e input=/Volumes/Ancillary/ease2n/ESA/tiles/dem/x04y07/0/dem_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m-full.vrt output=DEM --overwrite
# Set region to the mosaic (expanded) DEM
g.region raster=DEM --overwrite
# Create Landdem by setting all cells==0 to null()
r.mapcalc "landDEM = if((DEM==0),null(),DEM)" --overwrite
# Run statistics on landdem and capture the return with the variable "data"
data="$(r.stats -p input=landDEM null_value='null')"
# If the statistics include null, the (expanded) tile includes a coastline
if echo "$data" | grep -q "null"; then
echo "Coastline tile"
# extract all cells at sea level
r.mapcalc "drain_terminal = if((DEM==0),1,null())" --overwrite
# Clump all cells at sea level
r.clump input=drain_terminal output=terminal_clumps --overwrite
# Convert raster clumps to vectors
r.to.vect input=terminal_clumps output=terminal_clumps type=area --overwrite
# Add db record for clump vector area
v.to.db map=terminal_clumps type=centroid option=area columns=area_km2 units=kilometers --overwrite
# Export clumps to ESRI shapefile (for inspection if needed)
v.out.ogr input=terminal_clumps type=area format=ESRI_Shapefile output=/Volumes/Ancillary/ease2n/ESA/tiles/terminal-polys/x04y07/0/terminal-polys_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.shp --overwrite
# Rasterize clumps of cells at sea level if smaller than 5 km2; these areas are not regarded as the sea
v.to.rast input=terminal_clumps type=area where="area_km2< 5.0" output=drain_terminal_small use=val value=0 --overwrite
# Rasterize clumps of cells at sea level if larger than 5 km2; these areas represent the sea
v.to.rast input=terminal_clumps type=area where="area_km2>= 5.0" output=drain_terminal_large use=val value=0 --overwrite
# Construct a layer of all land
r.mapcalc "land = if(( isnull(drain_terminal_large)), 1 ,null())" --overwrite
# Buffer a distance of 1 diagonal cell from the sea surface cells
r.buffer input=drain_terminal_large output=shorep1 distance=128.56 units=meter --overwrite
# Get the initial shoreline
r.mapcalc "shoreline_ini = if((shorep1>0 && land==1),1,null())" --overwrite
# retrieve all areas below sea level
r.mapcalc "negative_DEM = if((DEM<0),0,null())" --overwrite
# Get shorelines under sea level
r.mapcalc "negativeshoreline_DEM = if((DEM<0 && shoreline_ini >0),1,null())" --overwrite
# From the shorelines under sea level find all sea connected cells under sea level
r.cost -n input=negative_DEM output=sea_pits start_raster=negativeshoreline_DEM max_cost=0 --overwrite
# Reclass pits in the sea surface to null (the sea itself)
r.mapcalc "inland_comp_DEM = if(( isnull(sea_pits) ), if((isnull(land)), null(),DEM) , null())" --overwrite
# Get the final ocean mask
r.mapcalc "terminal_drain = if((isnull (inland_comp_DEM)),1,null())" --overwrite
# Buffer a distance of 1 diagonal cell from the final sea surface mask
r.buffer input=terminal_drain output=shorep1 distance=128.56 units=meter --overwrite
# find the shoreline (again) from the final ocean mask
r.mapcalc "shoreline = if((shorep1>0 && inland_comp_DEM>0),1,null())" --overwrite
else
# No coastline, just get the inland_comp_DEM as the landDEM
r.mapcalc "inland_comp_DEM = landDEM" --overwrite
fi
# Set region to core tile prior to exporting raster data (automatic)
g.region raster=originalTile
# Export final inland composition DEM
r.out.gdal input=inland_comp_DEM output=/Volumes/Ancillary/ease2n/ESA/tiles/landdem/x04y07/0/land-dem_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
Kernel DEM derivatives
With Karttur’s GeoImagine Framework, kernel (filter) indexes derived from DEM data can be calculated using GDAL, GRASS GIS, SAGA GIS, or internal processes.
The Framework process GdalDemTiles is a wrapper for running the Geographic Data Abstraction Library (GDAL) DEM (gdaldem) program.
Framework DEM derivatives (NumpyDemTiles)
0303_NumpyDemTiles_CopDEM-90m.json
The Framework process for internal DEM derivatives focuses on calculating Terrain Position Index [TPI] landforms and can thus only retrieve the derivatives for slope and TPI, plus the landform categorisation. The TPI extraction, and thus the landform categorisation, is, however, more advanced (theoretically more correct) compared to most other solutions.
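The principle of TPI and the landform categorisation can be sketched as follows (a generic single-scale illustration, not the Framework’s more advanced implementation; the threshold and class names are assumptions):

```python
def tpi(dem, r, c, radius=1):
    """Terrain Position Index: cell elevation minus the mean of the
    surrounding cells within a square neighbourhood of the given radius."""
    neighbours = [dem[i][j]
                  for i in range(r - radius, r + radius + 1)
                  for j in range(c - radius, c + radius + 1)
                  if (i, j) != (r, c)]
    return dem[r][c] - sum(neighbours) / len(neighbours)

def landform(tpi_value, threshold=1.0):
    """Crude three-class landform categorisation from TPI alone."""
    if tpi_value > threshold:
        return 'ridge'
    if tpi_value < -threshold:
        return 'valley'
    return 'flat'

# A cell standing 6 m above its neighbourhood classes as a ridge:
dem = [[10, 10, 10],
       [10, 16, 10],
       [10, 10, 10]]
print(landform(tpi(dem, 1, 1)))
# → ridge
```

Multi-scale TPI (comparing neighbourhoods of different radii) is what separates the more advanced landform schemes from this single-scale sketch.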
GRASS GIS has more options for retrieving DEM derivatives than either GDAL or the internal Framework process. GRASS modules for retrieving DEM derivatives include: r.slope.aspect, r.param.scale, r.tpi, r.tri and r.geomorphon.
Retrieving DEM derivatives using GRASS GIS through the Framework is done through the standard GRASS wrapper GrassOnetoManyTiles.
The above process generates a script file that loops over all the tiles belonging to the defined region. An example is shown under the Hide/show button below, the commented lines (starting with “#”) have been added for explanation.
# Change mapset to the present tile
g.mapset -c mapset=x04y07
# Import the original tile (always done when using virtual mosaics)
r.in.gdal input=/Volumes/Ancillary/ease2n/ESA/tiles/dem/x04y07/0/dem_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif output=originalTile --overwrite
# Set region to the mosaic (expanded) DEM
g.region raster=DEM --overwrite
# Creating an "empty" raster for hydrological blocking (no blocking)
r.mapcalc "blocking = 0" --overwrite
# Run r.watershed using Multiple Flow Direction and outputs for all hillslope data required
r.watershed -a elevation=inland_comp_DEM max_slope_length=750 blocking=blocking accumulation=MFDupstream drainage=MFDflowdir stream=MFDstream tci=MFDtci spi=MFDspi length_slope=MFDslopelength slope_steepness=MFDslopesteepness threshold=617 memory=4000 --overwrite
# Run r.watershed using Single Flow Direction
r.watershed -as elevation=inland_comp_DEM accumulation=SFDupstream threshold=617 memory=4000 --overwrite
# Extract streams as both raster and vectors from the MFD watershed analysis
r.stream.extract elevation=inland_comp_DEM accumulation=MFDupstream threshold=247 mexp=1.2 stream_length=4 stream_rast=extractstream stream_vector=extractstreamvect direction=extractflowdir memory=4000 --overwrite
# Analyse the distance and elevation difference for all cells to the nearest stream
r.stream.distance stream_rast=extractstream direction=extractflowdir elevation=inland_comp_DEM distance=streamproximity difference=hydhead method=downstream memory=4000 --overwrite
# Analyse the distance and elevation difference for all cells to the nearest watershed divide
r.stream.distance -n stream_rast=extractstream direction=extractflowdir elevation=inland_comp_DEM distance=neardividehorizontal difference=neardividevertical method=upstream memory=4000 --overwrite
# Analyse the distance and elevation difference for all cells to the farthest watershed divide
r.stream.distance stream_rast=extractstream direction=extractflowdir elevation=inland_comp_DEM distance=fardividehorizontal difference=fardividevertical method=upstream memory=4000 --overwrite
# Export all hillslope related indexes created above
# For each export ("r.out.gdal") the region is automatically changed to the core tile region and then reset.
# Once exported, most of the GRASS internal source files are removed (deleted)
g.region raster=originalTile
r.out.gdal -f input=MFDflowdir type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/mfd-flowdir_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.region raster=originalTile
r.out.gdal -f input=MFDupstream type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/mfd-updrain_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.region raster=originalTile
r.out.gdal -f input=SFDupstream type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/sfd-updrain_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.region raster=originalTile
r.out.gdal -f input=MFDtci type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/tci_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=MFDtci --overwrite
g.region raster=originalTile
r.out.gdal -f input=MFDspi type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/spi_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=MFDspi --overwrite
g.region raster=originalTile
r.out.gdal -f input=MFDslopelength type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/rusle-slopelength_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=MFDslopelength --overwrite
g.region raster=originalTile
r.out.gdal -f input=MFDslopesteepness type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/rusle-slopesteepness_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=MFDslopesteepness --overwrite
g.region raster=originalTile
r.out.gdal -f input=streamproximity type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/stream-dist_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=streamproximity --overwrite
g.region raster=originalTile
r.out.gdal -f input=hydhead type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/hydraulhead_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=hydhead --overwrite
g.region raster=originalTile
r.out.gdal -f input=neardividehorizontal type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/near-divide-dist_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=neardividehorizontal --overwrite
g.region raster=originalTile
r.out.gdal -f input=neardividevertical type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/near-divide-head_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=neardividevertical --overwrite
g.region raster=originalTile
r.out.gdal -f input=fardividehorizontal type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/far-divide-dist_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=fardividehorizontal --overwrite
g.region raster=originalTile
r.out.gdal -f input=fardividevertical type=Float32 output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/far-divide-head_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.remove -f type=raster name=fardividevertical --overwrite
g.region raster=originalTile
r.out.gdal -f input=extractstream output=/Volumes/Ancillary/ease2n/ESA/tiles/terrain/x04y07/0/stream-rast_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif --overwrite
g.region raster=DEM
g.mapset -c mapset=x04y08
r.in.gdal input=/Volumes/Ancillary/ease2n/ESA/tiles/dem/x04y08/0/dem_copdem_x04y08_0_v01-pfpf-hydrdem4+4-90m.tif output=originalTile --overwrite
g.region raster=DEM --overwrite
One additional problem with extracting river basins is the definition of outlet points, a problem unrelated to that of hydrological flow routing. This particular problem is, however, of relevance only if you intend to use the basin data for hydrological modelling. If you only intend to use the basin map for extracting spatial statistics, you do not need an unambiguously identified outlet point. This part is for creating a distinct, single cell, outlet point for basins. It makes use of GRASS GIS and the Framework wrapper GrassOnetoManyTiles.
The above process generates a script file that loops over all the tiles belonging to the defined region. An example is shown under the Hide/show button below, the commented lines (starting with “#”) have been added for explanation.
# Change mapset to the present tile
g.mapset -c mapset=x04y07
# Import the original tile (always done when using virtual mosaics)
r.in.gdal input=/Volumes/Ancillary/ease2n/ESA/tiles/dem/x04y07/0/dem_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.tif output=originalTile --overwrite
# Set region to the original (smaller) tile before checking for basin outlets
g.region raster=originalTile --overwrite
# Check if there is any coastline in this tile
data="$(r.stats -p input=landDEM null_value='null')"
# If the tile includes a coastline
if echo "$data" | grep -q "null"; then
echo "Coastline tile"
# Set region to the mosaic (expanded) DEM
g.region raster=DEM --overwrite
# run r.watershed Multiple Flow Directions (MFD) (if this is already done, this line can be commented out)
r.watershed -a elevation=inland_comp_DEM accumulation=MFDupstream drainage=MFDflowdir threshold=617 memory=4000 --overwrite
# run r.watershed Single Flow Directions (SFD) (if this is already done, this line can be commented out)
r.watershed -as elevation=inland_comp_DEM accumulation=SFDupstream threshold=617 memory=4000 --overwrite
# Set region to the original (smaller)
g.region raster=originalTile --overwrite
# Extract shorelines with SFD basins larger than a threshold
r.mapcalc "basin_SFD_outlets = if((shoreline > 0), if ((SFDupstream >= 617),SFDupstream,null() ) , null())" --overwrite
# Extract shorelines with MFD basins larger than a threshold
r.mapcalc "basin_MFD_outlets = if((shoreline > 0), if ((MFDupstream >= 617),MFDupstream,null() ) , null())" --overwrite
# Combine SFD and MFD shoreline outlets
r.mapcalc "basin_shoreline_outlets = if ((isnull(basin_SFD_outlets)), basin_MFD_outlets, basin_SFD_outlets)" --overwrite
# Vectorize the outlet points
r.to.vect input=basin_shoreline_outlets output=basin_outlet_pt type=point --overwrite
# Set region to the mosaic (expanded) DEM
g.region raster=DEM --overwrite
# Use mapcalc to find all low level cells
r.mapcalc "lowlevel_DEM = if((DEM <= 1), 0, null())" --overwrite
# Costgrow from identified outlet cells over lowlevel areas to find cells that belong to the same basin but are separated
r.cost input=lowlevel_DEM output=lowlevel_outlet_costgrow start_points=basin_outlet_pt max_cost=0 mem=4000 --overwrite
# Clump the costgrow analysis to create unique ids for each basin
r.clump -d input=lowlevel_outlet_costgrow output=lowlevel_outlet_clumps --overwrite
# Create a DEM over just the shoreline
r.mapcalc "shoreline_DEM = if((shoreline > 0 && DEM <= 0), 0, null())" --overwrite
# Costgrow along the shoreline to find all adjacent outlet cells belonging to the same basin
r.cost input=shoreline_DEM output=shoreline_outlet_costgrow start_points=basin_outlet_pt max_cost=0 mem=6000 --overwrite
# Clump the costgrow analysis to create unique ids for each outlet mouth
r.clump -d input=shoreline_outlet_costgrow output=shoreline_outlet_clumps --overwrite
# Vectorize the outlet mouths as points
r.to.vect input=shoreline_outlet_clumps output=basin_all_outlets_pt type=point --overwrite
# add columns to the point vector for outlet mouths
v.db.addcolumn map=basin_all_outlets_pt columns="upstream DOUBLE PRECISION, mfdup DOUBLE PRECISION, sfdup DOUBLE PRECISION, elevation DOUBLE PRECISION, mouth_id INT, basin_id INT" --overwrite
# Extract upstream SFD area to the point vector for outlet mouths
v.what.rast map=basin_all_outlets_pt raster=SFDupstream column=sfdup --overwrite
# Extract upstream MFD area to the point vector for outlet mouths
v.what.rast map=basin_all_outlets_pt raster=MFDupstream column=mfdup --overwrite
# Extract elevation to the point vector for outlet mouths
v.what.rast map=basin_all_outlets_pt raster=DEM column=elevation --overwrite
# Extract mouth id to point vector of all basin outlets
v.what.rast map=basin_all_outlets_pt raster=shoreline_outlet_clumps column=mouth_id --overwrite
# Extract basin id to point vector of all basin outlets
v.what.rast map=basin_all_outlets_pt raster=lowlevel_outlet_clumps column=basin_id --overwrite
# update the point vector of all outlets
db.execute sql="UPDATE basin_all_outlets_pt SET upstream=mfdup" --overwrite
# update the point vector of all outlets
db.execute sql="UPDATE basin_all_outlets_pt SET upstream=sfdup WHERE sfdup>mfdup" --overwrite
# Extract cell coordinates for point vector of basin points
v.to.db map=basin_all_outlets_pt option=coor columns=xcoord,ycoord --overwrite
# Export point vector as ESRI shape file
v.out.ogr type=point input=basin_all_outlets_pt format=ESRI_Shapefile output=/Volumes/Ancillary/ease2n/ESA/tiles/basin-mouths-prel/x04y07/0/outlets-pt_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.shp --overwrite
# Buffer to get all pixels 1 diagonal cell within the shoreline
r.buffer input=shoreline_outlet_clumps output=mouth_buffer distances=128.56 units=meter --overwrite
# Get all cells 1 diagonal from the shoreline
r.mapcalc "thickwall = if((drain_terminal == 1 && mouth_buffer > 1), 1, null())" --overwrite
# Get the part of the terminal drainage (e.g. the sea) that lies adjacent to land
r.mapcalc "drain_terminal_remain = if((drain_terminal == 1), if ((isnull(thickwall)),1, null() ),null() )" --overwrite
# Buffer from the waterline to set another shoreline
r.buffer input=drain_terminal_remain output=mouthshoreline distances=128.56 units=meter --overwrite
# Combine the shoreline and waterline data to create a wall at 9999 m.a.s.l just outside the shoreline
r.mapcalc "shorewall = if((mouthshoreline > 0 && thickwall == 1 && isnull(inland_comp_DEM)), 9999, null())" --overwrite
# vectorize the shorewall
r.to.vect input=shorewall output=shorewall_pt type=point --overwrite
# export the shorewall as a point vector (ESRI shape file)
v.out.ogr type=point input=shorewall_pt format=ESRI_Shapefile output=/Volumes/Ancillary/ease2n/ESA/tiles/basin-mouths-prel/x04y07/0/shorewall-pt_copdem_x04y07_0_v01-pfpf-hydrdem4+4-90m.shp --overwrite
# Find any discrepancies in the initial (thicker or diagonal) wall and the final (thinner) wall; the "fillholeDEM" will be used in a later stage.
r.mapcalc "fillholeDEM = if ((thickwall==1 && isnull(shorewall)),9999, null() )" --overwrite
else
# No shoreline to work with in this tile
echo "Inland tile"
fi