lisflood-utilities

Name: lisflood-utilities
Version: 0.12.22
Home page: https://github.com/ec-jrc/lisflood-utilities
Summary: A set of utilities for LISFLOOD users. pcr2nc: convert PCRaster files to netCDF CF 1.6; nc2pcr: convert netCDF files to PCRaster format; cutmaps: cut netCDF files; compare: compare two sets of netCDF files
Upload time: 2023-12-01 15:04:42
Author: Valerio Lorini, Stefania Grimaldi, Carlo Russo, Domenico Nappo, Lorenzo Alfieri
License: EUPL 1.2
Keywords: netcdf4, pcraster, mapstack, lisflood, efas, glofas, ecmwf, copernicus
# Lisflood Utilities

This repository hosts the source code of the LISFLOOD utilities.
Go to [Lisflood OS page](https://ec-jrc.github.io/lisflood/) for more information.

Other useful resources

| **Project**         | **Documentation**                                         | **Source code**                                                 |
| ------------------- | --------------------------------------------------------- | --------------------------------------------------------------- |
| Lisflood            | [Model docs](https://ec-jrc.github.io/lisflood-model/)    | https://github.com/ec-jrc/lisflood-code                         |
|                     | [User guide](https://ec-jrc.github.io/lisflood-code/)     |                                                                 |
| Lisvap              | [Docs](https://ec-jrc.github.io/lisflood-lisvap/)         | https://github.com/ec-jrc/lisflood-lisvap                       |
| Calibration tool    | [Docs](https://ec-jrc.github.io/lisflood-calibration/)    | https://github.com/ec-jrc/lisflood-calibration                  |
| Lisflood Utilities  |                                                           | https://github.com/ec-jrc/lisflood-utilities (this repository)  |
| Lisflood Usecases   |                                                           | https://github.com/ec-jrc/lisflood-usecases                     |


## Intro

Lisflood Utilities is a set of tools to help LISFLOOD users (or any users of PCRaster/netCDF files)
to execute some mundane tasks that are necessary to operate LISFLOOD.
Here is a list of the utilities you can find in the lisflood-utilities package.

* __pcr2nc__ is a tool to convert PCRaster maps to netCDF4 files.
  - convert a single map into a NetCDF4 file
  - convert a time series of maps into a NetCDF4 mapstack
  - support for WGS84 and ETRS89 (LAEA) reference systems
  - fine tuning of output files (compression, significant digits, etc.)
 
* __nc2pcr__ is a tool to convert a netCDF file into PCRaster maps.
  - convert 2D variables into single PCRaster maps
  - netCDF4 mapstacks are not supported yet

* __cutmaps__ is a tool to cut netcdf files in order to reduce size, using either
  - a bounding box of coordinates
  - a bounding box of matrix indices
  - an existing boolean area mask
  - a list of stations and a LDD (in netCDF or PCRaster format) **Note: PCRaster must be installed in the conda env**
 
* __thresholds__ is a tool to compute the discharge return period thresholds from a netCDF4 file containing a discharge time series.

* __compare__ is a package containing a set of simple Python classes that help to compare 
netCDF, PCRaster and TSS files.

* __water-demand-historic__ is a package for generating sectoral (domestic, livestock, industry, and thermoelectric) water demand maps with monthly to yearly temporal steps for a range of past years, at the user-defined spatial resolution and geographical extent. These maps are required by the LISFLOOD OS [water use module](https://ec-jrc.github.io/lisflood-model/2_18_stdLISFLOOD_water-use).

* __waterregions__ is a package containing two scripts to create and to verify a water regions map, respectively.

The package also contains convenient classes for reading and writing these formats (a short usage sketch follows this list):

* PCRasterMap
* PCRasterReader
* NetCDFMap
* NetCDFWriter
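
As a minimal sketch (mirroring the programmatic example at the end of this README; the file path is illustrative), a PCRaster map can be read like this:

```python
from lisfloodutilities.readers import PCRasterMap

# Open a PCRaster map and inspect its values (illustrative path)
pcr_map = PCRasterMap('tests/data/cutmaps/area_eu.map')
print(pcr_map.data)  # map values exposed by the .data attribute
```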

### Installation

#### Prerequisites
The easiest way is to use a conda environment, as it encapsulates the C dependencies as well, so you do not need to install system libraries yourself.

Otherwise, ensure you have properly installed the following software:

- Python 3.5+
- GDAL C library and software
- netCDF4 C library

#### Install
If you use conda, create a new env (or use an existing one) and install gdal and lisflood-utilities:

```bash
conda create --name myenv python=3.7 -c conda-forge
conda activate myenv
conda install -c conda-forge pcraster gdal
pip install lisflood-utilities
```

If you don't use conda but a straight python3 virtualenv:

```bash
source /path/myenv/bin/activate
pip install lisflood-utilities
```

If the GDAL Python package fails to install, make sure you install the same version as the GDAL
C library available on your system. You may also need to set up the paths to the GDAL headers.

To check which version of the GDAL library is installed on your computer, use gdal-config:

```bash
sudo apt-get install libgdal-dev
export CPLUS_INCLUDE_PATH=/usr/include/gdal
export C_INCLUDE_PATH=/usr/include/gdal
gdal-config --version  # 3.0.1
pip install GDAL==3.0.1
```

Note: if you previously installed an older version of lisflood-utilities, it is highly recommended to remove it before installing the newest version:

```bash
pip uninstall lisflood-utilities
pip install -e .
```


## pcr2nc

### Usage

> __Note:__ This guide assumes you have installed the program with the pip tool.
> If you cloned the source code instead, just substitute the executable `pcr2nc` with `python pcr2nc_script.py`, which is in the root folder of the cloned project.

The tool takes three command line input arguments:

- -i, --input: It can be a path to a single file, a folder or a Unix-like wildcard expression like _/path/to/files/dis00*_
- -o, --output_file: Path to the output nc file
- -m, --metadata: Path to a yaml or json file containing configuration for the NetCDF4 output file.

Unless the input is a single file, the resulting NetCDF4 file will be a mapstack along a time dimension.

Input as a folder containing PCRaster maps. In this case, the folder must contain ONLY PCRaster files and the output will be a mapstack.

```bash
pcr2nc -i /path/to/input/ -o /path/to/output/out.nc -m ./nc_metadata.yaml
```

Input as a path to a single map. In this case, the output won't be a mapstack.

```bash
pcr2nc -i /path/to/input/pcr.map -o /path/to/output/out.nc -m ./nc_metadata.yaml
```

Input as a _Unix style pathname pattern expansion_. The output will be a mapstack. __Note that in this case the input argument must be contained in double quotes!__

```bash
pcr2nc -i "/path/to/input/pcr00*" -o /path/to/output/out.nc -m ./nc_metadata.json
```

#### Writing metadata configuration file

The format of the resulting NetCDF4 file is configured in a metadata configuration file. This file can be written in YAML or JSON format.

An example of a metadata configuration file is the following:

```yaml
variable:
  shortname: dis
  description: Discharge
  longname: discharge
  units: m3/s
  compression: 9
  least_significant_digit: 2
source: JRC Space, Security, Migration
reference: JRC Space, Security, Migration
geographical:
  datum: WGS84
  variable_x_name: lon
  variable_y_name: lat
time:
  calendar: proleptic_gregorian
  units: days since 1996-01-01

```
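
The same configuration can also be written in JSON; the snippet below is simply the YAML example above translated key for key:

```json
{
  "variable": {
    "shortname": "dis",
    "description": "Discharge",
    "longname": "discharge",
    "units": "m3/s",
    "compression": 9,
    "least_significant_digit": 2
  },
  "source": "JRC Space, Security, Migration",
  "reference": "JRC Space, Security, Migration",
  "geographical": {
    "datum": "WGS84",
    "variable_x_name": "lon",
    "variable_y_name": "lat"
  },
  "time": {
    "calendar": "proleptic_gregorian",
    "units": "days since 1996-01-01"
  }
}
```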

#### Variable section

In the `variable` section you can configure metadata for the main variable:

- `shortname`: A short name for the variable
- `longname`: The long name version
- `description`: A description for humans
- `units`: The units of the variable
- `compression`: Optional, integer number between 1 and 9, default 0 (no compression). If present, the output nc file will be compressed at this level.
- `least_significant_digit`: Optional, integer number, default 2. From NetCDF4 documentation:

> If your data only has a certain number of digits of precision
(say for example, it is temperature data that was measured with a precision
of 0.1 degrees), you can dramatically improve zlib compression by quantizing
(or truncating) the data using the least_significant_digit keyword argument
to createVariable. The least significant digit is the power of ten of the
smallest decimal place in the data that is a reliable value.
For example if the data has a precision of 0.1,
then setting least_significant_digit=1 will cause the data to be
quantized using `numpy.around(scale*data)/scale`, where `scale = 2**bits`,
and bits is determined so that a precision of 0.1 is retained
(in this case bits=4). Effectively, this makes the compression 'lossy'
instead of 'lossless', that is some precision in the data is sacrificed for the sake of disk space.
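
For reference, here is a minimal sketch (not taken from the pcr2nc source code) of how the `compression` and `least_significant_digit` options map onto the netCDF4-python `createVariable` call:

```python
import numpy as np
from netCDF4 import Dataset

nc = Dataset('out.nc', 'w', format='NETCDF4')
nc.createDimension('lat', 180)
nc.createDimension('lon', 360)
dis = nc.createVariable(
    'dis', 'f4', ('lat', 'lon'),
    zlib=True, complevel=9,        # equivalent of `compression: 9`
    least_significant_digit=2,     # equivalent of `least_significant_digit: 2`
)
dis[:] = np.zeros((180, 360), dtype='f4')  # placeholder data
nc.close()
```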

#### Source and reference

`source` and `reference` add information about the institution providing the NetCDF4 file.

#### Geographical section

In the `geographical` section you can configure `datum` and the names of the x and y variables. As `variable_x_name` and `variable_y_name` you should use 'lon' and 'lat' for geographical coordinates (e.g. WGS84) and 'x' and 'y' for projected coordinates (e.g. ETRS89); see the example after the list below.
 
Currently, pcr2nc supports the following list for `datum`:

  * `WGS84`
  * `ETRS89`
  * `GISCO`
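
For example, a `geographical` section for a projected ETRS89 (LAEA) grid would use the `x`/`y` variable names (illustrative snippet):

```yaml
geographical:
  datum: ETRS89
  variable_x_name: x
  variable_y_name: y
```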

#### Time section

This section is optional and is only required if the output file is a mapstack (a timeseries of georeferenced 2D arrays).
In this section you have to configure `units` and `calendar` (a short example follows the list below).

- `units`: Can be one of the following strings (replacing placeholders with the actual date):
    - `hours since YYYY-MM-DD HH:MM:SS`
    - `days since YYYY-MM-DD`
- `calendar`: A recognized calendar identifier, like `proleptic_gregorian`, `gregorian` etc.
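
As an illustration, the time section of an hourly mapstack could look like this (placeholder values):

```yaml
time:
  calendar: gregorian
  units: hours since 1995-01-01 06:00:00
```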

## nc2pcr

This tool converts single-map netCDF files (a time dimension is not supported yet) to PCRaster format.

### Usage

```bash
nc2pcr -i /path/to/input/file.nc -o /path/to/output/out.map [-c /path/to/clone.map optional]
```

If the input file is an LDD map, you must add the `-l` flag:

```bash
nc2pcr -i /path/to/input/ldd.nc -o /path/to/output/ldd.map  -l [-c /path/to/clone.map optional]
```

## Cutmaps: a netCDF file cookie-cutter

This tool cuts netCDF files using a mask, a bounding box or a list of stations along with an LDD map.

### Usage:
The tool accepts as input:

* a mask map (either PCRaster or netCDF format) using the -m argument or 
  - alternatively, using the -i argument, matrix indices in the form `imin imax jmin jmax` (imin, imax, jmin, jmax  must be integer numbers)
  - alternatively, using the -c argument, coordinates bounding box in the form `xmin xmax ymin ymax` (xmin, xmax, ymin, ymax can be integer or floating point numbers; x = longitude, y = latitude) 
  - alternatively, using the -N and -l arguments, list of stations with coordinates and a LDD map.
* a path to a folder containing netCDF files to cut or a static dataset path like LISFLOOD static files. 
* a path to a folder where to write cut files.

The following command will cut all netCDF files inside the _/workarea/Madeira/lai/_ folder 
and write the produced files to the current folder. 
The cookie-cutter that will be used is _/workarea/Madeira/maps/MaskMap/Bacia_madeira.nc_. 
This file is a mask (a boolean map with 1 only in the area of interest) from which cutmaps takes the bounding box.
The mask can also be in PCRaster format.

```bash
cutmaps -m /workarea/Madeira/maps/MaskMap/Bacia_madeira.nc -f /workarea/Madeira/lai/ -o ./
```

**Indices can also be passed as an argument (using the -i argument instead of -m). Knowing your area of interest from your netCDF files, 
you can determine the indices of the array and pass them in the form `imin imax jmin jmax` (imin, imax, jmin, jmax must be integer numbers).**

```bash
cutmaps -i "150 350 80 180" -f /workarea/Madeira/lai/ -o ./
```

**Example with coordinates (using -c argument) `xmin xmax ymin ymax` (xmin, xmax, ymin, ymax can be integer or floating point numbers; x = longitude, y = latitude) and path to EFAS/GloFAS static data (-S option), with -W to allow overwriting existing files in output directory:**

```bash
cutmaps -S /home/projects/lisflood-eu -c "4078546.12 4463723.85 811206.57 1587655.50" -o /Work/Tunisia/cutmaps -W
```

**Example with stations.txt and LDD**

Given an LDD map and a list of stations in a text file, each row having coordinates X/Y or lon/lat and an index, separated by tabs:

```text
4297500	1572500 1
4292500	1557500 2
4237500	1537500 3
4312500	1482500 4
4187500	1492500 5
```

```bash
cutmaps -S /home/projects/lisflood-eu -l ldd.map -N stations.txt -o /Work/Tunisia/cutmaps
```

If the LDD is in netCDF format, it will be converted to PCRaster format first.

```bash
cutmaps -S /home/projects/lisflood-eu -l ldd.nc -N stations.txt -o /Work/Tunisia/cutmaps
``` 

If you experience problems, you can try to pass a path to a PCRaster clone map.

```bash
cutmaps -S /home/projects/lisflood-eu -l ldd.nc -C area.map -N stations.txt -o /Work/Tunisia/cutmaps
```
You will find the produced mask.map and mask.nc for your area in the same folder as the LDD map; you will need them for LISFLOOD/LISVAP executions.
You will also have outlets.map/outlets.nc based on stations.txt, which lets you produce gauge TSS files if configured in LISFLOOD.

## compare utility

This tool lets you compare two netCDF datasets. You can configure it with tolerances (atol, rtol, and thresholds for the percentage of tolerated different values).
You can also set an option to write diff files, so that you can inspect maps and differences with a tool like Panoply.
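
A typical invocation might look like the following (paths and tolerance values are illustrative):

```bash
compare -a /data/run_v1 -b /data/run_v2 -m /data/mask.nc -r 0.001 -t 0.0001 -p 0.5 -D
```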

```text
usage: compare [-h] -a DATASET_A -b DATASET_B -m MASKAREA [-M SUBMASKAREA]
               [-e] [-s] [-D] [-r RTOL] [-t ATOL] [-p MAX_DIFF_PERCENTAGE]
               [-l MAX_LARGEDIFF_PERCENTAGE]

Compare netCDF outputs: 0.12.12

optional arguments:
  -h, --help            show this help message and exit
  -a DATASET_A, --dataset_a DATASET_A
                        path to dataset version A
  -b DATASET_B, --dataset_b DATASET_B
                        path to dataset version B
  -m MASKAREA, --maskarea MASKAREA
                        path to mask
  -e, --array-equal     flag to compare files to be identical
  -s, --skip-missing    flag to skip missing files in comparison
  -D, --save-diffs      flag to save diffs in netcdf files for visual
                        comparisons. Files are saved in ./diffs folder of
                        current directory.For each file presenting
                        differences, you will find files diffs, original A and
                        B (only for timesteps where differences are found).
  -r RTOL, --rtol RTOL  rtol
  -t ATOL, --atol ATOL  atol
  -p MAX_DIFF_PERCENTAGE, --max-diff-percentage MAX_DIFF_PERCENTAGE
                        threshold for diffs percentage
  -l MAX_LARGEDIFF_PERCENTAGE, --max-largediff-percentage MAX_LARGEDIFF_PERCENTAGE
                        threshold for large diffs percentage
```

## thresholds

The thresholds tool computes the discharge return period thresholds using the method of L-moments.
It is used to post-process the discharge from the LISFLOOD long term run.
The resulting thresholds can be used in a flood forecasting system to define the flood warning levels.

### Usage:
The tool takes as input a netCDF file containing the annual maxima of the discharge signal. LISFLOOD computes time series of discharge values (average value over the selected computational time step), so the users are required to compute the annual maxima themselves. As an example, this step can be achieved by using CDO (cdo yearmax); for all the details please refer to [https://code.mpimet.mpg.de/projects/cdo/embedded/index.html#x1-190001.2.5](https://code.mpimet.mpg.de/projects/cdo/embedded/index.html#x1-190001.2.5).
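
For example (file names are illustrative), the annual maxima can be computed with CDO and then passed to the tool:

```bash
cdo yearmax discharge_longrun.nc discharge_annual_maxima.nc
thresholds -d discharge_annual_maxima.nc -o thresholds.nc
```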

The output NetCDF file contains the following return period thresholds [1.5, 2, 5, 10, 20, 50, 100, 200, 500], together with the Gumbel parameters (sigma and mu).

```text
usage: thresholds [-h] [-d DISCHARGE] [-o OUTPUT]

Utility to compute the discharge return period thresholds using the method of L-moments.
Thresholds computed: [1.5, 2, 5, 10, 20, 50, 100, 200, 500]

options:
  -h, --help            show this help message and exit
  -d DISCHARGE, --discharge DISCHARGE
                        Input discharge files (annual maxima)
  -o OUTPUT, --output OUTPUT
                        Output thresholds file
```

## water-demand-historic

This utility allows the user to create water demand maps at the desired resolution and for the desired geographical area. The maps indicate, for each pixel, the time-varying water demand to supply domestic, livestock, industrial, and thermoelectric water consumption. The temporal discretization is monthly for domestic and energy demand, and yearly for industrial and livestock demand. The maps of sectoral water demand are required by the LISFLOOD OS [water use module](https://ec-jrc.github.io/lisflood-model/2_18_stdLISFLOOD_water-use/). Of course, the sectoral water demand maps and the scripts of this utility can also be used for other applications, as well as for stand-alone analyses of historical water demand for anthropogenic use.

#### Input
The creation of the sectoral water demand maps requires a template map that defines the desired geographical area and spatial resolution. The generation of the maps relies on a number of external datasets (examples are the [Global Human Settlement - Datasets - European Commission (europa.eu)](https://ghsl.jrc.ec.europa.eu/datasets.php) and the [FAO AQUASTAT Dissemination System](https://data.apps.fao.org/aquastat/?lang=en)). The locations of the template map, of the input datasets and files, of the output folder, and other user choices (e.g. start year and end year) are specified in a configuration file. The syntax of the configuration file is pre-defined and an example is provided to the users. The complete list of external datasets and the instructions on how to prepare (i) the external datasets, (ii) the template map, (iii) the input folder, (iv) the output folder, and (v) the configuration file are explained in detail [here](src/lisfloodutilities/water-demand-historic/README.md).

#### Output
Four sectoral water demand maps in netCDF-4 format. The geographical extent and the spatial resolution are defined by the template map (a user-defined input file). Each netCDF-4 file contains 12 monthly maps for each year included in the time span identified by the user. Sectoral water demand data with a lower (yearly) temporal resolution are repeated 12 times.

#### Usage
The methodology includes five main steps. The instructions on how to retrieve the scripts, create the environment including all the required packages, and use the utility are provided [here](src/lisfloodutilities/water-demand-historic/README.md).

#### Important notes on documentation and data availability
The complete list of external datasets, the instructions on how to retrieve the external datasets, the methodology, and the usage of the scripts are explained in detail [here](src/lisfloodutilities/water-demand-historic/README.md). The README file provides detailed technical information about the input datasets and the usage of this utility. The methodology is explained in the manuscript: Choulga, M., Moschini, F., Mazzetti, C., Grimaldi, S., Disperati, J., Beck, H., Salamon, P., and Prudhomme, C.: Technical note: Surface fields for global environmental modelling, EGUsphere, 2023 ([preprint](https://doi.org/10.5194/egusphere-2023-1306)).

The global sectoral water demand maps at 3 arcmin (or 0.05 degrees) resolution, 1979-2019, produced using the scripts of this utility can be downloaded from [Joint Research Centre Data Catalogue - LISFLOOD static and parameter maps for GloFAS - European Commission (europa.eu)](https://data.jrc.ec.europa.eu/dataset/68050d73-9c06-499c-a441-dc5053cb0c86)



## waterregions

The modelling of water abstraction for domestic, industrial, energetic, agricultural and livestock use can require a map of the water regions. The concept of water regions and the information needed for their definition are explained [here](https://ec-jrc.github.io/lisflood-model/2_18_stdLISFLOOD_water-use/). 
Since groundwater and surface water resources demand and abstraction are spatially distributed inside each water region, each model set-up must include all the pixels of the water region. This requirement is crucial for the success of the calibration of the model. This utility allows the user to meet this requirement.
More specifically, this utility can be used to:
1. create a water region map which is consistent with a set of calibration points: this purpose is achieved by using the script define_waterregions.
2. verify the consistency between an existing water region map and an existing map of calibration catchments: this purpose is achieved by using the script verify_waterregions.
It is worth reminding that when calibrating a catchment which is a subset of a larger computational domain, and the option wateruse is switched on, then the option groundwatersmooth must be switched off. The explanation of this requirement is provided in the chapter [Water use](https://ec-jrc.github.io/lisflood-model/2_18_stdLISFLOOD_water-use/) of the LISFLOOD documentation.

#### Requirements
python3, pcraster 4.3. The protocol was tested on Linux.

### define_waterregions
This utility allows the user to create a water region map which is consistent with a set of calibration points. The protocol was created by Ad De Roo (Unit D2, Joint Research Centre).

#### Input 
- List of the coordinates of the calibration points (see the example after this list). This list must be provided in a .txt file with three columns: LONGITUDE (or x), LATITUDE (or y), point ID.
- LDD map, in netCDF format or PCRaster format. When using the PCRaster format, the following condition must be satisfied: *PCRASTER_VALUESCALE=VS_LDD*.
- Countries map, in netCDF format or PCRaster format. When using the PCRaster format, the following condition must be satisfied: *PCRASTER_VALUESCALE=VS_NOMINAL*. This map shows the political boundaries of the Countries, each Country being identified by a unique ID. This map is used to ensure that the water regions are not split across different Countries.
- Map of the initial definition of the water regions, in netCDF format or PCRaster format. When using the PCRaster format, the following condition must be satisfied: *PCRASTER_VALUESCALE=VS_NOMINAL*. This map is used to attribute a water region to areas not included in the calibration catchments. In order to create this map, the user can follow the guidelines provided [here](https://ec-jrc.github.io/lisflood-model/2_18_stdLISFLOOD_water-use/).
- A *.yaml* or *.json* file defining the metadata of the output water regions map in netCDF format. An example of the structure of these files is provided [here](https://github.com/ec-jrc/lisflood-utilities/tree/master/tests/data/waterregions).
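
A minimal calibration points file could look like this (coordinates and IDs are illustrative):

```text
4297500 1572500 1
4292500 1557500 2
4237500 1537500 3
```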

##### Input data provided by this utility:
This utility provides three maps of [Countries IDs](https://github.com/ec-jrc/lisflood-utilities/tree/master/tests/data): a 1 arcmin map of Europe (the EFAS computational domain), and 0.1 degree and 3 arcmin maps of the Globe. ACKNOWLEDGEMENTS: both rasters were obtained by upsampling the original World Borders Dataset provided by http://thematicmapping.org/ (the dataset is available under a Creative Commons Attribution-Share Alike License).

#### Output
Map of the water regions which is consistent with the calibration catchments. In other words, each water region is entirely included in one calibration catchment. The test to check the consistency between the newly created water regions map and the calibration catchments is implemented internally by the code and the outcome of the test is printed on the screen.
In the output map, each water region is identified by a unique ID. The format of the output map can be netCDF or PCRaster.

#### Usage
The following command lines allow the user to produce a water region map which is consistent with the calibration points (only one command line is required: each of the command lines below shows a different combination of input file formats):

*python define_waterregions.py -p calib_points_test.txt -l ldd_test.map -C countries_id_test.map -w waterregions_initial_test.map -o my_new_waterregions.map* <br>

*python define_waterregions.py -p calib_points_test.txt -l ldd_test.nc -C countries_id_test.nc -w waterregions_initial_test.nc -o my_new_waterregions.nc -m metadata.test.json* <br>

*python define_waterregions.py -p calib_points_test.txt -l ldd_test.map -C countries_id_test.nc -w waterregions_initial_test.map -o my_new_waterregions.nc -m metadata.test.yaml* <br>


The input maps can be in netCDF format or PCRaster format (the same command line can accept a mix of PCRaster and netCDF formats). It is imperative to write the file name in full, that is, including the extension (which can be either ".nc" or ".map").<br>
The utility can return either a PCRaster file or a netCDF file. The users select their preferred format by specifying the extension of the file in the output option (i.e. either ".nc" or ".map"). <br>
The metadata file in .yaml or .json format must be provided only if the output file is in netCDF format.<br>

The code internally verifies that each one of the newly created water regions is entirely included within one calibration catchment. If this condition is satisfied, the following message is printed out: *“OK! Each water region is completely included inside one calibration catchment”*. If the condition is not satisfied, the error message is *“ERROR: The water regions WR are included in more than one calibration catchment”*. Moreover, the code provides the list of the water regions WR and the calibration catchments that do not meet the requirement. This error highlights a problem in the input data: the user is recommended to check (and correct) the list of calibration points and the input maps.

The input and output arguments are listed below. 


```text
usage: define_waterregions.py [-h] -p CALIB_POINTS -l LDD -C COUNTRIES_ID -w
                              WATERREGIONS_INITIAL -o OUTPUT_WR

Define Water Regions consistent with calibration points: {}

optional arguments:
  -h, --help            show this help message and exit
  -p CALIB_POINTS, --calib_points CALIB_POINTS
                        list of calibration points: lon or x, lat or y, point id. File extension: .txt,
  -l LDD, --ldd LDD     LDD map, file extension: .nc or .map
  -C COUNTRIES_ID, --countries_id COUNTRIES_ID
                        map of Countries ID, file extension: .nc or .map 
  -w WATERREGIONS_INITIAL, --waterregions_initial WATERREGIONS_INITIAL
                        initial map of water regions, file extension: .nc or .map
  -o OUTPUT_WR, --output_wr OUTPUT_WR
                        output map of water regions, file extension: .nc or .map 
  -m METADATA, --metadata_file METADATA
                        Path to metadata file for NetCDF, .yaml or .json format                     
```



### verify_waterregions

This function allows the user to verify the consistency between a water region map and a map of calibration catchments. This function must be used when the water region map and the map of calibration catchments have been defined in an independent manner (i.e. not using the utility **define_waterregions**). The function verify_waterregions verifies that each water region is entirely included in one calibration catchment. If this condition is not satisfied, an error message is printed on the screen.

#### Input
- Map of calibration catchments in netcdf format.
- Water regions map in netcdf format.

#### Output
The output is a message on the screen. There are two options:
- 'OK! Each water region is completely included inside one calibration catchment.'
- 'ERROR: The water regions WR are included in more than one calibration catchment': this message is followed by the list of the water regions and of the catchments that raised the issue.
In case of an error message, the user can use the utility **define_waterregions**.

#### Usage
The following command line allows the user to verify the consistency between a water region map and a map of calibration catchments:

*python verify_waterregions.py -cc calib_catchments_test.nc -wr waterregions_test.nc*

The input and output arguments are listed below. All the inputs are required. 

```text
usage: verify_waterregions.py [-h] -cc CALIB_CATCHMENTS -wr WATERREGIONS

Verify that the Water Regions map is consistent with the map of the
calibration catchments

optional arguments:
  -h, --help            show this help message and exit
  -cc CALIB_CATCHMENTS, --calib_catchments CALIB_CATCHMENTS
                        map of calibration catchments, netcdf format
  -wr WATERREGIONS, --waterregions WATERREGIONS
                        map of water regions, netcdf format
```

NOTE:
The utility **pcr2nc** can be used to convert a map in pcraster format into netcdf format.




## Using lisfloodutilities programmatically 

You can use lisflood-utilities in your Python programs. As an example, the script below creates the mask map for a set of stations (stations.txt). The mask map is a boolean map with 1 and 0: 1 is used for all (and only) the pixels hydrologically connected to one of the stations. The resulting mask map is in PCRaster format.

```python
from lisfloodutilities.cutmaps.cutlib import mask_from_ldd
from lisfloodutilities.nc2pcr import convert
from lisfloodutilities.readers import PCRasterMap

# Inputs: an LDD in netCDF format, a PCRaster clone map and a list of stations
ldd = 'tests/data/cutmaps/ldd_eu.nc'
clonemap = 'tests/data/cutmaps/area_eu.map'
stations = 'tests/data/cutmaps/stations.txt'

# Convert the netCDF LDD to PCRaster format
ldd_pcr = convert(ldd, clonemap, 'tests/data/cutmaps/ldd_eu_test.map', is_ldd=True)[0]
# Derive the mask (and the outlets) from the LDD and the stations list
mask, outlets_nc, maskmap_nc = mask_from_ldd(ldd_pcr, stations)
# Read the resulting PCRaster mask map and print its values
mask_map = PCRasterMap(mask)
print(mask_map.data)
```



            
