wps_CI

wps_CI is a process that runs the ci.netcdf.wrapper function of the ClimDown package. To get started, first instantiate the client. Here, the client will try to connect to a remote chickadee instance using the url parameter.

In [1]:
import os
from birdy import WPSClient
from wps_tools.testing import get_target_url
from importlib.resources import files
from tempfile import NamedTemporaryFile
from netCDF4 import Dataset

# Ensure we are in the working directory with access to the data
while os.path.basename(os.getcwd()) != "chickadee":
    os.chdir('../')
In [2]:
# NBVAL_IGNORE_OUTPUT
url = get_target_url("chickadee")
print(f"Using chickadee on {url}")
Using chickadee on https://marble-dev01.pcic.uvic.ca/twitcher/ows/proxy/chickadee/wps
In [3]:
chickadee = WPSClient(url)
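
If you are running your own chickadee instance (for example during development), the client can be pointed at that endpoint instead. The URL below is only a placeholder; adjust it to wherever your service is deployed.

# Hypothetical local deployment; adjust host and port to your setup
local_chickadee = WPSClient("http://localhost:5000/wps")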

Help for individual processes can be displayed using the ? command (e.g. bird.process?).

In [4]:
# NBVAL_IGNORE_OUTPUT
chickadee.ci?
Signature:
chickadee.ci(
    gcm_file,
    obs_file=None,
    varname=None,
    num_cores='4',
    loglevel='INFO',
    units_bool=True,
    n_pr_bool=True,
    tasmax_units='celsius',
    tasmin_units='celsius',
    pr_units='kg m-2 d-1',
    max_gb=1.0,
    start_date=datetime.date(1971, 1, 1),
    end_date=datetime.date(2005, 12, 31),
    out_file=None,
)
Docstring:
Climate Imprint (CI) downscaling

Parameters
----------
gcm_file : ComplexData:mimetype:`application/x-netcdf`, :mimetype:`application/x-ogc-dods`
    Filename of GCM simulations
obs_file : ComplexData:mimetype:`application/x-netcdf`, :mimetype:`application/x-ogc-dods`
    Filename of high-res gridded historical observations
varname : string
    Name of the NetCDF variable to downscale (e.g. 'tasmax')
out_file : string
    Filename to create with the climate imprint outputs
num_cores : {'1', '2', '3', '4'}positiveInteger
    The number of cores to use for parallel execution
loglevel : {'CRITICAL', 'ERROR', 'WARNING', 'INFO', 'DEBUG', 'NOTSET'}string
    Logging level
units_bool : boolean
    Check the input units and convert them to the target output units
n_pr_bool : boolean
    Check for and eliminate negative precipitation values
tasmax_units : string
    Units used for tasmax in output file
tasmin_units : string
    Units used for tasmin in output file
pr_units : string
    Units used for pr in output file
max_gb : float
    Approximately how much RAM to use in the chunk I/O loop. It’s best to set this to about 1/3 to 1/4 of what you want the high-water mark to be
start_date : date
    Defines the start of the calibration period
end_date : date
    Defines the end of the calibration period

Returns
-------
output : ComplexData:mimetype:`application/x-netcdf`
    Output Netcdf File
File:      ~/code/chickadee/</home/eyvorchuk/code/chickadee/chickadee-venv/lib/python3.8/site-packages/birdy/client/base.py-1>
Type:      method
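
Outside of IPython, where the ? syntax is not available, the same information can be printed with Python's built-in help():

help(chickadee.ci)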

We can use the docstring to ensure we provide the appropriate parameters.

In [8]:
with NamedTemporaryFile(suffix=".nc", prefix="output_", dir="/tmp", delete=True) as out_file:
    output = chickadee.ci(
        gcm_file = (files("tests") / "data/tiny_gcm.nc").resolve(),
        obs_file = (files("tests") / "data/tiny_obs.nc").resolve(),
        out_file = out_file.name,
        varname="tasmax",
    )
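
The optional inputs listed in the docstring can be passed in the same call. The sketch below is for illustration only; the values shown simply mirror the defaults from the signature above.

from datetime import date

with NamedTemporaryFile(suffix=".nc", prefix="output_", dir="/tmp", delete=True) as out_file:
    output = chickadee.ci(
        gcm_file=(files("tests") / "data/tiny_gcm.nc").resolve(),
        obs_file=(files("tests") / "data/tiny_obs.nc").resolve(),
        out_file=out_file.name,
        varname="tasmax",
        num_cores="4",                # one of '1', '2', '3', '4'
        start_date=date(1971, 1, 1),  # calibration period start (default)
        end_date=date(2005, 12, 31),  # calibration period end (default)
    )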

Access the output with nc_to_dataset() or auto_construct_outputs() from wps_tools.output_handling.

In [9]:
# NBVAL_IGNORE_OUTPUT
from wps_tools.output_handling import nc_to_dataset, auto_construct_outputs

output_dataset = nc_to_dataset(output.get()[0])
output_dataset
Out[9]:
<class 'netCDF4._netCDF4.Dataset'>
root group (NETCDF3_CLASSIC data model, file format NETCDF3):
    dimensions(sizes): lon(26), lat(26), time(3651)
    variables(dimensions): float64 lon(lon), float64 lat(lat), float64 time(time), float32 tasmax(time, lat, lon)
    groups: 
In [10]:
# NBVAL_IGNORE_OUTPUT
auto_construct_outputs(output.get())
Out[10]:
[<class 'netCDF4._netCDF4.Dataset'>
 root group (NETCDF3_CLASSIC data model, file format NETCDF3):
     dimensions(sizes): lon(26), lat(26), time(3651)
     variables(dimensions): float64 lon(lon), float64 lat(lat), float64 time(time), float32 tasmax(time, lat, lon)
     groups: ]

Once the process has completed, we can extract the results and ensure they are what we expected.

In [11]:
expected_data = Dataset((files("tests") / "data/CI_expected_output.nc").resolve())
for key, value in expected_data.dimensions.items():
    assert str(output_dataset.dimensions[key]) == str(value)
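
For a stricter check, the variable values can be compared as well. This is a sketch and assumes the expected file was produced from the same tiny_gcm/tiny_obs inputs, so the tasmax fields should agree to within floating-point tolerance:

import numpy as np

# Element-wise comparison of the downscaled field (tolerance is illustrative)
np.testing.assert_allclose(
    output_dataset.variables["tasmax"][:],
    expected_data.variables["tasmax"][:],
    rtol=1e-5,
)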
In [13]:
output.get()[0]
Out[13]:
'https://marble-dev01.pcic.uvic.ca/wpsoutputs/f0cca536-303f-11f0-a57a-0242ac120006/output_qf7vkijf.nc'
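
The reference returned by output.get() is a URL on the WPS output server. If a local copy is needed, it can be fetched with the standard library; the destination path below is just an example.

from urllib.request import urlretrieve

# Save a local copy of the downscaled output (path is illustrative)
local_copy, _ = urlretrieve(output.get()[0], "/tmp/ci_downscaled_tasmax.nc")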