WCS Wrapper Configuration for Cubes
Serving data from complete NetCDF-CF files
The simplest setup is to have all your heterogeneous data in one NetCDF file. In that case, your data shares all the main dimensions: latitude, longitude, time, elevation, wavelength, etc. This enables the WCS service to extract map slices, time series, or cubes along any combination of these dimensions.
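As an illustration, such a combined cube might look like the following in CDL (the header notation printed by ncdump -h). The dimension and variable names here are hypothetical, not taken from any shipped demo file:

```
netcdf myproject {
dimensions:
    time = UNLIMITED ;
    elevation = 10 ;
    lat = 180 ;
    lon = 360 ;
variables:
    double time(time) ;
        time:units = "hours since 2000-01-01" ;
    float lat(lat) ;
        lat:units = "degrees_north" ;
    float lon(lon) ;
        lon:units = "degrees_east" ;
    float ozone(time, elevation, lat, lon) ;
        ozone:units = "ppb" ;
}
```

Because the ozone variable is defined over all four shared dimensions, the service can slice it along any of them.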
Copy your NetCDF file to web/static/myproject
- Every NetCDF file becomes a coverage.
- Every Variable in the file becomes a field.
The configuration is automatic: the /OWS/web/owsadmin.py script extracts the metadata from these cubes and you are ready to run:
python owsadmin.py wcs_prepare -ao
This extracts all the metadata from all the providers.
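A rough sketch of the file-level part of that discovery step, assuming only the layout described above (one coverage per NetCDF file under a project folder). The real owsadmin.py also reads the dimension and variable metadata from inside each file, which would require a NetCDF library:

```python
import os

def discover_coverages(static_dir):
    """Map each NetCDF file under web/static/<project> to a coverage name.

    Every *.nc file becomes one coverage; its variables become the fields.
    This sketch only handles the file level; reading the variables would
    need a NetCDF library such as netCDF4.
    """
    coverages = {}
    for project in sorted(os.listdir(static_dir)):
        project_dir = os.path.join(static_dir, project)
        if not os.path.isdir(project_dir):
            continue
        for name in sorted(os.listdir(project_dir)):
            if name.endswith(".nc"):
                coverage_id = os.path.splitext(name)[0]
                coverages[coverage_id] = os.path.join(project_dir, name)
    return coverages
```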
Serving data from a periodic collection of NetCDF files
Sometimes you have accumulated a huge number of small NetCDF files, such as daily slices of a model output. You could combine them into one big cube, but for a terabyte of files that may not be an option. The HTAP demo provider is an example of how to handle this case; see HTAP_wcs.py and HTAP_config.py.
To try it out yourself, download custom-netcdf-1.2.1.zip. It contains the folders GEMAQ-v1p0, GEOSChem-v45 and web: web contains the code, the other folders contain the data. Copy these folders into /OWS so that the "web" folder is copied on top of the existing "web". Under web/static, HTAP is the custom grid demo; CIRA is the point demo and is not relevant here.
The custom handler does three things:
- It gets the datetime from the TimeSequence parameter. Time ranges are reported as an error.
- It finds the daily NetCDF file by the datetime.
- It redirects the extractor to get the subcube from that file.
This is achieved with standard object-oriented inheritance and method overriding.
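The three steps above can be sketched as a subclass that overrides one method of a base handler. The class and method names here (WCSHandler, get_coverage, extract_subcube) and the daily file naming scheme are hypothetical stand-ins, not the actual HTAP_wcs.py API:

```python
import os
from datetime import datetime

class WCSHandler:
    """Stand-in for the framework's base handler (hypothetical API)."""
    def extract_subcube(self, path, params):
        # The real base class would open `path` and slice out the subcube.
        return ("subcube from", path)

class DailyFileHandler(WCSHandler):
    """Redirects extraction to the daily NetCDF file named by the datetime."""

    def __init__(self, data_dir):
        self.data_dir = data_dir

    def get_coverage(self, params):
        # 1. Get the datetime from the TimeSequence parameter;
        #    time ranges are rejected as an error.
        seq = params["TimeSequence"]
        if "/" in seq:  # a WCS time range like start/end/step
            raise ValueError("time ranges are not supported")
        when = datetime.strptime(seq, "%Y-%m-%dT%H:%M:%SZ")
        # 2. Find the daily NetCDF file by the datetime
        #    (assumed YYYYMMDD.nc naming, for illustration only).
        path = os.path.join(self.data_dir, when.strftime("%Y%m%d") + ".nc")
        # 3. Redirect the extractor to get the subcube from that file.
        return self.extract_subcube(path, params)
```

Only get_coverage is overridden; everything else (the actual subcube extraction) is inherited from the base class.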