WCS Wrapper Configuration for Cubes

Back to WCS Wrapper

Back to WCS Wrapper Configuration

Project on SourceForge

Serving Data from complete NetCDF-CF files

Read How to pack your data to NetCDF Files.

The best arrangement is to have all your heterogeneous data in one NetCDF file. In that case, your data shares the main dimensions: Latitude, Longitude and Time. This lets the WCS service extract map slices, time series, or some combination of the two.
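If you still need to pack your data, the netCDF4 Python package is one way to build such a cube. The sketch below is only an illustration with made-up file, variable and grid names; follow the packing guide linked above for the exact CF attributes your data needs.

   # Minimal sketch of a lat/lon/time cube, assuming the netCDF4 package.
   # File name, variable names and grid spacing are hypothetical examples.
   import numpy as np
   from netCDF4 import Dataset

   ds = Dataset("myproject_cube.nc", "w", format="NETCDF4_CLASSIC")

   ds.createDimension("time", None)   # unlimited: one step per daily slice
   ds.createDimension("lat", 91)
   ds.createDimension("lon", 144)

   times = ds.createVariable("time", "f8", ("time",))
   times.units = "days since 2001-01-01 00:00:00"
   times.calendar = "standard"

   lats = ds.createVariable("lat", "f4", ("lat",))
   lats.units = "degrees_north"
   lons = ds.createVariable("lon", "f4", ("lon",))
   lons.units = "degrees_east"

   # Each data variable in the file becomes one field of the coverage.
   ozone = ds.createVariable("ozone", "f4", ("time", "lat", "lon"))
   ozone.units = "ppbv"

   lats[:] = np.linspace(-90.0, 90.0, 91)
   lons[:] = np.linspace(-180.0, 177.5, 144)
   times[:] = np.arange(2)                  # two daily time steps
   ozone[:] = np.zeros((2, 91, 144))        # placeholder data

   ds.close()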

Copy your NetCDF file to web/static/myproject

  • Every NetCDF file becomes a coverage.
  • Every Variable in the file becomes a field.

The configuration is automatic: the /OWS/web/owsadmin.py script extracts the metadata from these cubes and you are ready to run.

   python owsadmin.py wcs_prepare -ao 

This extracts all the metadata from all the providers.
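One way to verify the result is to ask the running service for its capabilities document and check that the new coverage is listed. The request below is only a sketch: the host, port, provider name (myproject) and WCS version string are assumptions that depend on your installation.

   # Hypothetical check against a local service; adjust the URL to your setup.
   from urllib.request import urlopen

   url = ("http://localhost:8080/myproject"
          "?service=WCS&version=1.1.2&request=GetCapabilities")
   print(urlopen(url).read()[:500].decode("utf-8", "replace"))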

Serving data from a periodic collection of NetCDF files

Sometimes you have accumulated a huge number of small NetCDF files, such as daily slices from a model output. You could combine those into one big cube, but for a terabyte of files that may not be an option. The HTAP demo provider is an example of how to serve such a collection directly; it only has two days of data to keep the download small. Look at HTAP_wcs.py and HTAP_config.py.

To try it out yourself, download custom-netcdf-1.2.0.zip. It contains the folders GEMAQ-v1p0, GEOSChem-v45 and web: web contains the code, the other folders contain the data. Copy these folders into /OWS so that the "web" folder is merged on top of the existing "web". Under web/static, HTAP is the custom grid demo; CIRA is the point demo, which is irrelevant here.

The custom handler does three things.

  • It gets the datetime from the TimeSequence parameter. Time ranges are reported as errors.
  • It finds the daily NetCDF file by that datetime.
  • It redirects the extractor to get the subcube from that file.

To achieve this, it uses standard object-oriented inheritance and method overriding, as sketched below.
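The class and method names below are hypothetical; they only illustrate the inheritance-and-override pattern. See HTAP_wcs.py and HTAP_config.py for the wrapper's actual classes and settings.

   # Sketch of the override pattern; names are illustrative, not the
   # wrapper's real API.
   import datetime
   import os

   class BaseWCSHandler(object):
       """Stand-in for the wrapper's default NetCDF handler."""
       datadir = "/OWS/web/static/HTAP"

       def get_source_file(self, time_sequence):
           # Default behaviour: one big cube holds the whole coverage.
           return os.path.join(self.datadir, "cube.nc")

   class DailyFilesHandler(BaseWCSHandler):
       """Custom handler for a collection of daily NetCDF files."""

       def get_source_file(self, time_sequence):
           # 1. Get the datetime from the TimeSequence parameter;
           #    ranges are not supported, only single instants.
           if "/" in time_sequence:
               raise ValueError("TimeSequence ranges are reported as errors")
           day = datetime.datetime.strptime(time_sequence, "%Y-%m-%dT%H:%M:%SZ")

           # 2. Find the daily NetCDF file by that datetime
           #    (the naming scheme here is made up).
           filename = "GEMAQ-v1p0-%s.nc" % day.strftime("%Y%m%d")

           # 3. Redirect the extractor to read the subcube from that file.
           return os.path.join(self.datadir, filename)

   handler = DailyFilesHandler()
   print(handler.get_source_file("2001-01-15T00:00:00Z"))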