WCS Wrapper Configuration for Cubes
Latest revision as of 12:04, October 8, 2010
Back to WCS Wrapper Configuration
Serving Data from complete NetCDF-CF files
Read How to pack your data to NetCDF Files.
The best arrangement is to have all your heterogeneous data in one NetCDF file. In that case, your data shares all the main dimensions: latitude, longitude, time, elevation, wavelength, and so on. This enables the WCS service to extract map slices, time series, or subcubes along any combination of the dimensions.
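A minimal sketch of why shared dimensions matter, using a made-up in-memory cube rather than a real NetCDF file or the service's actual extraction code: once every variable is indexed by the same (time, lat, lon) axes, a map slice or a time series is just a matter of fixing some of those indices.

```python
# Hypothetical 3-D cube: cube[time][lat][lon]; values are synthetic.
times = ["2010-01-01", "2010-01-02"]
lats = [40.0, 50.0, 60.0]
lons = [-10.0, 0.0, 10.0]
cube = [[[t * 100 + la * 10 + lo for lo in range(len(lons))]
         for la in range(len(lats))]
        for t in range(len(times))]

def map_slice(cube, t_index):
    """Fix the time axis: a 2-D lat/lon map for one time step."""
    return cube[t_index]

def time_series(cube, lat_index, lon_index):
    """Fix lat and lon: a 1-D series along the time axis."""
    return [step[lat_index][lon_index] for step in cube]

print(map_slice(cube, 0))       # 3x3 map for the first time step
print(time_series(cube, 1, 2))  # values at lat=50.0, lon=10.0 over time
```

The same idea generalizes to elevation, wavelength, or any other shared axis: each WCS request pins some dimensions and keeps the rest.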
Copy your NetCDF file to web/static/myproject:
- Every NetCDF file becomes a coverage.
- Every Variable in the file becomes a field.
The configuration is automatic: the /OWS/web/owsadmin.py script extracts the metadata from these cubes, and you are ready to run:
python owsadmin.py wcs_prepare -ao
This extracts all the metadata from all the providers.
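A rough sketch of the file-to-coverage mapping described above (the helper name and folder layout are hypothetical; this is not the real owsadmin.py code): each NetCDF file under the project folder becomes one coverage, named after the file. Listing each variable as a field would additionally need a NetCDF reader such as netCDF4, which is omitted here.

```python
import os
import tempfile

def discover_coverages(project_dir):
    """Map every NetCDF file in the project folder to a coverage name."""
    return {
        os.path.splitext(name)[0]: os.path.join(project_dir, name)
        for name in sorted(os.listdir(project_dir))
        if name.endswith(".nc")
    }

# Demonstrate with a throwaway project folder and empty placeholder files.
with tempfile.TemporaryDirectory() as project:
    for fname in ("ozone.nc", "aerosol.nc", "README.txt"):
        open(os.path.join(project, fname), "w").close()
    coverages = discover_coverages(project)
    print(sorted(coverages))  # ['aerosol', 'ozone']
```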
Serving data from periodic collection of NetCDF files
Sometimes you have accumulated a huge number of small NetCDF files, such as daily slices from a model output. You could combine them into one big cube, but for a terabyte of files that may not be an option. The HTAP demo provider is an example of how to serve such a collection directly; look at HTAP_wcs.py and HTAP_config.py.
To try it out yourself, open the download page in another tab and get custom-netcdf-1.2.3.zip or later. It contains the folders GEMAQ-v1p0, GEOSChem-v45, and web; web contains the code, and the other folders contain the data. Copy these folders into /OWS so that the "web" folder is merged on top of the existing "web". Under web/static, HTAP is the custom grid demo; CIRA is the point demo and is irrelevant here.
The custom handler does three things:

- It gets a single datetime from the TimeSequence parameter; time ranges are reported as an error.
- It finds the daily NetCDF file by that datetime.
- It redirects the extractor to get the subcube from that file.

To achieve this, it uses standard object-oriented inheritance and method overriding.
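The three steps can be sketched as a subclass overriding the file lookup of a generic handler. All class and method names below are invented for illustration; the actual base classes and hooks live in HTAP_wcs.py and the wrapper code, and the TimeSequence format shown is an assumption.

```python
import datetime
import os

class NetCDFHandler:
    """Stand-in for the generic single-file handler in the wrapper."""
    def extract(self, path, point_in_time):
        # The real extractor would cut the requested subcube from `path`.
        return (path, point_in_time)

class DailyFilesHandler(NetCDFHandler):
    """Override the file lookup: one NetCDF file per day."""
    def __init__(self, data_dir):
        self.data_dir = data_dir

    def parse_time(self, time_sequence):
        # Step 1: get a single datetime from the TimeSequence parameter.
        if "/" in time_sequence:  # "start/end" would be a time range
            raise ValueError("time ranges are reported as an error")
        return datetime.datetime.strptime(time_sequence, "%Y-%m-%dT%H:%M:%SZ")

    def get_subcube(self, time_sequence):
        when = self.parse_time(time_sequence)
        # Step 2: find the daily file by the datetime (naming is assumed).
        path = os.path.join(self.data_dir, when.strftime("%Y%m%d") + ".nc")
        # Step 3: redirect the inherited extractor to that file.
        return self.extract(path, when)

handler = DailyFilesHandler("/OWS/web/static/HTAP")
print(handler.get_subcube("2010-10-08T00:00:00Z"))
```

Only the lookup logic is overridden; the subcube extraction itself is inherited unchanged, which is what keeps the custom handler small.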