Custom Coverage Processor for Cubes

From Earth Science Information Partners (ESIP)

Description: A custom module for cube data storage that is not supported directly.

Example: [http://128.252.202.19:8080/static/HTAP/HTAP_wcs.py HTAP_wcs.py custom WCS module]


Related Links

[http://128.252.202.19:8080/static/HTAP/HTAP_wcs.py HTAP custom processor]


Term Details


The HTAP demo service stores data in daily netCDF-CF files. The custom module must:

  • Check that only one datetime in the TimeSequence is used, because getting a time series from separate files is not supported.
  • Locate the correct file. Normally the file name is just the coverage identifier with a '.nc' extension. In this case the file name template is GEMAQ-v1p0_SR1_sfc_%(year)s_%(doy)s.nc, where year and doy (the Julian day of year) are substituted; see the example below.
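
For illustration, here is how that file name template expands for one example date, 1 June 2001 (day of year 152); the date is arbitrary, and only the template string comes from the HTAP demo service:

    template = 'GEMAQ-v1p0_SR1_sfc_%(year)s_%(doy)s.nc'
    print(template % {'year': '2001', 'doy': '152'})
    # prints: GEMAQ-v1p0_SR1_sfc_2001_152.nc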

HTAP_wcs.py implements this by inheriting the default netCDF-CF processor and overriding its _input_file method.

   def _input_file(self, query):

Check that the query has exactly one datetime.

       if len(query.time) != 1:
           raise owsutil.OwsError(
               'CustomError',
               'time parameter must match exactly one datetime')

Get the single datetime.

       datetimes = query.time.all_datetimes()
       dt = datetimes[0]

Now get the file template from the configuration.

        metadata = self._load_config(query)[query.identifier]
        files = metadata['files']
        ncfile = files['template'] % {'year': str(dt.year),
                                      'doy': str(iso_time.day_of_year(dt)).zfill(3)}

Now that we have the file name, find the source folder and return the full file path.

        # Walk up from this module's location until the 'web' folder is found.
        src_root, last = os.path.split(__file__)
        while last != 'web':
            src_root, last = os.path.split(src_root)
        return os.path.join(src_root, files['path'], ncfile)
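
To see what the walk-up loop does, the fragment below runs the same logic on a hypothetical module path; only the loop itself mirrors the code above, and the '/srv/web/...' location is an assumption made for the illustration:

    import os

    path = '/srv/web/static/HTAP/HTAP_wcs.py'   # hypothetical install location
    src_root, last = os.path.split(path)
    while last != 'web':
        src_root, last = os.path.split(src_root)
    print(src_root)   # prints: /srv  (the parent of the 'web' folder)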

By writing a custom processor, anything can be used as a data source.
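
As a closing illustration, here is a self-contained sketch that assembles the pieces above into something runnable. It is not the real HTAP_wcs.py: the base class is only a stub, the configuration entry, the query object, and the '/srv' root are invented for the example, and the framework helpers owsutil.OwsError and iso_time.day_of_year are replaced with plain-Python equivalents.

    import datetime
    import os

    class DefaultNetcdfCFProcessor(object):
        """Stand-in for the server's default netCDF-CF processor."""
        def _load_config(self, query):
            # The real processor reads the service configuration; here one
            # hypothetical coverage entry is hard-coded.
            return {'GEMAQ-v1p0_SR1_sfc': {
                'files': {'template': 'GEMAQ-v1p0_SR1_sfc_%(year)s_%(doy)s.nc',
                          'path': 'web/static/HTAP/data'}}}

    class HTAPProcessor(DefaultNetcdfCFProcessor):
        """Overrides _input_file so one datetime maps to one daily file."""
        def _input_file(self, query):
            if len(query.times) != 1:
                # The real module raises owsutil.OwsError here.
                raise ValueError('time parameter must match exactly one datetime')
            dt = query.times[0]
            metadata = self._load_config(query)[query.identifier]
            files = metadata['files']
            doy = dt.timetuple().tm_yday   # plain stand-in for iso_time.day_of_year
            ncfile = files['template'] % {'year': str(dt.year),
                                          'doy': str(doy).zfill(3)}
            return os.path.join('/srv', files['path'], ncfile)

    class Query(object):
        """Minimal stand-in for the WCS query object used above."""
        def __init__(self, identifier, times):
            self.identifier, self.times = identifier, times

    q = Query('GEMAQ-v1p0_SR1_sfc', [datetime.datetime(2001, 6, 1)])
    print(HTAPProcessor()._input_file(q))
    # prints: /srv/web/static/HTAP/data/GEMAQ-v1p0_SR1_sfc_2001_152.nc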