Back to [[WCS_Wrapper| WCS Wrapper]]
 
[http://sourceforge.net/p/aq-ogc-services/home/ Project on SourceForge]
 
Questions and comments should go to [http://sourceforge.net/p/aq-ogc-services/discussion/ sourceforge discussions], bug reports to [http://sourceforge.net/p/aq-ogc-services/tickets/ sourceforge tickets]. Urgent issues can be directed to Kari Hoijarvi at 314-935-6099 (w) or 314-843-6436 (h).

This page describes how to set up the information intended for human visitors: home pages, contact information and so on.
  
For data configuration, go to the following pages:

* [[WCS_Wrapper_Configuration_for_Cubes| Configuring NetCDF based Cube Data]]
* [[WCS_Wrapper_Configuration_for_Point_Data| Configuring SQL based Point Data]]
  
 
== Structure of OWS/web ==
 
 
'''OWS/web''' is for system developers only.
 
OWS/web/'''static''' contains static web content. You can put any documentation here and it will be served as a web page or download. The home page '''index.html''' is pretty much mandatory, and you should change '''favicon.ico''' to reflect your organization. We highly recommend that you customize these documents for your WCS service.
  
 
OWS/web/static/'''cache''' is a folder for temporary files. The service uses it for output files. Anything you put there will be deleted when space is needed.
 
The installation contains two example datasets, OWS/web/static/'''testprovider''' and OWS/web/static/'''point'''. The testprovider is a demo NetCDF dataset; point is an example of how to serve point data from a SQL database. Every service will have a folder with the same name here, so create a folder OWS/web/static/'''myservice''' and your service URL will be http://localhost:8080/myservice
 
 
You may now check the provider page [http://localhost:8080/testprovider http://localhost:8080/testprovider], which is served as a static file. Any file under '''static''' becomes accessible.
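
For example, a new provider folder with a bare-bones front page can be set up with a short script along these lines (a sketch only; the installation path, the provider name '''myservice''' and the page content are assumptions):

    # sketch: create a new provider folder and a minimal front page
    import os
    folder = '/OWS/web/static/myservice'          # assumed installation path
    if not os.path.isdir(folder):
        os.makedirs(folder)
    page = open(os.path.join(folder, 'index.html'), 'w')
    page.write('<html><body><h1>My WCS service</h1></body></html>\n')
    page.close()
    # the page should now be served from http://localhost:8080/myservice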
 
  
 
== The Human Interface: Create the index.html Front Pages for Visitors. ==
 
If no query is present, the server serves the default page '''index.html'''. You should create these pages for your server and for all of the providers.
  
The server index.html is at '''OWS/web/static/index.html'''; it is what you see at http://128.252.202.19:8080/ on the test server, or at [http://localhost:8080/ localhost] if you have a server running locally.
  
Every provider folder should also have an index.html, such as '''OWS/web/static/testprovider/index.html''', which you can see at http://128.252.202.19:8080/HTAP on the test server or at [http://localhost:8080/testprovider localhost].
  
== Enter Provider Metadata ==
  
Every provider should have a '''wcs_capabilities.conf''' that lists keywords and contact information. The format is simple; copy the one from testprovider and edit it.
    # this file provides some information about the provider
    # and is incorporated into the respective WCS responses.
    # all currently available field identifiers are listed below.
    # please define every identifier only once.
    # other identifiers will be ignored, input is case sensitive.
    # the format is always <identifier>: <value>.
    # whitespaces before and after <value> will be stripped.
    # KEYWORDS can take a comma separated list that will then be
    # included in the respective keyword tags
    # empty lines and lines starting with "#" will be ignored.
    PROVIDER_TITLE: National Climate Data Center
    PROVIDER_ABSTRACT: National Climate Data Center is the worlds largest archive of climate data.
    KEYWORDS: Domain:Aerosol, Platform:Network, Instrument:Unknown, DataType:Point, Distributor:DataFed, Originator:NCDC, TimeRes:Minute, Vertical:Surface, TopicCategory:climatologyMeteorologyAtmosphere
    FEES: NONE
    CONSTRAINTS: NONE
    PROVIDER_SITE: http://lwf.ncdc.noaa.gov/oa/ncdc.html
    CONTACT_INDIVIDUAL: Climate Contact, Climate Services Branch, National Climatic Data Center
    CONTACT_PHONE: 828-271-4800
    CONTACT_STREET: 151 Patton Avenue Room 468
    CONTACT_CITY: Asheville
    CONTACT_ADM_AREA: North Carolina
    CONTACT_POSTCODE: 28801-5001
    CONTACT_COUNTRY: USA
    CONTACT_EMAIL: ncdc.info@noaa.gov
  
Here is the real live NCDC [http://128.252.202.19:8080/static/NCDC/wcs_capabilities.conf wcs_capabilities.conf]
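
The service reads this file itself; purely as an illustration of the format described in the comments above, a reader for it could look like the following sketch (the function name and the first-definition-wins behaviour are assumptions):

    # illustration only: read a wcs_capabilities.conf into a dictionary
    def read_capabilities_conf(path):
        info = {}
        for raw in open(path):
            line = raw.strip()
            if not line or line.startswith('#'):
                continue                             # empty lines and comments are ignored
            key, sep, value = line.partition(':')    # format is <identifier>: <value>
            key = key.strip()
            if sep and key not in info:              # assumption: first definition wins
                info[key] = value.strip()            # whitespace around <value> is stripped
        if 'KEYWORDS' in info:                       # KEYWORDS is a comma separated list
            info['KEYWORDS'] = [k.strip() for k in info['KEYWORDS'].split(',')]
        return info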
  
== Windows Implementation Issue ==

'''Important:''' There is a bug deep in the Python core libraries that makes serving text files a little special. The files need to be encoded with the Unix-style line ending convention '\n' instead of the Windows-style '\r\n'.

To fix this, issue the command:

    python /OWS/web/owsadmin.py unix_nl filename.html

for every text file you serve. Binary files are unaffected.

== Serving data from periodic collection of NetCDF files ==

Sometimes you have accumulated a huge number of small NetCDF files, like daily slices from a model output. You could combine those into one big cube, but for a terabyte of files that may not be an option.
 
 
Download our HTAP test package [http://sourceforge.net/downloads/aq-ogc-services/ows/custom-netcdf-1.2.0.zip/ custom-netcdf-1.2.0.zip]. It only has two days of data to keep the download small. Then read the [http://localhost:8080/HTAP custom provider page].
 
 
 
== Storing Point Data in a Relational Database ==
 
 
 
Provider [http://localhost:8080/point point] is an example of how to configure this service to use a SQL database to serve point data.
 
 
 
Point data is often stored in SQL databases. There is no standard schema comparable to the CF-1.0 convention for NetCDF files, so it is not possible to just connect and start serving; you have to create a configuration description.
 
 
 
Therefore, the WCS query processor needs to know what to select and join. This information must be edited into the configuration script.
 
 
 
=== Notes on SQL ===
 
 
 
One of the most powerful ideas in relational database design is the concept of a view. You don't need to change the existing data tables; creating a view that makes your database look like the one needed is usually enough. This is by far the easiest way to configure your WCS.
 
 
 
It is better to design a normalized schema and only optimize once benchmarks are available. In particular, filtering small lat/lon ranges is much more efficient on a normalized location table than on a denormalized data table.
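
As a sketch of why that matters, a bounding box can be evaluated against the small location table before the big data table is touched (the table and column names below are assumptions, not the demo schema):

    # sketch: restrict the lat/lon box on the normalized location table first
    import sqlite3
    conn = sqlite3.connect('pntdata.db')             # path is an assumption
    sql = ("select d.loc_code, d.datetime, d.TEMP "
           "from location l "
           "join data d on d.loc_code = l.loc_code "
           "where l.lat between ? and ? and l.lon between ? and ?")
    for row in conn.execute(sql, (34, 44, -90, -80)):
        print(row)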
 
 
 
=== Location Table ===
 
 
 
The one thing the different databases have in common is that they all need a location table.
 
 
 
    table location
    +----------+-------+---------+-------+
    | loc_code | lat   | lon     | elev  |
    +----------+-------+---------+-------+
    | KMOD     | 37.63 | -120.95 |  30.0 |
    | KSTL     | 38.75 |  -90.37 | 172.0 |
    | KUGN     | 42.42 |  -87.87 | 222.0 |
    | ...      |       |         |       |
    +----------+-------+---------+-------+
 
 
 
Here loc_code is the primary key and lat, lon is the location. Optional fields can be added. The CIRA VIEWS database has a location table, but it is called '''Site''' and it spells out '''Longitude''' in full. The datafed browser uses the standard names loc_code, loc_name, lat and lon for browsing; for plug-and-play compatibility we recommend using these names. In the CIRA VIEWS database, the view creation would be:
 
 
 
    create view location as
    select
        SiteCode as loc_code,
        Latitude as lat,
        Longitude as lon
    from Site
 
 
 
The primary key is loc_code, which is unique across all locations.
 
 
 
If the fields have different names, they can be aliased in the configuration.
 
 
 
== Some Different DB Schema Types ==
 
 
 
In this documentation, three different schemas are presented. Each of them has strengths and weaknesses.
 
 
 
=== One Big Data Table ===
 
 
 
In this case, all the data is in the same table:
 
 
 
    +----------+------------+------+------+------+
    | loc_code | datetime   | TEMP | DEWP | VIS  |
    +----------+------------+------+------+------+
    | KMOD     | 2009-06-01 | 87.8 | 51.4 | 10   |
    | KMOD     | 2009-06-02 | 82.3 | 51.4 | NULL |
    | KSTL     | 2009-06-01 | 78.6 | 34.9 | 18   |
    | ...      |            |      |      |      |
    +----------+------------+------+------+------+
 
 
 
The foreign key to the location table is loc_code. The primary key is (loc_code, datetime).
 
 
 
'''Strengths:''' Simple; no joins needed when querying all the fields.
 
 
 
'''Downsides:''' Needs nulls for missing data; querying just one field is inefficient.
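
In sqlite, for example, this layout could be declared roughly as follows (a sketch; the names simply mirror the example tables above):

    # sketch of the "one big data table" layout
    import sqlite3
    conn = sqlite3.connect(':memory:')
    conn.executescript("""
        create table location (
            loc_code text primary key,
            lat real, lon real, elev real);
        create table data (
            loc_code text references location(loc_code),
            datetime text,
            TEMP real, DEWP real, VIS real,
            primary key (loc_code, datetime));
        """)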
 
 
 
=== Long And Skinny Table ===
 
 
 
In this case, all the data is again in a single table, with one row per value:
 
 
 
    +----------+------------+------+-------+
    | loc_code | datetime   | data | param |
    +----------+------------+------+-------+
    | KMOD     | 2009-06-01 | 87.8 | TEMP  |
    | KMOD     | 2009-06-02 | 82.3 | TEMP  |
    | KSTL     | 2009-06-01 | 78.6 | TEMP  |
    | KMOD     | 2009-06-01 | 51.4 | DEWP  |
    | KMOD     | 2009-06-02 | 51.4 | DEWP  |
    | KSTL     | 2009-06-01 | 34.9 | DEWP  |
    | KMOD     | 2009-06-01 | 10   | VIS   |
    | KMOD     | 2009-06-02 | 10   | VIS   |
    | KSTL     | 2009-06-01 | 18   | VIS   |
    | ...      |            |      |       |
    +----------+------------+------+-------+
 
 
 
'''Strengths:''' No nulls; easy to add fields.
 
 
 
'''Downsides:''' Querying requires extra filtering on the parameter column; slower than the other layouts.
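
For example, selecting a single parameter from this layout always needs the extra filter on the param column (a sketch using the example rows above):

    # sketch: single-parameter query against the long-and-skinny layout
    import sqlite3
    conn = sqlite3.connect(':memory:')
    conn.execute("create table data (loc_code text, datetime text, data real, param text)")
    conn.executemany("insert into data values (?, ?, ?, ?)",
                     [('KMOD', '2009-06-01', 87.8, 'TEMP'),
                      ('KMOD', '2009-06-01', 51.4, 'DEWP')])
    rows = conn.execute("select loc_code, datetime, data from data "
                        "where param = 'TEMP'").fetchall()
    print(rows)        # one row: ('KMOD', '2009-06-01', 87.8)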
 
 
 
=== One Data Table For Each Param ===
 
 
 
Each parameter has its own data table. In this case there is no need for nulls, and this is the fastest layout for a single-parameter query.
 
 
 
    +----------+------------+------+
    | loc_code | datetime   | TEMP |
    +----------+------------+------+
    | KMOD     | 2009-06-01 | 87.8 |
    | KMOD     | 2009-06-02 | 82.3 |
    | KSTL     | 2009-06-01 | 78.6 |
    | ...      |            |      |
    +----------+------------+------+

    +----------+------------+------+
    | loc_code | datetime   | DEWP |
    +----------+------------+------+
    | KMOD     | 2009-06-01 | 51.4 |
    | KMOD     | 2009-06-02 | 51.4 |
    | KSTL     | 2009-06-01 | 34.9 |
    | ...      |            |      |
    +----------+------------+------+

    +----------+------------+-----+
    | loc_code | datetime   | VIS |
    +----------+------------+-----+
    | KMOD     | 2009-06-01 | 10  |
    | KMOD     | 2009-06-02 | 10  |
    | KSTL     | 2009-06-01 | 18  |
    | ...      |            |     |
    +----------+------------+-----+
 
 
 
 
 
'''Strengths:''' No nulls; easy to add tables; easy to add heterogeneous flag fields; fastest queries for a single parameter.
 
 
 
'''Downsides:''' More tables; querying all the parameters at once requires a massive join.
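
For example, fetching TEMP, DEWP and VIS together means one join per parameter table (a sketch; it is driven by TEMP, so a full outer join would be needed to also keep dates that only have DEWP or VIS):

    # sketch: querying all parameters at once from per-parameter tables
    sql = """
        select l.loc_code, t.datetime, t.TEMP, d.DEWP, v.VIS
        from location l
        join TEMP t on t.loc_code = l.loc_code
        left join DEWP d on d.loc_code = t.loc_code and d.datetime = t.datetime
        left join VIS  v on v.loc_code = t.loc_code and v.datetime = t.datetime
        where t.datetime = '2009-06-01'
        """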
 
 
 
== Configuring the WCS using SQL Views ==
 
 
 
This is demonstrated in the test provider '''point'''.
 
 
 
The demonstration uses sqlite, which is distributed with Python by default. The project has the following files:
 
 
 
* '''pntdata.py''': This script creates the test database and fills it with dummy data.
* '''pntdata.db''': The sqlite database file created by pntdata.py (see the inspection sketch after this list).
* '''point_config.py'''
** WCS coverage information.
** Mapping of the coverages and fields to SQL tables.
* '''point_WCS.py'''
** Loads the metadata. In this demo version, this is done by hardcoding the tables in point_config.py. In CIRA/VIEWS this is done by querying the parameter table.
** Gets the database connection. The metadata mappings allow the service to generate SQL on its own.
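
A quick way to see what pntdata.py actually created is to list the tables and views in the database (a sketch; the path is an assumption):

    # sketch: inspect the demo database
    import sqlite3
    conn = sqlite3.connect('/OWS/web/static/point/pntdata.db')   # assumed path
    rows = conn.execute("select type, name from sqlite_master "
                        "where type in ('table', 'view') order by name")
    for row in rows:
        print(row)
    conn.close()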
 
 
 
 
 
=== Contents of point_config.py ===
 
 
 
All of these are Python dictionaries and lists.
 
 
 
The syntax is simple. This is a list:
 
 
 
    ['loc_code', 'lat', 'lon']
 
 
 
This is a dictionary:
 
 
 
    {'key1':'value1', 'key2':'value2'}
 
 
 
Since these are just Python objects, they can also be generated from a database.
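
For example, the 'fields' part of the configuration could be built from a parameter table instead of being typed in by hand (a sketch; the Parameter table and its columns here are hypothetical, loosely following the CIRA/VIEWS idea mentioned above):

    # sketch: build the per-field configuration from a (hypothetical) parameter table
    import sqlite3
    def load_fields(db_path):
        fields = {}
        conn = sqlite3.connect(db_path)
        for code, title, units in conn.execute(
                "select ParamCode, ParamName, Units from Parameter"):
            fields[code] = {
                'Title': title,
                'datatype': 'float',                 # assumption: every parameter is a float
                'units': units,
                }
        conn.close()
        return fields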
 
 
 
==== Location Table Configuration ====
 
 
 
First, the location table for coverage SURF_MET.
 
 
 
    location_info = {
        'location':{
 
 
 
The locations_file is just text in CSV format. '''Notice''': this will be replaced by WFS, the Web Feature Service, in the near future. The location table is a typical static feature, well suited to WFS.
 
 
 
            'locations_file':'SURF_MET_locations.csv',
 
 
 
In the '''CIRA/VIEWS''' database, we are not authorized to create a view, so we need to map the 'Site' table and its columns.
 
 
 
            'table_alias':'Site',
            'columns':{
                'loc_code':{'column_alias':'SiteCode'},
                'lat':{'column_alias':'Latitude'},
                'lon':{'column_alias':'Longitude'},
                }
            },
        }
 
 
 
 
 
==== Data Table Configuration using SQL View ====
 
 
 
    point_info = {
 
 
 
The first key is the coverage and its descriptions:
 
 
 
        'SURF_MET':
            {
                'Title':'Surface Meteorological Observations',
                'Abstract':'Dummy test data.',
 
 
 
Next come the covered area and time. The time dimension is a true dimension here, but in contrast to grid data, the X-Y dimensions for point data are not dimensions; they are attributes of the location dimension.
 
 
 
                'axes':{
                    'X':(-180, 179.75),
                    'Y':(-90, 89.383),
                    'T':iso_time.parse('2009-09-01T12:00:00/2009-09-03T12:00:00/PT1H'),
                    },
 
 
 
Then comes the description of the fields.  
 
 
 
 
 
                'fields':{
                    'TEMP':{
                        'Title':'Temperature',
                        'datatype': 'float',
                        'units':'deg F',
 
 
 
The location table is a real dimension. In this case the location table is shared, so we use the previously declared variable 'location_info'. If the location tables are parameter specific, they can be specified individually.
 
 
 
                        'axes':location_info,
 
 
 
The access instructions. This configuration uses 'complete_view', so the administrator has created a view that joins together the location table and the temperature data table. The SQL query will typically look like '''select loc_code, lat, lon, datetime, temp, flag from TEMP_V where datetime = '2009-09-01' and (lat between 34 and 44) and (lon between -90 and -80)'''. This is by far the easiest way to configure the WCS.
 
 
 
                        'complete_view':{
                            'view_alias':'TEMP_V',
                            'columns':['loc_code','lat','lon','datetime','temp', 'flag'],
                            },
                        },
 
 
 
The end of the first field.
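
For reference, a complete view like TEMP_V can be created along these lines (a sketch only; the base table names are assumptions, and the demo database built by pntdata.py already contains whatever view the configuration refers to):

    # sketch: a complete view joining the location table to a TEMP data table
    import sqlite3
    conn = sqlite3.connect('pntdata.db')             # path is an assumption
    conn.execute("""
        create view if not exists TEMP_V as
        select l.loc_code, l.lat, l.lon, t.datetime, t.temp, t.flag
        from TEMP_data t
        join location l on l.loc_code = t.loc_code
        """)
    conn.close()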
 
 
 
 
 
==== Data Table Configuration by Mapping Original Tables ====
 
 
 
For demonstration purposes, the next field is configured without a view.
 
 
 
                    'DEWP':{
                        'Title':'Dewpoint',
                        'datatype': 'float',
                        'units':'deg F',
                        'axes':location_info,
 
 
 
First, we tell which table contains the data.
 
 
 
                        'table_alias':'DEWP_base',
 
 
 
In the '''CIRA/VIEWS''' database, again, we have to map the original table:
 
 
 
                        'datetime_alias':'FactDate',
 
 
 
We need to join the Site table for locations, and since all the data is in the same table, we also need to filter by the parameter code, which requires joining the parameter table. In the point demo the SQL processor simply writes 'inner join DEWP_base on DEWP_base.loc_code = location.loc_code', but CIRA/VIEWS uses a generated SiteID as the foreign key, so the join must be written explicitly.
 
 
 
                        'joins':(
                            'inner join AirFact3 on AirFact3.SiteID = Site.SiteID ' +
                            'inner join Parameter on AirFact3.ParamID = Parameter.ParamID'),
 
 
 
The default data filter:
 
 
 
 
 
                        'common_data_filter':(
                            'AggregationID = 1 and Site.Latitude is not null and Site.Longitude is not null ' +
                            'and AirFact3.FactValue <> -999 ' +
                            "and AirFact3.ProgramID in (10001, 10005, 20002) -- ('INA', 'IMPPRE', 'ASPD') " +
                            "and Parameter.ParamCode = 'MF'"),
                        'data_columns':[
                            {'name':'MF', 'column_alias':'FactValue'},
                            ],
 
 
 
In the point demo database, the above is not necessary since the fields have defaults:
 
 
 
                        'data_columns':[
                            {'name':'dewp'},
                            ],
                        },
                    },
            },
        }
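
Once the configuration is in place, the kind of query the service generates for the complete_view case can also be tried by hand (a sketch; the path and the datetime format stored in the demo database are assumptions):

    # sketch: run the typical complete_view query against the demo database
    import sqlite3
    conn = sqlite3.connect('/OWS/web/static/point/pntdata.db')   # assumed path
    sql = ("select loc_code, lat, lon, datetime, temp, flag from TEMP_V "
           "where datetime = ? "
           "and (lat between ? and ?) and (lon between ? and ?)")
    for row in conn.execute(sql, ('2009-09-01 12:00:00', 34, 44, -90, -80)):
        print(row)
    conn.close()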
 
