Difference between revisions of "Glossary"

From Earth Science Information Partners (ESIP)

[[WCS_Access_to_netCDF_Files| Back to WCS Wrapper]]

Glossary for Common Terms and Standard Names in the Datafed WCS Wrapper Framework

__TOC__

[[WCS General Intro]]

[[WCS Server StationPoint Data]], [[:Category:StationPoint]]

[[WCS Server Cube Data]], [[:Category:Cube]]

[[WCS Mediators and Clients]], [[:Category:Mediator]] or [[:Category:Client]]

[[AQ Infrastructure]]

== AQ_uFIND ==

A front end to the GEOSS Clearinghouse. Currently it can be used to find WMS services.

Example use: [http://webapps.datafed.net/AQ_uFIND.aspx?datatype=point AQ_uFIND.aspx?datatype=point]

== Capabilities Processor ==
 
 
This component creates the standard XML documents for WCS and WFS services.
 
 
 
It operates on Metadata and data configuration. From the Metadata, the capabilities document gets its Title, Abstract, Keywords, contact person, and so on. From the data configuration, the processor gets the full information for each coverage.
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Cubes WCS Wrapper Configuration for Cubes]
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Point_Data WCS Wrapper Configuration for Point Data]
 
 
 
== Coverage Processor ==
 
 
 
The Coverage Processor is a component that performs three different activities:

* WCS Query Parser. The syntax is checked and the output is a binary object holding all the query elements.
* Subsetter. This component finds the desired coverage and applies filters to read a subset of the data:
** Fields: A client querying wind data may be interested in speed and direction, but not air pressure.
** Bounding Box: restricts the response to a certain geographical area.
** Time: default time, one time, a list of times, or a periodic range of times.
** Grid size and interpolation: high-resolution data can be interpolated to a lower resolution.
** By dimension: select only one or some wavelengths, elevations, or locations.
* Formatter. The binary data is returned in the desired format. Currently supported are NetCDF-CF for cubes and CSV (Comma-Separated Values) for points.
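The three stages above can be sketched in miniature with plain Python. This is illustrative only: the record layout, function names, and CSV details are assumptions, not the framework's actual API.

```python
# Sketch of the three Coverage Processor stages: parse -> subset -> format.
# The dataset layout and function names are invented for this example.
from urllib.parse import parse_qs

def parse_query(query_string):
    """WCS Query Parser: check syntax, return one object with all query elements."""
    kvp = {k.lower(): v[0] for k, v in parse_qs(query_string).items()}
    if kvp.get("request") != "GetCoverage":
        raise ValueError("only GetCoverage is sketched here")
    return {
        "coverage": kvp["coverage"],
        "fields": kvp["fields"].split(",") if "fields" in kvp else None,
        "bbox": [float(x) for x in kvp["bbox"].split(",")] if "bbox" in kvp else None,
    }

def subset(records, query):
    """Subsetter: apply the field and bounding-box filters."""
    out = []
    for rec in records:
        if query["bbox"]:
            min_lon, min_lat, max_lon, max_lat = query["bbox"]
            if not (min_lon <= rec["lon"] <= max_lon and min_lat <= rec["lat"] <= max_lat):
                continue
        keep = query["fields"] or [k for k in rec if k not in ("lat", "lon")]
        out.append({k: rec[k] for k in ("lat", "lon")} | {k: rec[k] for k in keep})
    return out

def format_csv(records):
    """Formatter: return the subset as CSV text."""
    if not records:
        return ""
    cols = list(records[0])
    lines = [",".join(cols)]
    lines += [",".join(str(r[c]) for c in cols) for r in records]
    return "\n".join(lines)

data = [
    {"lat": 44.4, "lon": -68.3, "speed": 5.0, "direction": 270},
    {"lat": 37.7, "lon": -119.6, "speed": 2.5, "direction": 90},
]
q = parse_query("request=GetCoverage&coverage=wind&fields=speed&bbox=-70,40,-60,50")
print(format_csv(subset(data, q)))  # lat,lon,speed / 44.4,-68.3,5.0
```

Only the first station falls inside the bounding box, and only the requested speed field is returned alongside the coordinates.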
 
 
 
The NetCDF-CF based processor is completely generic: it works with any compliant netCDF-CF file.

SQL processors can either be configured for supported DB schema types, or custom-written for esoteric databases.

By writing a custom processor, anything can be used as a data source.
 
 
 
== Cube Data Configuration ==
 
 
 
For standard netCDF-CF files, the configuration is automatic. Each file becomes a coverage, and each variable becomes a field. This is by far the easiest way to create a WCS service. Examples are [http://128.252.202.19:8080/static/testprovider/index.html testprovider], which comes with the installation package, and [http://128.252.202.19:8080/static/NASA/index.html NASA], which serves some datasets downloaded from NASA.
 
 
 
For daily netCDF-CF files it is possible to create a service without compiling them into a single file. See [http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Cubes#Serving_data_from_periodic_collection_of_NetCDF_files Serving data from periodic collection of NetCDF files] for an example.
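The core of serving a periodic collection is mapping a requested time step to the file that holds it. A tiny sketch of that idea, assuming a hypothetical <code>aod_YYYYMMDD.nc</code> naming pattern (the real mechanism is configured as described on that page):

```python
# Illustrative only: resolve requested dates to files in a daily
# netCDF collection with a hypothetical "aod_YYYYMMDD.nc" pattern.
from datetime import date, timedelta

def daily_files(template, start, end):
    """Yield the file name for every day in [start, end]."""
    day = start
    while day <= end:
        yield day.strftime(template)
        day += timedelta(days=1)

def file_for_time(template, when):
    """Map one requested date to the file that holds it."""
    return when.strftime(template)

print(file_for_time("aod_%Y%m%d.nc", date(2010, 5, 3)))  # aod_20100503.nc
```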
 
 
 
By creating a custom handler, it is possible to store data anywhere.
 
 
 
== Datafed Browser ==
 
 
 
TODO: describe classic browser
 
 
 
TODO: describe browsing WCS without df catalog http://webapps.datafed.net/datafed.aspx?wcs=http://128.252.202.19:8080/CIRA&coverage=VIEWS&param_abbr=SO4f
 
 
 
TODO: Describe GE Plugin browser
 
 
 
== Feature Processor ==
 
 
 
Web Feature Service (WFS) is well suited to publishing geographic information that does not change over time.

The datafed WCS uses it to publish the location table for point data, because the WCS DescribeCoverage document does not support such rich dimensions well, and location tables are static geographic information.

This component performs three different activities:
 
 
 
* WFS Query Parser. The syntax is checked and the output is a binary object holding all the query elements.
 
 
 
* Subsetter.
** Each field may have a different location table. If the data is sparse, so that some fields have data at only a few locations, it makes sense to return only those locations.
** Locations may also be filtered by a geographic bounding box.
** Other WFS filters are not implemented.

* Formatter. The data is returned in the desired format. Currently the only supported format is CSV (Comma-Separated Values).
 
 
 
== GEOSS Clearinghouse ==
 
 
 
The Clearinghouse is a component of the [http://www.earthobservations.org/gci_gci.shtml GEOSS Common Infrastructure]. One of its functions is the [http://www.earthobservations.org/gci_cr.shtml GEOSS Components and Services Registry].
 
 
 
== Google Earth ==
 
 
 
http://earth.google.com/
 
 
 
TODO: describe standalone and plugin
 
 
 
describe images and static points
 
 
 
describe dynamic points
 
 
 
== ISO 19115 Metadata ==
 
 
 
A description of a service, with a strictly defined XML representation. It contains service URLs and metadata about the service.
 
 
 
== ISO 19115 Maker ==
 
 
 
A public service to create an ISO 19115 record from a WCS or WMS service.
 
 
 
If the Capabilities document contains the necessary keywords, the document can be created automatically:

[http://webapps.datafed.net/geoss.wsfl?from_wms=http:%2f%2fwebapps.datafed.net%2fAIRNOW.ogc%3fservice%3dWMS%26version%3d1.1.1%26request%3dGetCapabilities&layer=pmfine ISO 19115 for AIRNOW pmfine WMS].

If the keywords are missing, the metadata can be passed via URL parameters instead.
 
 
 
== KML Keyhole Markup Language ==
 
 
 
KML is the way to describe content in Google Earth and Google Maps. The [http://code.google.com/apis/kml/documentation/index.html KML documentation] is hosted by Google.
 
 
 
== KML Maker ==
 
 
 
Datafed tools produce KML directly from data, which can be served by WCS or WMS services.
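For reference, here is a minimal, self-contained sketch of the kind of KML document such a tool emits for point data. This is not the datafed KML Maker's own code; the <code>loc_code</code>/<code>lat</code>/<code>lon</code> record shape simply follows the location-table convention used elsewhere on this page.

```python
# Minimal sketch: turn point observations into KML Placemarks.
# Note that KML coordinates are ordered lon,lat[,alt].
from xml.sax.saxutils import escape

def points_to_kml(points):
    placemarks = []
    for p in points:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(p['loc_code'])}</name>\n"
            "    <Point>\n"
            f"      <coordinates>{p['lon']},{p['lat']},0</coordinates>\n"
            "    </Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>"
    )

kml = points_to_kml([{"loc_code": "ACAD", "lat": 44.37, "lon": -68.26}])
```

The resulting string can be saved as a <code>.kml</code> file (or zipped into a <code>.kmz</code>) and opened directly in Google Earth.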
 
 
 
[http://webapps.datafed.net/ge.aspx?dataset_abbr=cov_54456&view_id=map&field=SO4f&param_abbr=SO4f&scale_max=5&view_scale=0.25&kml_header=http%3a%2f%2fwebapps.datafed.net%2fviews%2fconsoles%2fpnt_hdr.xml&pr=static&time_min=2006-08-03&time_max=2006-09-01&time_step=P3D KML from a CIRA/VIEWS showing SO4f] and [http://webapps.datafed.net/loop.wsfl?init=cgi.wsfl%3fview_state%3dconsoles%252fpnt_hdr%26dataset_abbr%3dcov_54456%26param_abbr%3dSO4f%26name%3dcov_54456.SO4f%252b2006-08-03%252b-%252b2006-09-01%26export_filename%3dcov_54456.SO4f.kmz%26export_format%3dkmz&calc=cgi.wsfl%3fdataset_abbr%3dcov_54456%26view_id%3dmap%26export_type%3dpoint%26ignore_cache%3dtrue&accu=kml_merge_timeseries.wsfl&done=mark_kml_header.wsfl%3fbacklink%3dhttp%253a%252f%252fwebapps.datafed.net%252fge.aspx%253fdataset_abbr%253dcov_54456%2526view_id%253dmap%2526field%253dSO4f%2526param_abbr%253dSO4f%2526scale_max%253d5%2526view_scale%253d0.25%2526kml_header%253dhttp%25253a%25252f%25252fwebapps.datafed.net%25252fviews%25252fconsoles%25252fpnt_hdr.xml%2526pr%253dstatic%2526time_min%253d2006-08-03%2526time_max%253d2006-09-01%2526time_step%253dP3D%2526kmz_file%253d%26agg_limit%3d50%26agg_oper%3davg%26cursor_visible%3dfalse%26dataset_abbr%3dcov_54456%26days_of_week_filter%3dMon%2bTue%2bWed%2bThu%2bFri%2bSat%2bSun%26field%3dSO4f%26height%3d500%26hours_of_day_filter%3d0%2b1%2b2%2b3%2b4%2b5%2b6%2b7%2b8%2b9%2b10%2b11%2b12%2b13%2b14%2b15%2b16%2b17%2b18%2b19%2b20%2b21%2b22%2b23%26julian_after%3d5%26julian_before%3d-5%26julian_on%3doff%26lat%3d38.5%26lat_cursor_size%3d2%26lat_max%3d52%26lat_min%3d25%26loc_code%3d010730023%26lon%3d-96.0%26lon_cursor_size%3d2%26lon_max%3d-65%26lon_min%3d-127%26margins%3dfalse%26months_filter%3dTTT-TTT-TTT-TTT%26num_levels%3d6%26param_abbr%3dSO4f%26scale_max%3d5%26scale_min%3d0%26scale_mode%3dlinear%26symbol_height%3d15%26symbol_width%3d15%26view_scale%3d0.25%26width%3d900&pr=static&time_range=2006-08-03/2006-09-01/P3D&dataset_abbr=cov_54456&view_id=map&field=SO4f&height=500&lat_m
ax=52&lat_min=25&lon_max=-65&lon_min=-127&param_abbr=SO4f&scale_max=5&view_scale=0.25&width=900&name=cov_54456.SO4f+2006-08-03+-+2006-09-01&ovl_name=cov_54456.SO4f:%20$(datetime) direct link]
 
 
 
[http://webapps.datafed.net/wms_ge.aspx?server=http://gdata1.sci.gsfc.nasa.gov/daac-bin/G3/giovanni-wms.cgi&time_min=2010-05-01&time_max=2010-05-05 KML from NASA giovanni WMS] and [http://webapps.datafed.net/loop.wsfl?init=cgi.wsfl%3fview_state%3dARC%2fKMZ_timeseries_header%26name%3dAE_DyOcn.002%253a%253aHigh_res_cloud%2b2010-05-01%2b-%2b2010-05-05%26export_filename%3dAE_DyOcn.002%253a%253aHigh_res_cloud.kmz&calc=http%3a%2f%2fgdata1.sci.gsfc.nasa.gov%2fdaac-bin%2fG3%2fgiovanni-wms.cgi%3fservice%3dWMS%26request%3dGetMap%26version%3d1.1.1%26srs%3dEPSG%3a4326%26layers%3dAE_DyOcn.002%3a%3aHigh_res_cloud%26time%3d%24(datetime)%26bbox%3d-180%2c-90%2c180%2c90%26width%3d800%26height%3d600%26bgcolor%3d0xFFFFFF%26transparent%3dFALSE%26exceptions%3dapplication%2fvnd.ogc.se_xml%26styles%3d%26format%3dimage%2fgif&accu=kml_merge_timeseries.wsfl&done=mark_kml_header.wsfl%3fbacklink%3dhttp%253a%252f%252fwebapps.datafed.net%252fwms_ge.aspx%253fkml_header%253dARC%25252fKMZ_timeseries_header.xml%2526wms%253dhttp%25253a%25252f%25252fgdata1.sci.gsfc.nasa.gov%25252fdaac-bin%25252fG3%25252fgiovanni-wms.cgi%25253fservice%25253dWMS%252526request%25253dGetMap%252526version%25253d1.1.1%252526srs%25253dEPSG%25253a4326%252526layers%25253dAE_DyOcn.002%25253a%25253aHigh_res_cloud%252526time%25253d2010-05-01%252526bbox%25253d-180%25252c-90%25252c180%25252c90%252526width%25253d800%252526height%25253d600%252526bgcolor%25253d0xFFFFFF%252526transparent%25253dFALSE%252526exceptions%25253dapplication%25252fvnd.ogc.se_xml%252526styles%25253d%252526format%25253dimage%25252fgif%2526time_min%253d2010-05-01%2526time_max%253d2010-05-05%2526time_step%253dP1D%2526kmz_file%253d&time_range=2010-05-01/2010-05-05/P1D&time_min=2010-05-01&time_max=2010-05-05&lat_min=-90&lat_max=90&lon_min=-180&lon_max=180&bgcolor=0xFFFFFF&name=AE_DyOcn.002%3a%3aHigh_res_cloud+2010-05-01+-+2010-05-05&ovl_name=AE_DyOcn.002%3a%3aHigh_res_cloud:%20$(datetime) direct link]
 
 
 
Precompiled examples:
 
 
 
[http://webapps.datafed.net/views/test/point_demo_20100827.kmz Point Demo]
 
 
 
[http://webapps.datafed.net/views/test/image_demo_20100827.kmz Gridded Demo]
 
 
 
== Location Table ==
 
 
 
The location table describes the location dimension for point data.
 
 
 
The fields that datafed uses are:
 
 
 
* Mandatory fields:
** loc_code: A unique text field, used to identify a location.
** lat: Latitude of the location in degrees_north.
** lon: Longitude of the location in degrees_east.

* Optional datafed fields:
** loc_name: Reasonably short text describing the location.
** elev: Elevation in meters.

* Data-specific fields:
** Any field with any name.
 
 
 
Good loc_codes are short abbreviations like ACAD and YOSE for Acadia and Yosemite National Parks. Completely numeric loc_codes are possible, but they are harder to recognize, and since leading zeros are significant, tools like Excel may treat them as numbers and strip the leading zeros.
 
 
 
If the loc_codes are long, say 9 characters, it is useful to generate a numeric 16-bit primary key for the location table and use it to join the data tables with the location table. This may help with indexing and speed things up quite a bit.
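The surrogate-key idea can be sketched as follows. The 9-character loc_code and the SO4f field name are taken from examples on this page; everything else is made up for the illustration.

```python
# Sketch: give each long loc_code a small integer key and join the
# data rows through it instead of the 9-character text code.
locations = [
    {"loc_code": "010730023", "lat": 33.55, "lon": -86.82},
    {"loc_code": "060431001", "lat": 37.75, "lon": -119.59},
]

# 16-bit-sized integer keys (enough for up to 65535 locations)
loc_key = {loc["loc_code"]: i for i, loc in enumerate(locations, start=1)}

data = [
    {"loc_code": "010730023", "SO4f": 3.2},
    {"loc_code": "060431001", "SO4f": 0.8},
]
# the data table now joins on the compact integer key
data_keyed = [{"loc_id": loc_key[d["loc_code"]], "SO4f": d["SO4f"]} for d in data]
print(data_keyed)  # [{'loc_id': 1, 'SO4f': 3.2}, {'loc_id': 2, 'SO4f': 0.8}]
```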
 
 
 
Example: [http://128.252.202.19:8080/CIRA?service=WFS&version=1.0.0&request=GetFeature&typename=SO4f&outputformat=text/csv CIRA/VIEWS location table]
 
 
 
== Metadata ==
 
 
 
Abstract, contact information, keywords, and any other documentation needed to classify or find the service. The metadata is accessible to the user via the capabilities and coverage description documents.
 
 
 
Every provider should have a '''wcs_capabilities.conf''' that lists keywords and contact information. The format is simple; copy one from the [http://128.252.202.19:8080/static/testprovider/wcs_capabilities.conf testprovider] and edit it.
 
 
 
    # this file provides some information about the provider
 
    # and is incorporated into the respective WCS responses.
 
    # all currently available field identifiers are listed below.
 
    # please define every identifier only once.
 
    # other identifiers will be ignored, input is case sensitive.
 
    # the format is always <identifier>: <value>.
 
    # whitespaces before and after <value> will be stripped.
 
    # KEYWORDS can take a comma separated list that will then be
 
    # included in the respective keyword tags
 
    # empty lines and lines starting with "#" will be ignored.
 
    PROVIDER_TITLE: National Climate Data Center
 
    PROVIDER_ABSTRACT: National Climate Data Center is the worlds largest archive of climate data.
 
    KEYWORDS: Domain:Aerosol, Platform:Network, Instrument:Unknown, DataType:Point, Distributor:DataFed, Originator:NCDC, TimeRes:Minute, Vertical:Surface, TopicCategory:climatologyMeteorologyAtmosphere
 
    FEES: NONE
 
    CONSTRAINTS: NONE
 
    PROVIDER_SITE: http://lwf.ncdc.noaa.gov/oa/ncdc.html
 
    CONTACT_INDIVIDUAL: Climate Contact, Climate Services Branch, National Climatic Data Center
 
    CONTACT_PHONE: 828-271-4800
 
    CONTACT_STREET: 151 Patton Avenue Room 468
 
    CONTACT_CITY: Asheville
 
    CONTACT_ADM_AREA: North Carolina
 
    CONTACT_POSTCODE: 28801-5001
 
    CONTACT_COUNTRY: USA
 
    CONTACT_EMAIL: ncdc.info@noaa.gov
 
 
 
Here is the real, live NCDC [http://128.252.202.19:8080/static/NCDC/wcs_capabilities.conf wcs_capabilities.conf].
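The rules stated in the comments of the sample file above are simple enough to show as a sketch parser. This is not the framework's own code, just a direct reading of the documented format.

```python
# Sketch parser for the wcs_capabilities.conf format: "<identifier>: <value>"
# lines, "#" comments and blank lines ignored, whitespace stripped,
# only the first definition of an identifier kept, KEYWORDS comma-split.
def parse_capabilities_conf(text):
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        identifier, _, value = line.partition(":")  # split at first ":" only
        identifier, value = identifier.strip(), value.strip()
        if identifier in conf:  # define every identifier only once
            continue
        if identifier == "KEYWORDS":
            conf[identifier] = [kw.strip() for kw in value.split(",")]
        else:
            conf[identifier] = value
    return conf

sample = """
# provider metadata
PROVIDER_TITLE: National Climate Data Center
KEYWORDS: Domain:Aerosol, Platform:Network
FEES: NONE
"""
print(parse_capabilities_conf(sample)["KEYWORDS"])  # ['Domain:Aerosol', 'Platform:Network']
```

Splitting at the first colon only is what lets values like <code>PROVIDER_SITE: http://...</code> survive intact.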
 
 
 
== NetCDF-CF ==
 
 
 
NetCDF file format contains four kinds of information:
 
 
 
* Global attributes
 
** Simple name=value pairs
 
 
 
* Dimensions
 
** Only declares the length of the dimension
 
** Contains no dimension data.
 
 
 
* Variables
 
** Array data with any number of dimensions.
 
** Zero dimensions meaning scalar data.
 
 
 
* Variable Attributes:
 
** Simple name=value pairs associated to a variable.
 
 
 
While these are enough to describe any data, it is not easy to interpret what the data actually means. What is self-evident to a human is difficult for a computer program to reason about. With a NetCDF viewer you can simply open the file and display the data on a geographic map, but writing a program that automatically finds the geographic dimensions in an arbitrary NetCDF file is very difficult.
 
 
 
Conventions come to the rescue. CF-1.0 standardizes many things:
 
 
 
* Standard name: what the measured data is about
 
 
 
* Units
 
 
 
* How to tell that a variable is one of the following:
 
** Data Variable, containing real data.
 
** Dimension Coordinate Variable, containing dimension coordinates.
 
** Dimension Bounds Variable, containing lower and upper bounds of a dimension coordinate.
 
 
 
* Projection
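The variable-role distinction can be illustrated with a simplified model, using plain dicts to stand in for a netCDF file. A real implementation would read the dimensions and attributes with a netCDF library; the classification rules sketched here (a 1-D variable named after its dimension is a coordinate variable, and a variable referenced by a coordinate variable's <code>bounds</code> attribute holds the bounds) follow the CF convention.

```python
# Simplified CF-style classification of variables; "variables" maps
# a name to {"dims": (...), "attrs": {...}}. Not a real netCDF reader.
def classify_variables(variables):
    bounds_vars = {
        v["attrs"]["bounds"]
        for v in variables.values()
        if "bounds" in v["attrs"]
    }
    roles = {}
    for name, v in variables.items():
        if name in bounds_vars:
            roles[name] = "dimension bounds variable"
        elif len(v["dims"]) == 1 and v["dims"][0] == name:
            # a 1-D variable named after its dimension is a coordinate variable
            roles[name] = "dimension coordinate variable"
        else:
            roles[name] = "data variable"
    return roles

nc = {
    "time":        {"dims": ("time",), "attrs": {"bounds": "time_bnds"}},
    "time_bnds":   {"dims": ("time", "nv"), "attrs": {}},
    "temperature": {"dims": ("time", "lat", "lon"), "attrs": {"units": "K"}},
}
roles = classify_variables(nc)
```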
 
 
 
CF 1.0 through [http://www.unidata.ucar.edu/software/netcdf/conventions.html 1.4] contain the conventions for cube data.

[https://cf-pcmdi.llnl.gov/trac/wiki/PointObservationConventions Unofficial CF-1.5] contains the point data encoding. '''Expired certificate''': add a security exception.
 
 
 
Links:
 
[http://www.unidata.ucar.edu/software/netcdf/ Unidata NetCDF documentation]
 
 
 
[http://www.unidata.ucar.edu/software/netcdf/conventions.html NetCDF Conventions]
 
 
 
[http://www.unidata.ucar.edu/software/netcdf/conventions.html CF Conventions (Recommended, if applicable)]
 
 
 
[http://wiki.esipfed.org/index.php/Creating_NetCDF_CF_Files Creating NetCDF CF Files]
 
 
 
== Point Data Configuration ==
 
 
 
Programmed instructions that tell the framework how to access data. This includes, but is not limited to:
 
 
 
* For Cube Coverages:
** Automatic: information about variables and dimensions is extracted from netCDF-CF files.
** Manual: hand-edited Python dictionaries describing the netCDF files, their variables, and their dimensions.

* For Point Coverages:
** Hand-edited Python dictionaries describing the names and columns of the Location and Data tables.
** Custom modules for databases that are too esoteric to configure in a purely declarative manner.
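As a purely hypothetical illustration of what such a hand-edited dictionary might look like; every key and name here is invented, and the real keys expected by the framework are documented on the configuration pages linked below.

```python
# Hypothetical point-data configuration dictionary: maps the datafed
# field names to the provider's own SQL table and column names.
point_config = {
    "location_table": {
        "table": "locations",
        "columns": {"loc_code": "site_id", "lat": "latitude", "lon": "longitude"},
    },
    "data_tables": {
        "SO4f": {
            "table": "so4_daily",
            "columns": {"loc_code": "site_id", "time": "obs_date", "value": "so4"},
        },
    },
}
```

Because the configuration is plain Python, it can be checked with the interpreter before the service is started, which is one reason dictionaries were chosen over XML.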
 
 
 
Cube:
 
[[WCS_Wrapper_Configuration_for_Cubes| Configuring NetCDF based Cube Data]]
 
 
 
 
 
Point:
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Point_Data#Location_Table_Configuration Location Table Configuration]
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Point_Data#Data_Table_Configuration Data Table Configuration]
 
 
 
 
 
== SQL Database for Points ==
 
 
 
Currently the datafed WCS for points supports one kind of point data: fixed locations and regular time intervals.
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Point_Data#Storing_Point_Data_in_a_Relational_Database Storing Point Data in a Relational Database]
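The fixed-locations / regular-intervals layout, and the kind of filtered join query a point processor builds from it, can be sketched with an in-memory SQLite database. All table and column names here are invented for the example; only the loc_codes, field name, and bounding box echo examples used elsewhere on this page.

```python
# Sketch: location table + data table, joined and filtered by
# bounding box and time, the way a point Coverage Processor would.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE locations (loc_code TEXT PRIMARY KEY, lat REAL, lon REAL);
    CREATE TABLE so4_data (loc_code TEXT, obs_date TEXT, so4 REAL);
""")
con.executemany("INSERT INTO locations VALUES (?, ?, ?)", [
    ("ACAD", 44.37, -68.26),
    ("YOSE", 37.71, -119.70),
])
con.executemany("INSERT INTO so4_data VALUES (?, ?, ?)", [
    ("ACAD", "2006-08-03", 3.2),
    ("YOSE", "2006-08-03", 0.8),
])

rows = con.execute("""
    SELECT l.loc_code, l.lat, l.lon, d.obs_date, d.so4
    FROM so4_data d JOIN locations l ON l.loc_code = d.loc_code
    WHERE l.lon BETWEEN ? AND ? AND l.lat BETWEEN ? AND ?
      AND d.obs_date = ?
""", (-127, -65, 40, 52, "2006-08-03")).fetchall()
print(rows)  # [('ACAD', 44.37, -68.26, '2006-08-03', 3.2)]
```

YOSE falls outside the requested latitude range, so only the ACAD row is returned; writing such rows out as CSV is exactly the formatter step described under Coverage Processor.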
 
 
 
== WCS Capabilities Document ==
 
 
 
The document contains all the high-level information about a service:
 
* Description of the Service
 
* Machine Readable and Human Readable Name.
 
* Keywords
 
* Contact Information
 
* HTTP access information
 
* List of coverages in the service
 
** Machine Readable and Human Readable Name.
 
** Keywords
 
** Latitude and Longitude bounds.
 
** Time range in version 1.0.0
 
 
 
[http://128.252.202.19:8080/NASA?service=WCS&acceptversions=1.1.2&Request=GetCapabilities Example Version 1.1.2]
 
 
 
[http://webapps.datafed.net/OMAERO_G.ogc?SERVICE=WCS&REQUEST=GetCapabilities&VERSION=1.0.0 Example Version 1.0.0]
 
 
 
== WCS Describe Coverage Document ==
 
 
 
The document describes the coverage in detail, so that the user knows what the data is and what the dimensions of the data are:
 
 
 
* Description of the Coverage
 
* Machine Readable and Human Readable Name.
 
* Keywords
 
* Latitude and Longitude bounds.
 
* Grid bounds in the projection of the data, if applicable
 
* Grid size in the projection of the data, if applicable
 
* Time dimension.
 
* Supported Coordinate Systems
 
* Supported Formats
 
* Supported Interpolations
 
* Fields of Coverage in versions 1.1.x
 
** Name
 
** Units
 
** Other dimensions, like elevation or wavelength, if applicable
 
** Reference to location dimension, if applicable
 
 
 
[http://128.252.202.19:8080/NASA?service=WCS&version=1.1.2&Request=DescribeCoverage&identifiers=modis4 Example Version 1.1.2]
 
 
 
[http://webapps.datafed.net/OMAERO_G.ogc?VERSION=1.0.0&SERVICE=WCS&REQUEST=DescribeCoverage&Coverage=AOTMW Example Version 1.0.0]
 
 
 
== WCS GetCoverage Query ==
 
 
 
The main query to get data from a WCS.
 
 
 
[http://wiki.esipfed.org/index.php/WCS_Wrapper_Configuration_for_Point_Data#GetCoverage GetCoverage for points]
 
 
 
TODO: samples
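Meanwhile, here is a sketch of what such a request looks like, built with standard WCS 1.0.0 parameter names. The endpoint and the coverage/field name are borrowed from examples elsewhere on this page and may not be current.

```python
# Build a WCS 1.0.0 GetCoverage request as a URL; parameter names
# follow the WCS 1.0.0 spec, the endpoint and coverage are examples.
from urllib.parse import urlencode

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "SO4f",
    "crs": "EPSG:4326",
    "bbox": "-127,25,-65,52",      # lon_min, lat_min, lon_max, lat_max
    "time": "2006-08-03",
    "format": "text/csv",          # CSV output for point coverages
}
url = "http://128.252.202.19:8080/CIRA?" + urlencode(params)
print(url)
```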
 
 
 
== WFS Capabilities Document ==
 
 
 
The document contains all the high-level information about a service.
 
 
 
TODO: samples
 
 
 
== WFS DescribeFeatureType ==
 
 
 
The document describes the structure (the fields) of a feature type.

Example: [http://128.252.202.19:8080/NCDC?Service=WFS&version=1.0.0&request=DescribeFeatureType&typename=ASOS ASOS]
 
 
 
TODO: samples
 
 
 
 
 
== WFS GetFeature Query ==
 
 
 
The main query to get data from a WFS.

[http://128.252.202.19:8080/NCDC?Service=WFS&version=1.0.0&request=GetFeature&typename=ASOS&outputformat=text/csv sample]

TODO: samples

== Adding Glossary Terms ==

This is the 'WCS Glossary' form. To add a page with this form, enter the page name below; if a page with that name already exists, you will be sent to a form to edit that page.

{{#forminput:Glossary}}

{{#ask: [[TermDesc::+]][[Category:Glossary]]
|mainlabel=Term
|?TermDesc=Description
|?Glossary Domain=Tags
|limit=500
|order=asc
}}

Latest revision as of 09:53, October 8, 2021

{| class="wikitable"
! Term !! Description !! Tags
|-
| AQ Community Catalog || The AQ Community Catalog is the || AQInfrastructure
|-
| Coverage Processor for Points || This component reads a subset of Station Point Data out of an SQL database. It gets the requested coverage from the query and gets the relevant SQL table and column names from the configuration information. Then it creates an SQL query according to the filters, which is executed in the SQL database engine, and the resulting rows are written to a CSV file. Used in the WCS GetCoverage Query. || WCS
|-
| Datafed Browser || Datafed Browser is a web program for viewing WCS and WMS services. || WCS, AQInfrastructure
|-
| GEOSS Clearinghouse || The GEOSS Clearinghouse is the engine that drives the entire system. It connects directly to the various GEOSS components and services, collects and searches their information, and distributes data and services. The AQ Community records can be found in the clearinghouse, and AQ applications, like uFIND, search the clearinghouse. || WCS, AQInfrastructure
|-
| Google Earth || Google Earth lets you fly anywhere on Earth to view satellite imagery, maps, terrain, and 3D buildings, from galaxies in outer space to the canyons of the ocean. You can explore rich geographical content, save your toured places, and share with others. || WCS
|-
| ISO 19115 Metadata for Air Quality || ISO 19115 is the metadata standard for geographic information. The AQ Community chose ISO 19115 because it is a standard accepted by GEOSS and is being implemented widely around the world. It includes the ISO 19115 Core, ISO 19119 metadata for describing geospatial services, and AQ-specific metadata for finding datasets. || WCS, HTAP, AQInfrastructure
|-
| KML Keyhole Markup Language || KML is a file format used to display geographic data in an Earth browser such as Google Earth, Google Maps, and Google Maps for mobile. KML uses a tag-based structure with nested elements and attributes and is based on the XML standard. All tags are case-sensitive and must appear exactly as they are listed in the KML Reference. The Reference indicates which tags are optional. Within a given element, tags must appear in the order shown in the Reference. || WCS
|-
| Python Dictionaries || Python is a simple programming language. It contains the dictionary datatype, which is a simple key=value hierarchical database. Due to its simple syntax, the datafed WCS uses it as a configuration format, instead of XML files or something similar. || WCS
|-
| WCS Standard || Term Description || WCS
|-
| WCS-CF-netCDF Matching || Term Description || WCS, HTAP
|}