HTAP Report, Sub-Chap. 6 - Data/Info System
This wiki page is intended to be the collaborative workspace for Task Force members interested in the development, implementation and use of the HTAP Information System. The contents can be freely edited by anyone who is logged in. Comments and feedback can also be sent by email to firstname.lastname@example.org. The content of this page concerns the Information System that supports Chapter 6. The subsections are action-oriented. A draft HTAP Report Chap. 6 - Jan 07 Outline is also on this wiki.
The purpose of Chapter 6 is to discuss the need to integrate information from observations, models, and emissions inventories to better understand intercontinental transport. This section focuses on the information system in support of the integration of observations, emissions and models for HTAP.
Recent developments in air quality monitoring, modeling and information technologies offer outstanding opportunities to fulfill the information needs of the HTAP integration effort. Surface-based air pollution monitoring networks now routinely provide the spatio-temporal and chemical patterns of ozone and PM. Satellite sensors with global coverage and kilometer-scale spatial resolution now provide real-time snapshots that depict the patterns of industrial haze, smoke, dust, as well as some gaseous species in stunning detail. Detailed physico-chemical models are now capable of simulating the spatio-temporal pollutant pattern on regional and global scales. The ‘terabytes’ of data from observations and models can now be stored, processed and delivered in near-real time. The instantaneous ‘horizontal’ diffusion of information via the Internet now permits, in principle, the delivery of the right information to the right people at the right place and time. Standardized computer-to-computer communication languages and Service-Oriented Architectures (SOA) now facilitate the flexible access, quality assurance and processing of raw data into high-grade ‘actionable’ knowledge suitable for HTAP policy decisions. Last but not least, the World Wide Web has opened the way to generous sharing of data and tools, leading to collaborative analysis and virtual workgroups. Nevertheless, air quality data analyses and data-model integration face significant hurdles. The sections below present an architectural framework and implementation strategy for the proposed HTAP information system.
HTAP Information System
The term architecture here refers to both the function and form of the HTAP Information System (HTAP IS). The key function of the HTAP IS is to provide information technology support to the researchers performing their tasks as they seek to better understand transcontinental air pollution transport.
It is a set of tools intended to empower the participating and collaborating analysts. The HTAP IS will not compete with the existing tools of its members. Rather, it will embrace and leverage those resources through an open, collaborative federation philosophy. Its main contribution is to connect the TF's participating analysts and their information resources.
The information system HTAP IS consists of the following components:
- Tools and methods for data access, processing and integration, provided by the information technology of the HTAP Federated Data System.
- Open communication between the collaborating analysts, facilitated by wikis and other groupware, social software, blogs, Skype, etc.
- A shared workspace where the above activities take place.
The HTAP Federated Data System will:
- homogenize the data into an interoperable dataset
- facilitate access to the distributed data
- ensure the seamless flow of data through interoperable interfaces
- provide a set of basic tools for data exploration and processing
The primary goal of the data system is to allow the flow and integration of observational, emission and model data. Model evaluation requires that the observations, and if possible the emissions, are fixed. For this reason it is desirable to prepare an integrated observational database against which the various model implementations can be compared. On one hand, the integrated data set should be as inclusive as possible. On the other hand, this goal needs to be tempered by the many limitations that preclude a broad, inclusive data integration.
The proposed HTAP data system will be a hybrid combination of distributed and fixed components, as illustrated schematically in Figure ??. Both the data providers and the HTAP analyst-users will be distributed. However, they will be connected through an integrated HTAP database, which should be a stable, virtually fixed database. The sections below describe the main components of this distributed data system.
The primary function of the HTAP IDS is to facilitate the creation of an integrated data set suitable for model evaluation and pollutant characterization.
The multiple steps that are required for this functionality are shown on the left. The sequence of operations can be viewed as a value chain that transforms the raw input data into a highly organized, integrated data set. These value-adding steps have to be performed for each candidate data set.
The operations that prepare the integrated data set can be viewed as services that sequentially operate on the data stream. Each service has a clearly defined functionality and a firmly specified service interface. In principle, the standards-based interfaces allow the connection of service chains using formal workflow software. The main benefit of Service-Oriented Architecture (SOA) is that it allows the building of agile application programs that can respond to changes in the data input conditions as well as the output requirements.
The service-oriented software architecture is an implementation of the System of Systems approach, which is the design philosophy of GEOSS. Each service can be operated by an autonomous provider, and the "system" that implements the service is hidden behind the service interface; combining the independent services constitutes a System of Systems. In other words, following the SOA approach, not only the data providers but also the processing services are distributed. This flexible approach to distributed computing allows the division of labor (the chain of services) in many different configurations.
For instance, wrapping the existing data with standards-based interfaces for data access can be performed for an entire cluster of data. This is the approach taken in the federated data system DataFed. Given a standard interface to a variety of data sets, the quality assurance service can be performed by another service provider that is seamlessly connected to the data access mediator. Similarly, the service that prepares a data set for integration can be provided by yet another service in the data flow network. This flexibility, offered through the chaining and orchestration of distributed, loosely coupled web services, is the architectural support for building agile data systems that can serve demanding future applications.
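As a concrete illustration of such a service chain, the sketch below composes three mocked services (data access, quality assurance, harmonization) into one value chain. The service names, the record layout, and the -999 missing-value code are hypothetical illustrations, not part of any HTAP specification; real HTAP services would sit behind web interfaces rather than in-process functions.

```python
# Minimal sketch of an SOA-style service chain for HTAP data preparation.

def access(dataset):
    """Data-access service: returns raw records from a (mocked) provider."""
    return [{"site": "A", "ozone_ppb": 42.0}, {"site": "B", "ozone_ppb": -999.0}]

def quality_assure(records):
    """QA service: drops records carrying the -999 missing-value code."""
    return [r for r in records if r["ozone_ppb"] >= 0]

def harmonize(records):
    """Integration service: renames fields to a common vocabulary."""
    return [{"station": r["site"], "o3": r["ozone_ppb"]} for r in records]

def run_chain(dataset, services):
    """Apply each service in turn -- the 'value chain' described above."""
    data = dataset
    for service in services:
        data = service(data)
    return data

result = run_chain("airnow_subset", [access, quality_assure, harmonize])
print(result)  # one clean, harmonized record survives the chain
```

Because each step hides its implementation behind a fixed interface, any link in the chain (here, `quality_assure`) could be swapped for one run by a different provider without touching the rest of the chain.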
Interoperability Standards and Data Wrapping
The methods and tools for model intercomparisons were the subject of a productive workshop at JRC Ispra in March 2006. European and American members of the HTAP TF presented the approaches to model and data intercomparison taken in their respective projects: ENSEMBLES, Eurodelta, ACCENT, AEROCOM, GEMS, and DataFed. Several recommendations were made to improve the future use of intercomparison data. The most important was to agree on a common data format, netCDF with the CF conventions, which would allow a variety of tools to work with the data.
Interoperable data access will be accomplished through the use of a suite of international standards. The naming of individual chemical parameters will follow the Climate and Forecast (CF) conventions. The existing names for atmospheric chemicals in the CF convention were inadequate to accommodate all the parameters used in the HTAP modeling. In order to remedy this shortcoming, the list of standard names was extended by the HTAP community under the leadership of C. Textor, who also became a member of the CF convention board that is the custodian of the standard names. The standard names for the HTAP models were developed using a collaborative wiki workspace.
The data transfer from providers to users also benefits from standard data formats. For modeling data, the use of netCDF-CF as a standard format is recommended. The use of a standard physical data format together with the CF naming conventions allows, in principle, a seamless connection between data provider and consumer services. It should be noted, however, that at this time the CF naming convention has been developed only for the model parameters and not for the various observational parameters. Also, netCDF-CF is primarily used as a model data exchange format, while the transfer of surface monitoring data and satellite data is less standardized.
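To illustrate what CF-style interoperability means in practice, the sketch below checks whether variables carry the `standard_name` and `units` attributes that CF-aware tools rely on. The attribute names are genuine CF attributes and `mole_fraction_of_ozone_in_air` is a real CF standard name, but the variable entries and the simple compliance rule are illustrative only, not the actual HTAP checking procedure (no netCDF library is used here, only plain Python structures standing in for file metadata).

```python
# Sketch: CF-style variable metadata and a minimal interoperability check.
cf_variables = {
    "o3": {
        "standard_name": "mole_fraction_of_ozone_in_air",  # real CF name
        "units": "1e-9",  # ppb expressed as a dimensionless scale factor
    },
    "temp": {"units": "K"},  # missing standard_name -> not interoperable
}

def cf_compliant(attrs):
    """A variable is usable across CF tools only if it carries both a
    recognized standard_name and a units attribute."""
    return "standard_name" in attrs and "units" in attrs

problems = [name for name, attrs in cf_variables.items()
            if not cf_compliant(attrs)]
print(problems)  # variables that would fail the interoperability check
```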
The third aspect of data interoperability is a standard query language through which user services request specific data from the provider services. It is proposed that the HTAP information system adopt the Web Coverage Service (WCS) as its standard data query language. The WCS data access protocol is defined by the international Open Geospatial Consortium (OGC), which is also the key organization responsible for interoperability standards in GEOSS. Since the WCS protocol was originally developed for the GIS community, it has been necessary to adapt it to the needs of the "fluid Earth sciences". Members of the HTAP group have been actively participating in the development and testing of the WCS interoperability standards.
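A WCS GetCoverage request is ultimately just a parameterized HTTP query. The sketch below builds such a URL with the Python standard library; the endpoint and coverage name are hypothetical, while the query parameters (SERVICE, VERSION, REQUEST, COVERAGE, CRS, BBOX, TIME, FORMAT) follow the OGC WCS 1.0.0 specification.

```python
from urllib.parse import urlencode

# Hypothetical HTAP data provider endpoint and coverage name.
endpoint = "http://example.org/htap/wcs"

params = {
    "SERVICE": "WCS",
    "VERSION": "1.0.0",
    "REQUEST": "GetCoverage",
    "COVERAGE": "surface_ozone",   # hypothetical coverage identifier
    "CRS": "EPSG:4326",            # geographic lat/lon coordinates
    "BBOX": "-125,25,-65,50",      # lon/lat box over the continental US
    "TIME": "2006-07-15",
    "FORMAT": "NetCDF",
}
url = endpoint + "?" + urlencode(params)
print(url)
```

A user service issues this URL over plain HTTP and receives the requested spatial/temporal subset back in the negotiated format, which is what makes provider and consumer services composable.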
Standards-based data access can be accomplished by ‘wrapping’ the heterogeneous data into standardized web services, callable through well-defined Internet protocols. The result of this ‘wrapping’ process is an array of homogeneous, virtual datasets that can be accessed through a standard query language, with the returned data packaged in a standard format directly usable by the consuming services.
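The wrapping idea can be sketched as a thin adapter layer: each heterogeneous source keeps its native format, but every wrapped dataset answers the same query in the same way. The two mocked providers and their formats below are invented for illustration; a real DataFed-style wrapper would translate WCS queries rather than in-process Python calls.

```python
# Sketch: wrapping heterogeneous sources behind one standard query interface.

def legacy_csv_source(param):
    """Provider A returns comma-separated strings."""
    return ["siteA,38.0", "siteB,41.5"] if param == "ozone" else []

def legacy_dict_source(param):
    """Provider B returns dicts with its own field names."""
    return [{"loc": "siteC", "val": 36.2}] if param == "ozone" else []

def wrap_csv(source):
    def query(param):
        out = []
        for line in source(param):
            site, value = line.split(",")
            out.append({"site": site, "value": float(value)})
        return out
    return query

def wrap_dict(source):
    def query(param):
        return [{"site": r["loc"], "value": r["val"]} for r in source(param)]
    return query

# After wrapping, every virtual dataset answers the same query identically,
# so a consumer can merge them without knowing the native formats.
virtual_datasets = [wrap_csv(legacy_csv_source), wrap_dict(legacy_dict_source)]
merged = [rec for ds in virtual_datasets for rec in ds("ozone")]
print(merged)
```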
This is just a placeholder. Here we need to explain the quality assurance steps that are needed to prepare the HTAP Integrated Dataset. For true quality assurance and for data homogenization, the data flow channels for the individual data sets need to be evaluated in the context of other data.
Homogenization and Integration
The HTAP Integrated Dataset (HID) will be used to compare models to observations. It will be created from the proper combination of a variety of surface, upper-air and satellite observations. However, before inclusion into the HID, each dataset will need to be scrutinized to make it suitable for model comparison. The scrutiny may include filtering, aggregation and possibly fusion operations.
A good example is the 400-station AIRNOW network reporting hourly ozone and PM2.5 concentrations over the US. The AIRNOW network includes urban sites that are strongly influenced by local sources; such urban stations need to be removed from the HID.
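The screening step might look like the following sketch, where a hypothetical station-metadata flag is used to drop urban sites before inclusion into the HID. The station IDs, the `setting` flag, and its values are illustrative only; real AIRNOW metadata would supply the actual site classification.

```python
# Sketch: screening out locally influenced urban sites before HID inclusion.
stations = [
    {"id": "060370002", "setting": "urban", "o3_ppb": 61.0},
    {"id": "060719004", "setting": "rural", "o3_ppb": 48.0},
    {"id": "300870001", "setting": "rural", "o3_ppb": 44.5},
]

# Keep only the sites representative of regional-scale concentrations,
# which are the ones meaningful for comparison with coarse global models.
hid_subset = [s for s in stations if s["setting"] != "urban"]
print(len(hid_subset))  # 2 of the 3 example sites pass the screen
```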
The Service-Oriented Architecture (SOA) allows connecting the web service components (e.g. services for data access, transformation, fusion, rendering, etc.) using workflow software.
Developing the specific criteria and procedures for the HTAP integrated dataset will require the attention of an HTAP subgroup.
Data Selection Criteria
Monitoring data for atmospheric constituents are now available from a variety of sources, not all of which are suitable for the integrated HTAP data set. Given the limited scope and resources of HTAP, it will be necessary to select a subset of the available observational data that would be most appropriate for the preparation of the 2009 assessment. A set of criteria for the data selection is given below.
- The suitability criteria may include the measured parameters, their spatial extent and coverage density, as well as the time range and sampling frequency with major emphasis on data quality (defined as???).
- The compilation of the global integrated data sets should initially focus on PM and ozone, including the gaseous precursors.
- Subsequent data collection should also include observations of atmospheric mercury and persistent organic pollutants (POPs).
- In order to cover the hemispheric transport of air pollutants, the data system should accept and utilize data from outside the geographical scope of EMEP.
- The data gathering should begin with the compilation of existing databases in Europe (RETRO, TRADEOFF, QUANTIFY, CRATE, AEROCOM) and virtual federated systems in the US (DataFed, Giovanni and others).
- TF data integration should also contribute to and benefit from other ongoing efforts to integrate global data resources, most notably the ACCENT project in Europe and similar efforts in the US.
- Special emphasis should be placed on the collection of suitable vertical profiles from aircraft measurements as well as from surface and space-borne lidars.
The evaluation of suitable observational data sets for model validation and fusion will require close interaction between the modeling and observational communities. A wiki workspace for the open, collaborative evaluation of the different datasets is desirable.
Integrated Data Set
The providers of observational, emission, and modeling data are distributed. The individual data sets will be funneled into the integrated HTAP data set, with the data transfer accomplished using the standard data transfer protocols described above. Transferring data from providers to users will include two operations for each data set: quality assurance and semantic transformation/integration.
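The semantic transformation operation can be sketched as a mapping from each provider's native parameter names and units onto a common HTAP vocabulary. The mapping table, parameter names, and unit conversions below are hypothetical illustrations.

```python
# Sketch: semantic transformation of provider parameters to a common
# vocabulary. Maps provider name -> (common name, factor to common units).
semantic_map = {
    "O3_ugm3": ("ozone", 1.0),        # already in ug/m3, no conversion
    "ozone_mgm3": ("ozone", 1000.0),  # mg/m3 -> ug/m3
}

def to_common(provider_param, value):
    """Return the (common name, converted value) pair for one measurement."""
    common_name, factor = semantic_map[provider_param]
    return common_name, value * factor

print(to_common("ozone_mgm3", 0.08))  # 0.08 mg/m3 becomes 80.0 ug/m3
```

Running each incoming data set through such a mapping, after its quality assurance step, is what makes the funneled data sets mutually comparable inside the integrated HTAP data set.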
In the upcoming phase of HTAP, the existing modeling and observing systems will be integrated to yield a deeper and more robust understanding of hemispheric transport. Both the modeling of the chemical constituents and the Earth observations used to document hemispheric transport are currently performed by individual projects and programs in the US and Europe. These constitute autonomous systems with well-defined purpose and functionality. The key role of the Task Force is to assess the contributions of the individual systems and to integrate them into a System of Systems. It is believed that the GEO System of Systems architecture is an attractive framework for HTAP integration.
HTAP Information Network
- Federated Data System DataFed
- NASA Data System Giovanni
- Emission Data System NEISGEI
- Juelich Data System
Data Networks. Connected hubs; Show Core HTAP network
HTAP Data Sets - need list of others
See the 20 selected datasets in the federated data system DataFed:
- TOMS_AI_G - Satellite
- SURF_MET - Surface
- SEAW_US - Satellite
- SCIAMACHYm - Satellite
- RETRO ANTHRO - Emission
- OnEarth JPL - Satellite
- OMId - Satellite
- NAAPS GLOBAL - Model
- MOPITT Day - Satellite
- MODISd G - Satellite
- MODIS Global Fire - Satellite
- MISRm G - Satellite
- GOMEm G - Satellite
- GOCART G OL - Model
- EDGAR - Emission
- CALIPSO - Satellite
- AIRNOW - Surface
- VIEWS OL - Surface
- AERONETd - Surface
- AEROCOM LOA - Model
Reconciliation and Integration of Observations, Emissions and Models
- Compare observations to models and emissions
- Characterize pollutant patterns and SSR
- Develop real-time data assimilation
- Perform reanalysis with assimilated data
HTAP Relationship to GEO and GEOSS
There is an outstanding opportunity to develop a mutually beneficial and supportive relationship between the activities of the HTAP Task Force and those of GEO. The Group on Earth Observations, with its broad range of national and organizational members, has developed a general architectural framework for turning Earth observations into societal benefits. The three main components of this architecture are observations and models, which feed into decision support systems used in a variety of societal decision making. This general GEO framework is well suited as an architectural guide for the HTAP program. However, the framework lacks the specific guidelines and implementation details that are needed for practical Earth observing and decision support systems.
The HTAP program provides an opportunity to extend the GEO framework with features that arise in the specific HTAP application context. In the case of HTAP, the major activities are shown in the architectural diagram of Figure ??. The HTAP modeling is conducted through global-scale chemical transport models that simulate or forecast the atmospheric chemical composition resulting from natural and human-induced emissions. The observations arise from satellite, ground-based and airborne measurements of the chemical constituents and their hemispheric transport. The decision support system consists of the scientists as members of the HTAP Task Force, with the Task Force co-chairs as the intermediaries between LRTAP and the TF. (Terry, Andre: this description of the HTAP DSS needs your help.)
Representing the Decision Support System (DSS) for HTAP is important because it can guide the design and implementation of the information system that supports the decision processes. A higher-resolution DSS design document is also beneficial as a communication channel for the interacting System of Systems components. A domain-specific DSS may also generalize to the design of DSS structures for similar application domains.
The implementation of the GEO framework utilizes the concept of the Global Earth Observation System of Systems (GEOSS). Traditionally, Earth observations were performed by well-defined systems, such as specific satellites and monitoring networks, which were designed and operated using systems-engineering principles. However, GEO recognized that the understanding of the Earth system requires the utilization and integration of the individual, autonomous systems. The key differences between the Systems and System of Systems approaches are highlighted in Table 1.
The HTAP program can also be considered an early demonstration of the GEO concepts through its end-to-end approach. Both the atmospheric modeling and the observations currently exist through the operation of existing modeling and observation systems. The Task Force has agreed that organizing and evaluating those models, assembling and integrating the observational data sets, and then reconciling the models with the observations will be the focus of the Phase II effort. The Task Force will also prepare and deliver a report to LRTAP to aid its deliberations and decision-making processes. This sequence of activities constitutes an end-to-end approach that turns observations and models into actionable knowledge for societal decision making. One could say that this is an octagonal approach to the more deliberate, step-by-step development of GEOSS, where in Phase I interoperability, in Phase II ???
HTAP Relationship to IGACO
The HTAP program can also have a mutually beneficial interaction with the Integrated Global Atmospheric Chemistry Observations (IGACO) program. IGACO is part of the Integrated Global Observing Strategy (IGOS). IGACO proposes a specific framework for combining observational data and models. It also specifies the higher-level information products that can be used for creating societal benefit.
In the IGACO framework, the data products from satellite, surface, and aircraft observations are collected for inclusion into an integrated data archive. A key component of the IGACO data flow is the mandatory quality assurance and quality control (QA/QC) that precedes the archiving. This is particularly important for HTAP, where multi-sensor data from many different providers are to be combined into an integrated database. In the IGACO framework, a key output of the data system is a highly integrated spatial-temporal data set which can be used in a multitude of applications that produce societal benefit. The goal of creating such an integrated data set is shared by the IGACO and HTAP programs. (?? Len, this section could benefit from your input)
HTAP Relationship with Other Programs