June 7, 2011

Participants

  • Gregory Leptoukh (NASA GSFC)
  • Greg Stensaas (USGS)
  • Oleg Aulov (UMBC)
  • Carol Meyer (ESIP)
  • Erin Robinson (ESIP)
  • Gary Foley (EPA)
  • Tyler Stevens (NASA GSFC)
  • Ed Armstrong (JPL)
  • Joanne Nightingale (NASA GSFC)
  • Phil Jones (NOAA NCDC)
  • Karen Moe (NASA GSFC)
  • Barry Weiss (JPL)
  • Fritz VanWijngaarden (Northrop Grumman)
  • W. Han (?)

Agenda

  • Greg Stensaas (Chair of the CEOS Cal/Val WG): "International Earth Observation Quality Assurance Efforts".
  • Preparation for the Summer ESIP Meeting in Santa Fe: ESIP Summer 2011 Meeting (http://wiki.esipfed.org/index.php/Summer_2011_Meeting)
  • One of the objectives is to prepare something for QA4EO
    • Speakers
    • Identify reps of various research and application communities for:
      • their understanding and requirements for quality
      • methodology they use to assess and quantify data quality
      • Communities: SST, Ocean color, Precipitation, Atm Chemistry, Land, Modeling, Applications (e.g., Air Quality),...

Old Action Items

  • Hook and Greg: set up a page for collecting and aliasing quality-related terms

Notes

Upload community-specific presentations/papers to the Community portion of the web site

  • Greg Leptoukh noted that the agenda for the summer meeting breakout session is still fluid.
  • Ocean color community not able to attend
  • Invitations to NASA data quality PIs will be issued
  • AQ and Energy communities will be represented
  • Greg Stensaas, USGS & Chair of CEOS Calibration/Validation WG
  • Landsat program responsibilities shifting
  • continually trying to improve data accuracy
  • need to improve time series terrestrial data
  • integration of Landsat data with other data and sensors to support climate change research
  • Need calibration to improve accuracy across systems
  • requires better documentation
  • improves consistency across data sets
  • GCOS - improved definitions for Accuracy & Stability
  • better definitions for standards and processes, and establishing quality indicators
  • Need coordination among MANY players in the quality space
  • CEOS - organizational structure and its players (http://www.ceos.org/index.php?option=com_content&view=category&layout=blog&id=31&Itemid=74); CEOS provides worldwide test sites for referencing data through Earth Explorer
  • LSI (http://wgiss.ceos.org/lsip/) - CEOS Land Surface Imaging Constellation Portal for Mid-Resolution Optical LSI Satellite System Information and Enhanced Data Access
  • GEO SBA linkages (CEOS is the space arm of GEO) - (http://earthobservations.org)
    • GMES (http://www.gmes.info/) - puts out quality requirements
    • GeoViQua (http://www.geoviqua.org/) - QUAlity aware VIsualisation for the Global Earth Observation system of systems
  • WG on Calibration/Validation (WGCV) - has 6 subgroups
  • Cal/Val portal (http://calvalportal.ceos.org)
  • test site (http://earthexplorer.usgs.gov)
  • CEOS Visualization Environment (COVE) (http://www.ceos-cove.org/Login.aspx) - built by the systems engineering office; useful for calibration and validation, e.g., predicting when observations will be available over a given area
  • GEOSS (http://www.epa.gov/geoss/) support for quality information for SBAs (timely, quality, long-term & global) - a good long-term record is needed; work has started on how to better facilitate interoperability and harmonization of data
  • GEOSS: TASK-DA-09-01a (http://www.grouponearthobservations.org/cdb/ts.php?id=54) - GEOSS Quality Assurance Strategy
  • Quality Assurance Framework for Earth Observations (QA4EO) http://qa4eo.org
  • Reference standards can be test sites or a reference to a standard ("fit for purpose")
  • Definition has to be relevant to the whole suite of observations
  • October 18-20, 2011 QA4EO Workshop in Oxford, England (4th meeting) - "QA4EO Workshop on Providing Harmonised Quality Information in Earth Observation Data by 2015"
  • Joint Agency Commercial Imagery Evaluation (JACIE) Workshop (http://calval.cr.usgs.gov/news/joint-agency-commercial-imagery-evaluation-jacie-workshop-agenda/), Boulder, Colorado, March 29-31, 2011 - an annual workshop with 10 successful meetings already; does a lot of work related to data quality
  • ESA GMES quality assurance requirements
  • The preceding provides an overview of many of the quality assurance efforts going on globally
  • important to work collaboratively across the many players
  • Q&A for Greg Stensaas
  • With multiple approaches for calibration & validation, how is th (Note: Greg L. - what was your question?)
  • calibration of instruments must be documented to international standards; still learning how to do this, especially for user definitions of quality
  • Two distinct methods (a small illustrative sketch follows this list):
    • start with the error/uncertainty estimates determined during calibration, propagate them through the retrieval algorithms, and derive the uncertainty of the geophysical product
    • black box - at the end of the day, validate the data against ground truths and quantify the biases and errors
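
The two methods above can be contrasted with a minimal, purely illustrative Python sketch (not from the meeting; the linear retrieve() function and all numbers are invented assumptions): the first propagates a calibration uncertainty through a toy retrieval by Monte Carlo sampling, the second treats the retrieval as a black box and characterises the product against ground truth with bias and RMSE.

  import numpy as np

  rng = np.random.default_rng(0)

  # Toy retrieval: turn a calibrated Level-1 measurement into a geophysical quantity.
  # (Purely illustrative; real retrieval algorithms are far more complex.)
  def retrieve(radiance):
      return 2.5 * radiance + 1.0

  # Method 1: propagate the calibration uncertainty through the retrieval.
  radiance = 40.0          # calibrated measurement (arbitrary units)
  radiance_sigma = 0.8     # 1-sigma uncertainty from instrument calibration
  samples = retrieve(radiance + radiance_sigma * rng.standard_normal(10_000))
  print("propagated product uncertainty (1-sigma):", round(float(samples.std()), 3))

  # Method 2: "black box" validation of the product against ground truth.
  truth   = np.array([101.0, 96.5, 110.2, 88.7])    # test-site / in-situ values
  product = np.array([102.3, 95.1, 111.0, 90.2])    # corresponding retrieved values
  bias = float((product - truth).mean())
  rmse = float(np.sqrt(((product - truth) ** 2).mean()))
  print("validation bias:", round(bias, 3), "RMSE:", round(rmse, 3))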

Greg Stensaas' perspective:

Very important to do well-documented prelaunch calibration, with calibration to international standards.
Important to document recalibration.
Validation component: a black box process - the only way is to have ground truths.

Haven't found good examples of errors affiliated with processing steps.

Gregory Leptoukh:

When we go from Level 1 to Level 2, standardisation is questionable.
Haven't seen good quality indicators for processing algorithms; also no quality indicators of how well the data are delivered to users.
Most standards I see have "geo" in their name; do you know if there are any quality standards that are not geographical?

Greg Stensaas: The initial intent was "geo" because most data are geolocated and related to a grid, but it could relate to the entire instrument. No need to worry about the word.

  • validation
  • ISO standard use of "Geographic" language, why?
  • can relate it to a spatial or temporal extent (either or both) - don't worry about the word
  • ISO/NP TS 19159 Geographic information - Calibration and validation of remote sensing imagery sensors and data
  • For the Workshop, looking for support from the strategic implementation team for GEO
  • ISO/CD 19157 Geographic information -- Data quality
  • With regard to QA4EO effort, how can ESIP support the task?
  • Need to show value of the development of quality indicators (e.g. Air Quality)
  • UK workshop will feature showcases
  • ISO Data Quality - processes
  • From the point of view of users, knowing total quality assurance from instrument to data product is essential (Foley)
  • Decision makers need some certainty that the data they're using are of high enough "quality" in order to be able to use them
  • Actions for Follow Up:
  • Link to Air Quality Working Group (Meyer/Robinson) - done
  • Obtain EPA Data Quality Documentation (Foley/Stensaas)
  • Invitations to CEOS Cal/Val WG subgroups (Meyer/Leptoukh/Stensaas/Nightingale)
  • Get slides from Greg Stensaas (Leptoukh/staff)

Action Items

  • All: continue looking for speakers and community reps