Question Three

From Earth Science Information Partners (ESIP)

Revision as of 15:37, January 7, 2009

Enter your discussion report-out below:

3. How can data providers make it easier to assess the data quality and the appropriate uses for a data set?

* Use watermarks as a "stamp of approval" (a minimal signing sketch follows this list)
* "Brand" products, maintaining the brand through time
* Develop a registry of trusted entities, for assessing the quality of watermarkers
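
One way to read the watermark and registry ideas together is sketched below, assuming the "stamp of approval" is a keyed signature over a file checksum; the function names and the HMAC scheme are illustrative, not an agreed ESIP mechanism. A registry of trusted entities would then map each provider to its verification key.

<pre>
# Hypothetical "stamp of approval": the provider signs the SHA-256
# checksum of a data file with a secret key; a consumer who trusts that
# provider (via a registry of trusted entities) can verify the stamp.
import hashlib
import hmac

def stamp_dataset(path: str, provider_key: bytes) -> str:
    """Return a hex watermark for the file at path."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return hmac.new(provider_key, digest.digest(), hashlib.sha256).hexdigest()

def verify_stamp(path: str, provider_key: bytes, stamp: str) -> bool:
    """Check that the file still matches the provider's stamp."""
    return hmac.compare_digest(stamp_dataset(path, provider_key), stamp)
</pre>

A production scheme would more likely use public-key signatures, so consumers only need the provider's public key from the registry rather than a shared secret.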

* Use appropriate metadata standards that include quality information (see the metadata sketch after this list)
* Embed links in metadata to community-supported quality information (see questions [http://wiki.esipfed.org/index.php/Question_Eight 8] and [http://wiki.esipfed.org/index.php/Question_Nine 9])
* Include the ability to find citations to the data in publications, to illustrate how the data are used (see the citation sketch after this list)
* Create tools to compare basic attributes of comparable datasets (see question [http://wiki.esipfed.org/index.php/Question_Eleven 11]), e.g. CNET-style product comparisons; a comparison sketch follows this list
* Keep [all] documentation with the data; evolve data models to include all relevant metadata
* Invest in other quality and impact metrics, for example scholarly citation metrics
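
As an illustration of metadata that carries quality information, here is a minimal record with a data-quality section and embedded links to community-supported quality pages, serialized as JSON. The field names and values are hypothetical; a real record would follow an established standard such as ISO 19115.

<pre>
# Illustrative metadata record; field names are made up, not a standard.
import json

record = {
    "title": "Example gridded temperature product",
    "version": "2.1",
    "dataQuality": {
        "lineage": "Derived from station observations, gridded daily.",
        "knownIssues": ["Sparse station coverage before 1950"],
        "validationReport": "http://example.org/quality/temperature-v2.1",
    },
    # Links to community-supported quality information (cf. questions 8 and 9)
    "communityQualityLinks": [
        "http://wiki.esipfed.org/index.php/Question_Eight",
        "http://wiki.esipfed.org/index.php/Question_Nine",
    ],
}

print(json.dumps(record, indent=2))
</pre>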
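The comparison-tool idea can be sketched in a few lines: given several dataset descriptions, print their basic attributes side by side, in the spirit of a CNET product chart. The datasets and attribute names below are invented for illustration.

<pre>
# Toy side-by-side comparison of basic dataset attributes.
datasets = [
    {"name": "Product A", "resolution": "1 km", "coverage": "global", "updated": "daily"},
    {"name": "Product B", "resolution": "25 km", "coverage": "60S-60N", "updated": "monthly"},
]

def compare(datasets, attributes):
    """Print a simple comparison table, one row per attribute."""
    header = ["attribute"] + [d["name"] for d in datasets]
    rows = [[a] + [str(d.get(a, "-")) for d in datasets] for a in attributes]
    widths = [max(len(r[i]) for r in [header] + rows) for i in range(len(header))]
    for row in [header] + rows:
        print("  ".join(cell.ljust(w) for cell, w in zip(row, widths)))

compare(datasets, ["resolution", "coverage", "updated"])
</pre>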
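Finally, a hedged sketch of the citation ideas: count the publications whose reference lists mention a dataset's DOI, which supports both finding example uses and a simple citation metric. The publication records here are stand-ins for whatever citation index a provider would actually query.

<pre>
# Count publications citing a dataset DOI; all records are hypothetical.
publications = [
    {"title": "Paper 1", "references": ["doi:10.1234/dataset-x", "doi:10.5555/other"]},
    {"title": "Paper 2", "references": ["doi:10.5555/other"]},
    {"title": "Paper 3", "references": ["doi:10.1234/dataset-x"]},
]

def dataset_citations(doi, publications):
    """Return the publications whose references include the DOI."""
    return [p for p in publications if doi in p["references"]]

citing = dataset_citations("doi:10.1234/dataset-x", publications)
print(len(citing), "citing publications:", [p["title"] for p in citing])
</pre>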