Talk:White Paper on Systems Assessment


Revision as of 16:08, April 16, 2008

Comments on Draft assessment white paper -- Mike Gilroy (Mjgrota) 12:35, 16 April 2008 (EDT)

David, this is great. Thanks.

I recommend that you add some key elements to the goals and assessment questions.

1. Limit this phase of the assessment to EPA core systems. Do not assess external systems at this time. I say this because I feel it is imperative for EPA to look at its own house first, do some cleaning and polishing, and then move forward to assess external tools that have proven to be useful (Views) and those to which we (the AQ community) have had limited exposure (NASA, NOAA systems). I particularly appreciate the focus on eliminating redundancy (waste). It is important to have contingency built into the data systems, but clarify its difference from redundancy.

2. In the goals we should be very clear that the assessment will look at the EPA internal silos that exist and that may be impediments to streamlining data flow and access. This is vital if EPA (Chet) is going to be able to make resource decisions resulting from the process.

3. Consider adding a question that will specifically address the internal silo assessment.

4. It is important that the assessment lead to further development of tangible milestones and to decision making. I recommend that we confirm that each element of the assessment will accomplish that objective; if not, strike it.

Comments from the latter half of the 4/16 discussion -- Louis Sweeny (Louis) 15:58, 16 April 2008 (EDT)

Comments on assessment white paper:

Steve: could we identify the high-level (aspirational) functions of the system and map systems to how they do/don't meet them?

Starting point for high-level functions

We want to identify areas where there are redundancies and characterize them, especially where there are possible cost savings.

==Some possible pros of redundancy==

  • system stability/redundancy (e.g. funding drops lead to gaps, NEDP example?)
  • competition driving innovation
  • efficiency due to redundancy, simpler to copy data than work out complex federation (e.g. systematic warehousing vs unmanaged copies)
  • scientific redundancy to support validation


==Negative aspects of redundancy==

  • resources that could be better spent
  • resource starving, e.g. three people doing something with 1/3 of money each and doing it poorly because

Comments by Sig on the Mintz Draft White Paper on Systems Assessment -- Sig Christensen (Sig) 18:08, 16 April 2008 (EDT)

These comments apply to the list at the end of the Mintz draft version last modified 20:08, 16 April 2008

This white paper is helpful in guiding the next step. Here are a few suggestions.

Item 2 - Sustainability has nuances. Data developed and archived might persist regardless of continued funding, while a system invoked via an interactive Web site might not persist long if funding ceased. Besides identifying factors favoring or working against a system's sustainability, the assessment may want to consider the consequences of its not being sustained. Aspects of this also relate to Item 1.

Item 3 - This (and some other questions) focuses on the system or data user. The other side of the coin is the data provider. What can or should be done to accommodate users who need specialized data from providers that are not "hooked into" any of the existing systems and whose data are not available?

Item 6 - This question should be focused/defined a bit. Is the question really aimed only at "the systems that provide primary data intake and archiving?" By "data standards" do you mean what metadata are collected? Variable nomenclature? QA requirements? Other items, and all of the above? Perhaps a list of the main elements of "data standards" should be developed (or identified elsewhere) and linked to this question.

I hope these are helpful.
Sig Christensen (Sig) 18:08, 16 April 2008 (EDT)