Talk:White Paper on Systems Assessment

From Earth Science Information Partners (ESIP)
Revision as of 15:30, May 13, 2008 by Erinmr (talk | contribs) (Comment added for Steve Young)

Comments on Draft assessment white paper -- Mike Gilroy (Mjgrota) 12:35, 16 April 2008 (EDT)

David, this is great. Thanks.

I recommend that you add some key elements to the goals and assessment questions.

1. Limit this phase of the assessment to EPA core systems. Do not assess external systems at this time. I say this because I feel it is imperative for EPA to look at its own house first, do some cleaning and polishing, and then move forward to assess external tools that have proven to be useful (VIEWS) and those to which we (the AQ community) have limited exposure (NASA, NOAA systems). I particularly appreciate the focus on eliminating redundancy (waste). It is important to have contingency built into the data systems, but clarify how it differs from redundancy.

2. In the goals we should be very clear that the assessment will look at the EPA internal silos that exist and which may be impediments to streamlining data flow and access. This is vital if EPA (Chet) is going to be able to make resource decisions resulting from the process.

3. Consider adding a question that will specifically address the internal silo assessment.

4. It is important that the assessment lead to further development of tangible milestones and decision-making. I recommend that we confirm that each element of the assessment will accomplish that objective; if not, strike it.

Comments from later half of 4/16 discussion -- Louis Sweeny (Louis) 15:58, 16 April 2008 (EDT)

Comments on assessment white paper:

Steve: could we identify the high-level (aspirational) functions of the system and map systems to how they do or don't meet them?

Starting point for high level functions

We want to identify areas where there are redundancies and characterize them, especially where there are possible cost savings.

==Some possible pros of redundancy==

  • system stability/redundancy (e.g. funding drops lead to gaps, NEDP example?)
  • some system specific unique value
  • competition driving innovation
  • efficiency due to redundancy, simpler to copy data than work out complex federation (e.g. systematic warehousing vs unmanaged copies)
  • scientific redundancy to support validation
  • to provide access to data that would not otherwise be available to a group
  • one version of the truth: with redundant systems you can end up with different answers to the same question, hence the idea of an official data warehouse

--What is redundancy, and what is the acceptable downtime for AQS? That would be different if AirNowTech went down. Think about AirNowTech: it is viewed as a discretionary resource, not the AQS compliance database, so what would happen if it got cut? We don't need to go too far into detail; we just say what the consequence of a given system going away would be. Include a paragraph of perceived consequences.

We need to think about this NOT from the IT standpoint only but also from the organizational/institutional aspect: not just how the IT system is set up regarding functional redundancy, but also the responsibility and funding structure. One strategy is spreading responsibility out so that if one organization loses funding, the system does not go away, whether within an agency or across agencies. The GAW (sp?) system is an example.

Example: RAP/VIEWS. What happens if that goes away? If people build on it and treat it as core infrastructure, what happens if NOAA is depended upon and then cuts funding?

--For the systems in the list: how stable is the funding, what would the impact of losing it be, and what are the risks to the system?

---Do we need to do a risk assessment of the identified core assets and assign some values to them and to the key products?

---Look at the parts: in some places we want redundancy (e.g. visualization); for other functions we don't want redundancy.

==Negative aspects of redundancy==

  • resources that could be better spent elsewhere
  • resource starving, e.g. three people doing something with 1/3 of the money each and doing it poorly as a result

Comments by Sig on the Mintz Draft White Paper on Systems Assessment -- Sig Christensen (Sig) 18:08, 16 April 2008 (EDT)

These comments apply to the list at the end of the Mintz draft version last modified 20:08, 16 April 2008

This white paper is helpful in guiding the next step. Here are a few suggestions.
Item 2 - Sustainability has nuances. Data developed and archived might persist regardless of continued funding, while a system invoked via an interactive Web site might not persist long if funding ceased. Besides identifying factors favoring or working against a system's sustainability, the assessment may want to consider the consequences of its not being sustained. Aspects of this also relate to Item 1.
Item 3 - This (and some other questions) focus on the system or data user. The other side of the coin is the data provider. What can or should be done to accommodate users who need specialized data from providers that are not "hooked into" any of the existing systems and whose data are not available?
Item 6 - This question should be focused and defined a bit. Is the question really aimed only at "the systems that provide primary data intake and archiving"? By "data standards" do you mean what metadata are collected? Variable nomenclature? QA requirements? Other items, or all of the above? Perhaps a list of the main elements of "data standards" should be developed (or identified elsewhere) and linked to this question.
I hope these are helpful.
Sig Christensen (Sig) 18:08, 16 April 2008 (EDT)

External Drivers for Assessment? -- Steve Young (SteveYoung) 17:28, 13 May 2008 (EDT)

I'm going to throw a thought over the transom about the assessment white paper. It strikes me that the paper doesn't seem to mention external drivers that are additional good reasons for the assessment, complementary to the data summit itself. Two such drivers that come to mind are the 2004 NAS report "Air Quality Management in the United States" (PPT overview) and the air quality assessment and forecasting "near term opportunity" discussions in the April 2005 Strategic Plan for the Integrated Earth Observation System and the related September 2006 "Air Quality Assessment and Forecast System: Near-Term Opportunity Plan." For that last piece, see:

It's mostly a matter of taking note of these and perhaps some other drivers and including them in the assessment process...I don't expect it will cause any radical changes in direction but could help make the case. Cheers, Steve