2012 Winter Meeting Poster Session


Posters are listed alphabetically, by submitter last name. Demos and Funding Friday 2011 winners are noted in the titles.

Human Sensor Networks: Use of Social Media and Self Organizing Maps for Automated Detection of Oil Spill Plumes in Satellite Observations (Funding Friday 2011)

Oleg Aulov, Milton Halem, David Lary

Rapid detection of marine oil spills, such as the Deepwater Horizon oil spill in the Gulf of Mexico in April 2010, can save lives, prevent property damage, and help minimize environmental impact. During oil spill disasters, trained satellite analysts at NOAA/NESDIS process satellite observations and manually integrate data from numerous sources to produce a polygonal map that identifies the locations of possible detected oil on the surface of the ocean. These polygon maps are assimilated into an operational Lagrangian trajectory model driven by wind and ocean current data to forecast the movement of the oil. We demonstrate an automated algorithm to detect and map surface oil distributions from satellite observations. We employ a Self Organizing Map (SOM) machine learning algorithm, a type of unsupervised neural network that produces a low-dimensional representation of a higher-dimensional input space while preserving its topological properties. This low-order representation is called a map. Once the map is created, we use social media data from human sensor networks together with other ground observations to determine which cluster represents the oil plume.
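
As an illustration only, the clustering step can be sketched in a few lines of NumPy; the grid size, learning schedule, and synthetic input below are assumptions made for the sketch, not the authors' configuration.

```python
import numpy as np

# Minimal self-organizing map sketch (illustrative only; grid size, learning
# rate, and the synthetic "pixels" are assumptions, not the authors' setup).
rng = np.random.default_rng(0)
n_pixels, n_bands = 10_000, 4             # stand-in for per-pixel satellite features
pixels = rng.random((n_pixels, n_bands))

grid_h, grid_w = 10, 10                    # 2-D map of neurons
weights = rng.random((grid_h, grid_w, n_bands))

for t, x in enumerate(pixels):
    # Best-matching unit: neuron whose weight vector is closest to the input.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)

    # Decaying learning rate and neighborhood radius preserve the map topology.
    lr = 0.5 * np.exp(-t / n_pixels)
    sigma = max(grid_h, grid_w) / 2 * np.exp(-t / n_pixels)

    rows, cols = np.indices((grid_h, grid_w))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    influence = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
    weights += lr * influence * (x - weights)

# Each pixel is then assigned to its best-matching neuron (cluster); ground
# observations and social media reports decide which cluster is the oil plume.
labels = np.array([np.argmin(np.linalg.norm(weights.reshape(-1, n_bands) - x, axis=1))
                   for x in pixels])
```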

Submitted by: Oleg Aulov, University of Maryland Baltimore County, aulov.oleg@gmail.com

Using NASA Remote Sensing Data in a Geographical Information System

Ross Bagwell, Francis Lindsay, Christopher Lynnes, Long Pham, Wenli Yang, Peisheng Zhao, Aijun Chen, MuQun Yang

NASA’s Earth Observing System Data and Information System (EOSDIS) generates more than 2 TB of remotely sensed data each day through multiple space-based instruments and satellite platforms. The Earth Science Data and Information System (ESDIS) Project at the NASA Goddard Space Flight Center (GSFC) is focused on expanding the use of EOS data in GIS applications, for both scientists and the general public, especially when science-quality satellite products are readily obtainable in HDF-EOS format. The primary formats for NASA’s EOS data are NetCDF, HDF4 (HDF-EOS2), and HDF5 (HDF-EOS5), of which the Federal Geographic Data Committee (FGDC) endorses only NetCDF (out of a total of 64 external standards). The benefits of using a GIS include the ability to interrelate multiple types of information assembled from a variety of sources to visualize, query, overlay, and analyze data, making it valuable to a wide range of scientific, academic, and private entities. Some of the issues facing the remote sensing community in using these data include:

  • Most GIS systems do not readily process, or are unable to utilize, NASA Remote Sensing (RS) data
  • Many scientific users utilize specialized software to geolocate images, which presents a problem for interoperability between common systems
  • Headers in data files are not easily read by GIS systems
  • Key NASA datasets are mostly available in either HDF-EOS or NetCDF formats
  • GeoTIFFs cannot be directly created from HDF or NetCDF, creating a multi-step process that is not inherently user friendly, including reprojection, band extraction, and exporting (a minimal conversion sketch follows this list)
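
As a rough illustration of that multi-step conversion, the sketch below uses the GDAL Python bindings; the file name, variable, and output projection are hypothetical and not tied to any specific EOS product.

```python
from osgeo import gdal

# Hypothetical HDF/NetCDF file and subdataset name; real EOS products expose
# their variables as GDAL subdatasets (run `gdalinfo <file>` to list them).
src = 'NETCDF:"example_product.nc":surface_reflectance'

# Step 1: extract the band of interest into a GeoTIFF.
gdal.Translate("reflectance_raw.tif", src)

# Step 2: reproject to a common GIS coordinate system (WGS84 here).
gdal.Warp("reflectance_wgs84.tif", "reflectance_raw.tif", dstSRS="EPSG:4326")
```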

With that in mind, EOSDIS has undertaken a number of steps toward aiding the use of these data by the broader GIS community:

  • Support raster data geometry and integration of EOS data into a GIS, with functions for image processing, modeling, and spatial analysis
  • Leverage relationships throughout the GIS community to enable the use of NASA RS data on the most commonly used platforms, focusing primarily on NetCDF, HDF4, and HDF5
  • Meet the needs of scientists to enable them to use GIS as a tool to augment their work in the various communities by providing them with capabilities to translate, process, or analyze the data in a more common and cost-effective GIS that is more interoperable with other communities
  • Assist the GIS community with understanding how to access and utilize EOS RS data in more commonly known and easily available GIS packages.

Submitted by: Ross Bagwell, NASA Goddard / CTS

Enhanced Collaborative Disaster Management Through Interoperable Data Visualization (Funding Friday 2011)

Karl Benedict (Earth Data Analysis Center, UNM) & Rafael Ameller (StormCenter Inc.)

Rapid access to shared data and information is the key to successful planning and response to disasters. Many of the complex geographic information systems (GIS) used today exist as standalone islands that were not designed to be interoperable. Thus, many of today's advanced systems do not currently work together from an overall mission or joint perspective. Additionally, despite all the advances in satellite and environmental data acquisition, processing, and distribution, the “last mile”, getting the data to the end user, is still the hardest part.

This poster presents the products of a collaboration between StormCenter Inc. and the Earth Data Analysis Center, funded through the ESIP Federation's Funding Friday program, in which general-purpose time-enabled WMS services are packaged in KML for delivery specifically through the Envirocast® Vision™ Collaboration Module (EVCM) developed by StormCenter Inc., and more generally through any client application that supports KML's temporal elements and the WMS access model.
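
A minimal sketch of the KML-wrapping idea is shown below, generated here with Python; the WMS endpoint, layer, bounding box, and time stamps are placeholders rather than the project's actual services.

```python
from xml.sax.saxutils import escape

# Sketch of wrapping a time-enabled WMS layer in KML so that any client with
# TimeSpan support can animate it. Endpoint, layer, bounding box, and times
# below are placeholders, not the project's actual services.
WMS = ("http://example.org/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
       "&LAYERS=flood_extent&SRS=EPSG:4326&BBOX=-110,30,-100,40"
       "&WIDTH=512&HEIGHT=512&FORMAT=image/png&TRANSPARENT=TRUE")

def ground_overlay(timestamp):
    # One GroundOverlay per time step: the client fetches only the map image
    # it needs for the visible time window, not the entire data set.
    href = escape(f"{WMS}&TIME={timestamp}")
    return (f"<GroundOverlay>"
            f"<TimeSpan><begin>{timestamp}</begin></TimeSpan>"
            f"<Icon><href>{href}</href></Icon>"
            f"<LatLonBox><north>40</north><south>30</south>"
            f"<east>-100</east><west>-110</west></LatLonBox>"
            f"</GroundOverlay>")

times = ["2011-09-01T00:00:00Z", "2011-09-02T00:00:00Z", "2011-09-03T00:00:00Z"]
kml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       + "".join(ground_overlay(t) for t in times)
       + "</Document></kml>")

with open("wms_timeseries.kml", "w") as f:
    f.write(kml)
```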

This project has resulted in:

  • Increased integration of Earth Science data products into disaster planning and management through expansion of the data and products that may be integrated into the EVCM
  • Increased system performance in the collaboration environment through the packaging of large data sets (potentially multi-TB in size) into KML with embedded WMS – delivering targeted map images instead of entire data sets
  • Demonstrated the utility of integrating KML-wrapped WMS into the existing system – increasing the utility of published WMS services coming out of the Earth Science community (e.g. NASA NEO, NOAA NGDC)

Submitted by: Karl Benedict, Earth Data Analysis Center, UNM, kbene@edac.unm.edu

Skolr Digital Poster Service: from concept to service

Bruce Caron

The Skolr pilot project at the Summer 2011 ESIP Meeting proved the value of the Skolr concept: serving the needs of meetings by realizing more of the value of their poster sessions. The pilot also tested Drupal as a development framework. The Skolr project is now looking to build a robust digital poster service that any meeting can use to add value to its poster sessions. This poster will outline the lessons learned and the projected path forward to the fully implemented service. Submitted by: Bruce Caron, New Media Research Institute, bruce@tnms.org

Towards Natural Language Programming for Geospatial Analysis

Upendra Dadi, Liping Di

It takes a huge amount of resources to master geospatial analysis software tools. Even after mastering them, users commonly have to refer to software documentation for help in performing operations that are not carried out routinely. Ideally, software should make it easy, especially for those who are already experts in the subject matter being studied, to perform operations without having to spend a large amount of time learning the idiosyncrasies of the software. In this poster I will present some of the work that I am doing to create an ontology-based natural language interface for performing geospatial analysis. The system currently takes simple English sentences as input and turns them into SPARQL queries which are applied over an RDF store. More complex queries can be built by combining several sentences into a "paragraph". On receiving an ambiguous query, the system searches for probable matches and ranks them for the user to choose the best match. This system hints that it may be possible to build data analysis software using natural language programming, at least in a limited domain. Some examples of natural language queries performed over oceanographic data will be presented.
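
As a hedged illustration of the final step, the sketch below runs a SPARQL query of the kind such a sentence might be translated into against a tiny rdflib store; the vocabulary and data are invented for the example and are not the project's ontology.

```python
from rdflib import Graph, Literal, Namespace, URIRef

# Tiny illustrative RDF store; the ocean-data vocabulary here is invented.
EX = Namespace("http://example.org/ocean#")
g = Graph()
station = URIRef("http://example.org/ocean/station42")
g.add((station, EX.parameter, Literal("sea_surface_temperature")))
g.add((station, EX.meanValue, Literal(18.4)))

# The kind of SPARQL a sentence like "list stations measuring sea surface
# temperature" might be translated into by the natural-language front end.
query = """
PREFIX ex: <http://example.org/ocean#>
SELECT ?station ?value WHERE {
  ?station ex:parameter "sea_surface_temperature" ;
           ex:meanValue ?value .
}
"""
for row in g.query(query):
    print(row.station, row.value)
```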

Submitted by: Upendra Dadi, NODC, upendra.dadi@noaa.gov

CropScape

Liping Di, Weiguo Han, Zhengwei Yang, Meixia Deng

This poster/demo will show CropScape, a standards-compliant web service system that analyzes, visualizes, customizes, and disseminates the Cropland Data Layer (CDL) and other geospatial data from USDA. Submitted by: Liping Di, Center for Spatial Information Science and Systems, ldi@gmu.edu

Learning about Climate Change and Human-Health Impacts with the CHANGE Viewer

Sneha Rao and Robert R. Downs

With growing concerns about climate change and its impact on the environment we live in, it is important to develop resources for teachers, students, and the general public to improve their understanding of global climate change.

Sponsored by NASA's Innovations in Climate Education initiative, the Center for International Earth Science Information Network (CIESIN) at the Earth Institute, Columbia University, and the Institute for the Application of Geospatial Technology (IAGT) at Cayuga Community College have developed an interactive tool known as the Climate and Health ANalysis for Global Education (CHANGE) Viewer.

The CHANGE Viewer is built on the NASA World Wind Software Development Kit (SDK) and runs on Windows or Mac computers. Using existing web resources, such as the Population Estimation Service, a Web Processing Service (WPS) provided by the NASA Socioeconomic Data and Applications Center (SEDAC), and the Climate Mapper, a monthly climate observation plug-in developed for SERVIR by the University of East Anglia's Climatic Research Unit (CRU) and IAGT, the CHANGE Viewer enables users to visualize estimates of the number of people, at global and local scales, affected by changes in climatic conditions. Submitted by: Robert Downs, CIESIN, Columbia University, rdowns@ciesin.columbia.edu

MODIS Web Services: Enabling Automated Standard Access to MODIS Science Data

Robert Wolfe, Ed Masuoka, Larry Gilliam, Ali Rezaiyan, Neal Most, Cid Praderas, Greg Ederer, Karen Horrocks, Gang Ye, Asas Ullah, Jeff Schmaltz

The teams that serve MODIS Atmosphere and Land Science Data products at NASA Goddard Space Flight Center are developing and deploying a suite of web services to simplify, standardize, and automate searching for and retrieving MODIS science data products, imagery, and metadata. These services include the OGC-standard Web Coverage Service (WCS), the OGC-standard Web Map Service (WMS), and an OpenSearch.org-compliant search service.
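
As an illustration of how such a service can be scripted, the sketch below issues a standard WMS GetMap request with Python; the endpoint URL and layer name are placeholders, not the actual MODIS service addresses.

```python
import requests

# Placeholder endpoint and layer; the real MODIS service URLs and layer names
# are published by the data teams and are not reproduced here.
WMS_URL = "http://example.gsfc.nasa.gov/modis/wms"
params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "MOD11_LST_Day",           # hypothetical layer name
    "SRS": "EPSG:4326", "BBOX": "-180,-90,180,90",
    "WIDTH": 1024, "HEIGHT": 512, "FORMAT": "image/png",
    "TIME": "2011-07-01",
}
resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("modis_lst_20110701.png", "wb") as f:
    f.write(resp.content)   # save the returned map image
```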

Submitted by: Gregory Ederer, SigmaSpace Corp, gederer@sigmaspace.com

CEOS Atmospheric Composition Portal (Demo)

Stefan Falke, Frank Lindsay, Chris Lynnes, Greg Leptoukh, Oleg Gousev, Severinne Bernonville, Wenli Yang, Peisheng Zhao, James Johnson

The Atmospheric Composition Constellation (ACC) and the Workgroup for Information Systems and Services (WGISS) within the Committee on Earth Observation Satellites (CEOS) are involved in development efforts supporting interoperability among the atmospheric composition research and applications communities. The initial effort has resulted in a website prototype that uses a standards-based framework to provide access to remotely sensed atmospheric composition data, metadata and visualization and analysis tools. We are seeking partnerships with other atmospheric composition community members interested in connecting data products, data analytical tools or other capabilities. Please stop by the demonstration and visit http://wdc.dlr.de/acp/ for more information. Submitted by: Stefan Falke, Northrop Grumman, stefan.falke@ngc.com

An Elemental OPeNDAP Use-Case

Dave Fulker

I focus this poster on the principal motivations and rationale for employing OPeNDAP as a data-provision method, striving to boil the matter down to those points most essential for ESIP members. Submitted by: Dave Fulker, OPeNDAP, dfulker@opendap.org

Demonstrating preservation connections using OAI-ORE (Funding Friday 2011)

Ruth Duerr and Joe Glassy

The Open Archives Initiative - Object Reuse and Exchange (OAI-ORE) protocol was developed to enable the exchange of information about complex e-science objects. The purpose of this project is to determine whether such a protocol is capable of describing and making available information about digital data sets in the Earth sciences. To test these capabilities we intend to exercise OAI-ORE using provenance and context information for NASA's MODIS instrument. We intend to describe the entire suite of available information, as described in the OAIS reference model and further explicated through the USGCRP's report Global Change Requirements for Long-Term Archiving and the developing ESIP Provenance and Context Content Standard, for at least one of the MODIS Level 3 data products and all of its precursor products. The results of this test will be demonstrated through a very simple web site depicting the connections and describing any problems and successes encountered. Submitted by: Joseph Glassy, NTSG/FLBS Univ. Montana, um.glassy@gmail.com
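
As a rough sketch of what an OAI-ORE description might look like, the example below builds a small ORE aggregation with rdflib; the URIs are invented and the product list is only a plausible illustration of a Level 3 product and its precursors, not the project's actual lineage.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS

# OAI-ORE terms plus invented URIs; product names are examples only.
ORE = Namespace("http://www.openarchives.org/ore/terms/")
agg = URIRef("http://example.org/aggregations/mod-level3-snow")
rem = URIRef("http://example.org/rem/mod-level3-snow.rdf")

g = Graph()
g.bind("ore", ORE)
g.bind("dcterms", DCTERMS)

g.add((rem, ORE.describes, agg))   # the resource map describes the aggregation
g.add((agg, DCTERMS.title,
       Literal("MODIS Level 3 product and its precursors (illustrative)")))

for product in ["MOD10A1", "MOD10_L2", "MOD02HKM", "MOD03", "MOD35_L2"]:
    g.add((agg, ORE.aggregates, URIRef(f"http://example.org/products/{product}")))

print(g.serialize(format="turtle"))
```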

Mine Your Data: GLIDER brings data mining to the masses (Demo)

Rahul Ramachandran, Sara Graves, Todd Berendes, Manil Maskey; Information Technology and Systems Center, University of Alabama in Huntsville

Satellite imagery can be mined to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of the ‘mined’ information can vary from military applications, such as detecting assets of interest, to science applications, such as characterizing land-use/land-cover change at local, regional, and global scales. Mining and extracting thematic information from satellite imagery is a non-trivial task that requires a user to perform a complex sequence of steps.

UAHuntsville has developed GLIDER, a freely available tool that simplifies mining of satellite imagery. GLIDER provides a suite of image processing algorithms for imagery enhancement, along with pattern recognition and data mining algorithms for both parametric and non-parametric information extraction. This poster will showcase some of GLIDER’s many features using four case studies. The first case study focuses on the use of false color composites to highlight and distinguish features of interest within satellite imagery, such as smoke. The ability to apply any mathematical formula to different spectral bands and visualize the result is covered in the second use case. The third use case employs unsupervised classification algorithms to segment the image into meaningful classes. The final use case focuses on supervised classification, covering sample selection and creating workflows for training, testing, and the final application to create a thematic map. Submitted by: Sara Graves, Univ of Alabama, Huntsville / DAARWG, sgraves@itsc.uah.edu
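
As a generic stand-in for the unsupervised-classification use case (GLIDER's own algorithms are not reproduced here), the sketch below segments a synthetic image into classes with k-means using scikit-learn.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative stand-in for unsupervised classification; `scene` is a fake
# 3-band image rather than real satellite imagery.
rng = np.random.default_rng(1)
scene = rng.random((200, 200, 3))                # rows x cols x spectral bands

pixels = scene.reshape(-1, scene.shape[-1])      # one row per pixel
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)
classes = km.labels_.reshape(scene.shape[:2])    # thematic map: class per pixel

print(np.bincount(classes.ravel()))              # pixel count per class
```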

ECHO and ISO

Ted Habermann

Metadata from ECHO is being translated to ISO and we need your help! Please stop by! Submitted by: Ted Habermann, National Geophysical Data Center, ted.habermann@noaa.gov

Pre-Mission, Mission and Post Mission Data Management for NASA Field Campaigns

Michael Goodman, NASA Marshall Space Flight Center; Danny Hardin, Matt He, Marilyn Drewry, Michele Garrett, Helen Conover, Will Ellett, Lamar Hawkins, Mary Nair, Sherry Harrison, Tammy Smith, The University of Alabama in Huntsville

Field research campaigns are essential for observing and measuring actual Earth system phenomena and validating computer models that simulate Earth systems. Ultimately, field data have a wide variety of applications in basic and applied research. Due to the nature of data collection during a field campaign, the resulting data sets are discontinuous over the designated geographic region as well as in time. The management of aircraft-based data must take these factors into consideration.

The Global Hydrology Resource Center (GHRC) and IT researchers at the University of Alabama in Huntsville have participated in a number of NASA field campaigns since 1998. For example, the Genesis and Rapid Intensification Processes (GRIP) experiment was a recent NASA Earth science field experiment conducted in summer 2010 to better understand how tropical storms form and develop into major hurricanes. NASA used the DC-8 aircraft, the WB-57 aircraft, and the Global Hawk Unmanned Airborne System (UAS), each configured with a suite of remote sensing instruments, to observe and characterize the life cycle of hurricanes. This campaign capitalized on a number of ground networks, airborne science platforms (both manned and unmanned), and space-based assets.

Due to this history and expected participation in future campaigns, the GHRC is recognized as one of the main NASA data centers for this category of data. At the GHRC, data from successive field campaigns are tied together through common procedures, consistent metadata, and archival systems, making it easy to access data from instruments that have been employed across several missions. These data are also valuable when preparing for new field campaigns.

This poster presents the data management activities and strategies employed prior to the mission, during the mission and after a mission concludes. Submitted by: Danny Hardin, University of Alabama Huntsville, dhardin@itsc.uah.edu

Building a Climatology for Coastal Gap Winds and Resulting Ocean Upwelling Events

Ken Keiser, Xiang Li, Deborah Smith, Bruce Beaumont, Thomas Harper

Orographic gap features near coastlines can concentrate regional winds, resulting in increased wind speeds that can affect the local climate. When these focused jets occur at sea level, they can additionally produce localized cold-water upwelling events that can be of interest to research, commercial, and military users. The DISCOVER team, a NASA MEaSUREs project, has developed an automated intelligent algorithm to detect gap wind and ocean upwelling events at gap locations globally, using the Cross-Calibrated Multi-Platform (CCMP) ocean surface wind product and the Optimally Interpolated Sea Surface Temperature (OISST) product. This algorithm is being used to process historical data with the goal of generating a climatology of past and current identified events. The resulting information is being collected and managed by the Global Hydrology Resource Center (GHRC), a NASA DAAC located in Huntsville, AL. Science expertise on the interpretation of the wind and sea surface temperatures and development of the algorithm is being provided by DISCOVER team members at Remote Sensing Systems in Santa Rosa, CA, and the University of Alabama in Huntsville. This poster presents an overview of the algorithm developed for gap wind and upwelling event detection, its application to selected gap locations around the globe, and the planned approach for providing the resulting climatology to data center customers. Submitted by: Ken Keiser, University of Alabama in Huntsville, keiserk@uah.edu
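
As a simplified stand-in for the detection idea (not the DISCOVER algorithm itself), the sketch below flags days when strong winds coincide with a drop in SST below a running background; the thresholds, window length, and data are illustrative assumptions.

```python
import numpy as np

# Simplistic event-detection sketch with fabricated wind and SST series.
rng = np.random.default_rng(2)
days = 365
wind_speed = rng.gamma(4.0, 2.0, days)           # m/s at the gap exit (fake CCMP-like series)
sst = 26 + rng.normal(0, 0.5, days)              # deg C (fake OISST-like series)
wind_speed[200:210] += 10                        # inject a gap-wind-like episode
sst[200:210] -= 3                                # ...and an upwelling-like SST dip

background = np.convolve(sst, np.ones(30) / 30, mode="same")   # 30-day running mean
events = (wind_speed > 15) & (sst < background - 1.5)          # arbitrary thresholds
print("candidate gap-wind/upwelling days:", np.flatnonzero(events))
```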

Traversing Data Relations Using ESIP Standards (Demo)

Jess Lacy, Ruth Duerr

The ESIP Atom Cast Specification promises a means to discover and access data and services. This demo shows a production application that uses the ESIP Atom Cast Specification to traverse data relations and power a web application. Submitted by: Jess Lacy, National Snow and Ice Data Center, jess.lacy@nsidc.org
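
As a minimal illustration of feed traversal, the sketch below walks an Atom feed and its entry links with the feedparser library; the feed URL is a placeholder, and the link relations printed are simply whatever the cast advertises, not an assertion about the specification's vocabulary.

```python
import feedparser

# Placeholder feed URL; the real cast endpoints are not reproduced here.
feed = feedparser.parse("http://example.org/casts/datasets.atom")

for entry in feed.entries:
    print(entry.title)
    # Follow whatever related resources the cast advertises for this dataset.
    for link in entry.get("links", []):
        print("   ", link.get("rel"), "->", link.get("href"))
```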

Climate Literacy and Energy Awareness Network (CLEAN)

Tamara Shapiro Ledley, Mark S McCaffrey, Anne U Gold, Susan M Buhr, Cathryn A. Manduca, Sean Fox, Karin Kirk, Marian Grogan, Frank Niepold, Susan Lynds, Cynthia Howell

In 2009 the US Global Change Research Program and a consortium of science and education partners concluded that “climate change will bring economic and environmental challenges as well as opportunities, and citizens who have an understanding of climate science will be better prepared to respond to both.” For citizens to achieve that understanding, there is a clear need to support teachers, students, and the public in becoming climate and energy literate, enabling them to make responsible decisions about the environment and energy use for themselves and for society. Pursuing climate and energy literacy, however, requires identifying and accessing educational materials that are scientifically accurate, pedagogically effective, and technically robust, and using them effectively.

The CLEAN Pathway (http://cleanet.org) is a National Science Digital Library (http://www.nsdl.org) project that is stewarding a collection of materials for teaching climate and energy science in grades 6-16. The collection contains classroom activities, lab demonstrations, visualizations, simulations and more. Each resource is extensively reviewed for scientific accuracy, pedagogical effectiveness, and technical quality. Once accepted into the CLEAN collection, a resource is aligned with the Climate Literacy Essential Principles for Climate Science, the AAAS Project 2061 Benchmarks for Science Literacy and other national standards. The CLEAN website hosts a growing collection of currently 300+ resources that represent the leading edge of climate and energy science resources for the classroom.

This poster will describe the avenues by which the CLEAN portal can help educators improve their own climate and energy literacy and effectively integrate the climate and energy principles into their teaching, as well as the review process through which your climate and energy science and technology tools can be made available through the CLEAN portal. Submitted by: Tamara Ledley, TERC / Climate Literacy Network, Tamara_Ledley@terc.edu

Cloud Computing Use Cases

Rick Martin

As cloud computing becomes more commonplace, the technology itself matters less than the value delivered for missions and users. This poster will highlight some ESIP-relevant uses of cloud computing, with linkages to mobile computing. Submitted by: Rick Martin, SAIC, richard.a.martin-2@saic.com

Provenance Collection and Display for the AMSR-E SIPS

H. Conover, B. Beaumont, A. Kulkarni, R. Ramachandran, K. Regner, S. Graves, M. Maskey, D. Conway

This project brings together a team of NASA and university researchers with expertise in NASA Earth science data systems, science algorithm development, and provenance collection/dissemination. The team is applying provenance collection and representation tools to the generation of NASA’s AMSR-E standard products, with an initial focus on sea ice products. The AMSR-E SIPS generates Level 2 and Level 3 data products from AMSR-E observations, which are key data sets for research in both the Climate Variability and Change and Water and Energy Cycle focus areas. Provenance and context will be presented to the AMSR-E data community via an interactive web application. An initial focus on Sea Ice processing has allowed the project to engage the Sea Ice science team and user community in customizing the provenance tools for NASA Earth science data. Submitted by: Manil Maskey, University of Alabama in Huntsville, mmaskey@itsc.uah.edu

Real-time Automated Cloud Classification from Live Webcams

Alexander Matus

In the past decade, webcams have become increasingly popular for environmental monitoring. Due to low manufacturing costs and flexibility, outdoor webcam images are currently freely available online for most locations in the United States. The wide availability of webcam images provides a valuable, low-cost resource for weather observations. This study aims to extract real-time quantitative meteorological data from digital imagery. All images are obtained from the rooftop cameras atop the Atmospheric, Oceanic, and Space Sciences (AO&SS) building at the University of Wisconsin-Madison in Madison, WI. First, a feature mask is applied to filter out the land surface below the skyline. An RGB color histogram is generated from only the full-sky pixels. An algorithm is applied to filter out clear-sky pixels and classify clouds in the image. Finally, a cloud type classification is performed based on the spectral signature of the RGB color histogram, and the classification is validated against satellite measurements. The entire process is fully automated. Sky classification of this kind is a fast, inexpensive, and robust form of weather observation, and the technique can be applied to any webcam image to perform a quick classification of sky conditions. Submitted by: Alexander Matus, Digital Earth Watch, amatus@wisc.edu
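
As a simplified illustration of the masking and color-ratio steps, the sketch below estimates a cloud fraction from a synthetic RGB frame; the skyline mask and the red/blue threshold are assumptions, not the study's calibrated values.

```python
import numpy as np

# Illustrative sky/cloud separation on an RGB webcam frame; mask and threshold
# are placeholders rather than the study's calibrated values.
rng = np.random.default_rng(3)
frame = rng.integers(0, 256, (480, 640, 3)).astype(float)   # fake RGB image

sky_mask = np.zeros(frame.shape[:2], dtype=bool)
sky_mask[:300, :] = True                     # crude stand-in for the skyline mask

r, b = frame[..., 0], frame[..., 2]
ratio = r / np.maximum(b, 1.0)               # clear sky is strongly blue (low R/B)
cloudy = sky_mask & (ratio > 0.8)            # threshold is illustrative only

cloud_fraction = cloudy.sum() / sky_mask.sum()
print(f"estimated cloud fraction: {cloud_fraction:.2f}")
```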

Reference Model for Disaster Management

Karen Moe and John Evans

The Committee on Earth Observation Satellites (CEOS), as the satellite arm of GEOSS, provides decision makers access to remote sensing products in support of disaster management. The proposed reference model provides an enterprise perspective for managing distributed systems and services for disaster management. (Poster presented at the AGU 2011 Fall Meeting.) Submitted by: Karen Moe, NASA ESTO, karen.moe@nasa.gov

ESIP Teacher Workshops

Margaret Mooney and Nina Jackson

This poster will map out the history and accomplishments of the ESIP teacher workshops. Submitted by: Margaret Mooney, CIMSS/SSEC/UW-Madison, margaret.mooney@ssec.wisc.edu

GEO User Requirements Registry (Demo)

Gary Foley, EPA; Hans-Peter Plag, University of Nevada, Reno; Gregory Ondich, Justin Kaufman, and Ric Blackman, SCG, Inc.

To achieve its goal to be user-driven, building the Global Earth Observation System of Systems (GEOSS) must be guided by a set of explicitly known user needs. At the core of GEOSS is the GEOSS Common Infrastructure (GCI), which includes registries that enable users of Earth observations (EOs) to search, discover, access, and use the data and services available through GEOSS. Three of these registries focus primarily on the contributors to GEOSS. The fourth registry, the User Requirements Registry (URR), is a database for the collection, sharing, and analysis of user needs and EO requirements. The URR also provides a means for efficient dialog between users and providers. The URR is a comprehensive database describing an array of user data, such as user types, applications, requirements, research needs, and technology needs. The novel concept of the URR is the information captured in its Links form, where relationships between entries in the other forms can be published, including descriptions of the societal benefits of each link and its implementation status. Submitted by: Gregory Ondich, SCG, gondich@scgcorp.com

How to Cite an Earth Science Data Set

Mark A. Parsons and the Preservation and Stewardship Cluster

Creating a great data set can be a life’s work (consider Charles Keeling). Yet, scientists do not receive much recognition for creating rigorous, useful data. At the same time, in a post “climategate” world there is increased scrutiny on science and a greater need than ever to adhere to scientific principles of transparency and repeatability. The Council of the American Geophysical Union (AGU) asserts that the scientific community should recognize the value of data collection, preparation, and description and that data “publications” should “be credited and cited like the products of any other scientific activity.”

Currently, however, authors rarely cite data formally in journal articles, and they often lack guidance on how data should be cited. The Federation of Earth Science Information Partners (ESIP) Preservation and Stewardship Cluster has been working on this issue for some time and has begun to address some of the challenges.

Overall, scientists and data managers have a professional and ethical responsibility to do their best to meet the data publication goals asserted by AGU. This talk outlines a data citation approach to increase the credit and credibility of data producers. Submitted by: Mark Parsons, National Snow and Ice Data Center, parsonsm@nsidc.org

Create Collaboratories for Earth Science using Talkoot (Demo)

Rahul Ramachandran, Manil Maskey, Ajinkya Kulkarni, Helen Conover, U. S. Nair, S. Movva

Advances in technology allow different research groups and institutions to use new software tools to build and support virtual collaborations and infuse open science. The infusion of these tools into science processes can now enable sharing and publishing of digital scientific artifacts and dramatically improve knowledge sharing among researchers. These new tools offer the potential to create new virtual research collaboration platforms. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. This poster describes Talkoot, a software toolkit designed and developed by the authors. Talkoot provides Earth Science researchers with a ready-to-use knowledge management environment and an online platform for collaboration. It gives researchers a means to systematically gather, tag, and share their data, analysis workflows, and research notes. These features are designed to assist rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium-sized groups and research centers, as well as large enterprises such as national laboratories and federal agencies. Submitted by: Rahul Ramachandran, ITSC, rramachandran@itsc.uah.edu

Engaging Climate Change Learners in Public School Settings (Funding Friday 2011)

Jesse Roberts, University of Wisconsin - Madison

Teaching students about energy, carbon emissions, and climate change requires a fundamental knowledge of what we use energy for in our society. It also requires that students understand the relative magnitudes of our current energy sources and the true environmental cost of using each source. Often, this information is lacking, as it falls outside traditional classroom standards for earth science. A short card game is being developed which tries to convey what we use energy for in the world and where that energy comes from. It is hoped that after only a few games students will have a better appreciation for where our energy comes from and where it goes, allowing for a more nuanced discussion of energy efficiency goals and renewable energy development. This FUNding Friday project pulls data from US, EU, and UN sources and offers a fine-grained breakdown of energy use by sector. Students are able to play the world as-is, as well as make policy decisions (e.g., banning SUVs) and explore the costs and benefits of doing so. Submitted by: Jesse Roberts, University of Wisconsin - Madison, joroberts@wisc.edu

Linking Open Research Data for Earth and Space Science Informatics (Funding Friday 2011)

Eric Rozell and Tom Narock

Earth and Space Science Informatics (ESSI) is inherently multi-disciplinary, requiring close collaborations between scientists and information technologists. Identifying potential collaborations can be difficult, especially with the rapidly changing landscape of technologies and informatics projects. The ability to discover the technical competencies of other researchers in the community can help in the discovery of research partnerships. In addition to collaboration discovery, this data can be used to analyze trends in the field, which will help project managers identify emerging, irrelevant, and well-established technologies and specifications. This information will help keep projects focused on the technologies and standards that are actually being used, making them more useful to the ESSI community. We present a two-part solution to this problem: a pipeline for generating structured data from ESSI abstracts and an API and Web application for accessing the generated data. We use a Natural Language Processing (NLP) technique, Named Entity Disambiguation, to extract information about researchers, their affiliations, and technologies they have applied in their research. The extracted data is encoded in the Resource Description Framework using Linked Data vocabularies, including the Semantic Web for Research Communities ontology and the Friend-of-a-Friend ontology. The data is exposed in four ways: a SPARQL query-able endpoint, linked data, Java APIs, and a Web application. We also capture the provenance of the data transformations using the Proof Markup Language, including confidence scores from the NLP algorithms used. Our implementation has used only open source solutions, including DBPedia Spotlight and OpenNLP. We plan to set up an open source project for this work so that it can continue to evolve through community contributions. Submitted by: Eric Rozell, Rensselaer Polytechnic Institute, rozele@rpi.edu
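
As an illustration of the SPARQL access path, the sketch below queries a hypothetical endpoint with SPARQLWrapper using the FOAF vocabulary named in the abstract; the endpoint URL and the specific property used are assumptions, not the project's actual service or schema.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint; the project's actual service location is not asserted.
sparql = SPARQLWrapper("http://example.org/essi/sparql")
sparql.setQuery("""
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?name WHERE {
  ?person a foaf:Person ;
          foaf:name ?name ;
          foaf:topic_interest <http://dbpedia.org/resource/SPARQL> .
} LIMIT 10
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print the names of researchers associated with the queried technology.
for binding in results["results"]["bindings"]:
    print(binding["name"]["value"])
```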

Digital Earth Watch (Funding Friday 2011)

Annette Schloss, Jeff Beaudry, John Pickle, Fabio Carrera

Digital Earth Watch (DEW) involves individuals, schools, organizations, and communities in systematic monitoring of their local environment, especially vegetation health. The program offers people the means to join the Picture Post network and to study and analyze their own findings using DEW software. A Picture Post is an easy-to-use and inexpensive platform for repeatedly taking digital photographs as a standardized set of images of the entire 360° landscape, which can then be shared over the Internet on the Picture Post website. This simple concept has the potential to create a wealth of information and data on changing environmental conditions, which is important for a society grappling with the effects of environmental change. Picture Posts may be added by anyone interested in monitoring a particular location. The value of a Picture Post is in the commitment of participants to take repeated photographs - monthly, weekly, or even daily - to build up a long-term record over many years. This poster will show examples of Picture Post pictures being used to capture seasonal plant phenological events and a community project restoring a pond shoreline. DEW is being developed by a collaborative effort led by the University of New Hampshire with the Federation of Earth Science Information Partners, the University of Southern Maine, and Worcester Polytechnic Institute. We invite individuals, schools, informal education centers, groups, and communities to join: visit us at http://picturepost.unh.edu. Submitted by: Annette Schloss, Univ New Hampshire, annette.schloss@unh.edu

NASA's Global Change Master Directory: Discover and Access Earth Science Data Sets, Related Services, and Climate Diagnostics

Alicia Aleman, Lola Olsen, Scott Ritz, Michael Morahan, Laurel Cepero, Tyler Stevens

NASA's Global Change Master Directory (GCMD) provides the scientific community with the ability to discover, access, and use Earth science data, data-related services, and climate diagnostics worldwide. The GCMD offers descriptions of Earth science data sets using the Directory Interchange Format (DIF) metadata standard; Earth science related data services are described using the Service Entry Resource Format (SERF); and climate visualizations are described using the Climate Diagnostic (CD) standard. The DIF, SERF, and CD standards each capture data attributes used to determine whether a data set, service, or climate visualization is relevant to a user's needs. Metadata fields include: title, summary, science keywords, service keywords, data center, data set citation, personnel, instrument, platform, quality, related URL, temporal and spatial coverage, data resolution, and distribution information. In addition, nine valuable sets of controlled vocabularies have been developed to assist users in normalizing the search for data descriptions. An update to the GCMD's search functionality is planned to further capitalize on the controlled vocabularies during database queries. By implementing a dynamic keyword "tree", users will have the ability to search for data sets by combining keywords in new ways. This will allow users to conduct more relevant and efficient database searches to support the free exchange and re-use of Earth science data. http://gcmd.nasa.gov/ Submitted by: Tyler Stevens, NASA Global Change Master Directory, Tyler.B.Stevens@nasa.gov

Retrospective analog year analyses using NASA satellite data, a metric of improvements to USDA world agricultural estimates

William Teng, Harlan Shannon

The USDA World Agricultural Outlook Board (WAOB) is responsible for monitoring weather and climate impacts on domestic and foreign crop development. One of WAOB’s primary goals is to determine the net cumulative effect of weather and climate anomalies on final crop yields. To this end, a broad array of information is consulted, including maps, charts, and time series of recent weather, climate, and crop observations; numerical output from weather and crop models; and reports from the press, USDA attachés, and foreign governments. The resulting agricultural weather assessments are published in the Weekly Weather and Crop Bulletin, to keep farmers, policy makers, and commercial agricultural interests informed of weather and climate impacts on agriculture. Because both the amount and timing of precipitation significantly impact crop yields, WAOB often uses precipitation time series to identify growing seasons with similar weather patterns and help estimate crop yields for the current growing season, based on observed yields in analog years. Although, historically, these analog years are identified through visual inspection, the qualitative nature of this methodology sometimes precludes the definitive identification of the best analog year. One goal of this study is to introduce a more rigorous, statistical approach for identifying analog years. This approach is based on a modified coefficient of determination, termed the analog index (AI). The derivation of AI will be described. Another goal of this study is to compare the performance of AI for time series derived from surface-based observations vs. satellite-based measurements (NASA TRMM and other data). Five study areas and six growing seasons of data were analyzed (2003-2007 as potential analog years and 2008 as the target year). Results thus far show that, for all five areas, crop yield estimates derived from satellite-based precipitation data are closer to measured yields than are estimates derived from surface-based precipitation measurements. Work is continuing to include satellite-based surface soil moisture data and model-assimilated root zone soil moisture. This study is part of a larger effort to improve WAOB estimates by integrating NASA remote sensing observations and research results into WAOB’s decision-making environment. Submitted by: William Teng, NASA GES DISC (Wyle IS), William.I.teng@nasa.gov
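
Since the abstract does not give the analog index formula, the sketch below uses a plain coefficient of determination between cumulative precipitation series as an illustrative stand-in for scoring candidate analog years; the data are fabricated placeholders.

```python
import numpy as np

# Illustrative analog-year scoring with fabricated precipitation series; the
# score below is ordinary R^2, not the study's modified analog index (AI).
rng = np.random.default_rng(4)
weeks = 26
target_2008 = np.cumsum(rng.gamma(2.0, 10.0, weeks))            # target season

candidates = {yr: np.cumsum(rng.gamma(2.0, 10.0, weeks)) for yr in range(2003, 2008)}

def r_squared(candidate, target):
    # Treat the candidate year's series as a "prediction" of the target year.
    ss_res = np.sum((target - candidate) ** 2)
    ss_tot = np.sum((target - target.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

scores = {yr: r_squared(series, target_2008) for yr, series in candidates.items()}
best = max(scores, key=scores.get)
print("best analog year (by this illustrative score):", best, round(scores[best], 3))
```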

NEON: Transforming Environmental Data into Information for Societal Benefit

Brian Wee

The National Ecological Observatory Network (NEON), or the Observatory, is an NSF-funded national investment in physical and information infrastructure. The Observatory’s goal is to enable understanding and forecasting of the impacts of climate change, land use change, and invasive species on continental-scale ecology by providing physical and information infrastructure to support research, education, and environmental management in these areas. NEON provides vetted and authoritative data and information to scientists, educators, decision makers, and the public on how land use, climate change, and invasive species affect biodiversity, disease ecology, and ecosystem processes. NEON high-level data products are designed to enable ecological forecasts and analyses at a continental scale and facilitate the observation of decadal-scale changes against a background of seasonal-to-interannual variability. We foresee that NEON’s partners will utilize these products as input to advanced models that will help inform resource management, socio-economic analyses, environmental risk management, and decision support for climate change mitigation and adaptation. Submitted by: Brian Wee, NEON, Inc., bwee@neoninc.org

Development and Implementation of NASA ISO Geographic Metadata

Benjamin White, Evelyn Nakamura, Ted Habermann

The International Organization for Standardization Technical Committee on Geographic Information/Geomatics (ISO/TC 211) standard for geographic metadata, ISO 19115, was finalized in 2003. Since then, its use has grown both within the US and globally. After an extensive review process, NASA's Earth Science Data and Information System (ESDIS) Project determined that its stakeholders would benefit if it were also to adopt ISO 19115 (Aleman, et al. 2011). This poster explores the impacts of adopting ISO 19115, how the standard's implementation will affect the flow of metadata within ESDIS, and some of the initial steps being explored in the implementation process. Submitted by: Benjamin White, Raytheon, Benjamin.White-NR@raytheon.com

Geoportal Server & Portal for ArcGIS: Disambiguation

Christine White, Esri

'Portal' has been a buzzword of late, and there are many projects using geoportal technology. This poster describes two well-known Esri portal products: the Esri Geoportal Server and Portal for ArcGIS. It will outline the features of each, use cases for choosing between them, and important example implementations. Submitted by: Christine White, Esri, cwhite@esri.com

Picture Post Newsletter: An Opportunity for Outreach

Haley F Wicklein, Annette L Schloss

The Picture Post environmental monitoring project is a citizen science initiative funded by NASA to create opportunities for informal and formal science educators and the community-at-large to collaborate by sharing digital photographs from Picture Post sites. The purpose of our project was to develop a newsletter to serve the Picture Post community. The newsletter includes information about the Picture Post website, examples of how members are using images from their Posts, and a space for the community to respond and offer suggestions. The newsletter arose from requests for feedback by Picture Post users, and will provide an opportunity for outreach to and interaction within the community. Submitted by: Haley Wicklein, Digital Earth Watch, hwicklein@abermail.sr.unh.edu

Service, Dataset, and Event Casting (Demo)

B. Wilson, G. Manipon, A. Kulkarni, R. Ramachandran, K. Keiser, S. Graves

This demo shows a variety of casting and discovery interfaces: 1) a smart authoring tool for writing service casts; 2) a one-stop search box for smart discovery of datasets and services across casts, GCMD, and ECHO; and 3) faceted navigation and drill-down for Earth Science datasets. Submitted by: Brian Wilson, Jet Propulsion Lab, Brian.Wilson@jpl.nasa.gov

Towards a Domain Specific Software Architecture for Scientific Data Distribution

Anne Wilson, Doug Lindholm

A reference architecture is a ‘design that satisfies a clearly distinguished subset of the functional capabilities identified in the reference requirements within the boundaries of certain design and implementation constraints, also identified in reference requirements’ [Tracz, 1995]. Recognizing the value of a reference architecture, NASA’s ESDSWG Standards Process Group (SPG) is introducing a multi-disciplinary science data systems (SDS) reference architecture in order to provide an implementation-neutral, template solution for an architecture to support scientific data systems in general [Burnett, et al., 2011]. This reference architecture describes common features and patterns in scientific data systems, and can thus provide guidelines for building and improving such systems. But guidelines alone may not be sufficient to actually build a system.

A domain specific software architecture (DSSA) is ‘an assemblage of software components, specialized for a particular type of task (domain), generalized for effective use across that domain, composed in a standardized structure (topology) effective for building successful applications’ [Tracz, 1995]. It can be thought of as a relatively specific reference architecture. The ‘DSSA Process’ is a software life cycle developed at Carnegie Mellon’s Software Engineering Institute that is based on the development and use of domain-specific software architectures, components, and tools. The process has four distinct activities: 1) develop a domain-specific base/model, 2) populate and maintain the library, 3) build applications, and 4) operate and maintain applications [Armitage, 1993]. The DSSA process may provide the missing link between guidelines and actual system construction.

In this presentation we focus specifically on the realm of scientific data access and distribution. Assuming the role of domain experts in building data access systems, we report the results of creating a DSSA for scientific data distribution. We describe the resulting domain model and our efforts towards building a heterogeneous, multi-‘vendor’ architecture framework for data distribution based on that model. We draw on experiences and lessons learned supporting data access and distribution for multiple projects having common functionality but also unique details.

References:
[Armitage, 1993] Armitage, James, ‘Process Guide for the DSSA Process Life Cycle’, Software Engineering Institute, paper 240, http://repository.cmu.edu/sei/240, December 1993.
[Burnett, et al., 2011] Burnett, Michael, Weiss, Barry, and Law, Emily, ‘NASA’s ESDS Reference Architecture’, AGU Fall Meeting, San Francisco, CA, December 2011.
[Tracz, 1995] Tracz, Will, ‘DSSA (Domain Specific Software Architecture) Pedagogical Example’, ACM SIGSOFT Software Engineering Notes, V20 N3, July 1995.

Submitted by: Anne Wilson, LASP, anne.wilson@lasp.colorado.edu

Meta-Analysis of User Needs for Precipitation Data

Erica Zell, Adam Carpenter, Stephanie Weber

Precipitation data is needed for a diverse range of users and applications, and was highlighted as a priority observation for all Group on Earth Observation (GEO) Societal Benefit Areas analyzed under GEO Task US-09-01a. Many users have come to rely on precipitation data, whether historical data, near real-time data, or forecasts, and whether measured via rain-gauges, ground-based radars, or satellites. The users of precipitation data range from large hydro-meteorological services monitoring and forecasting weather, to health officials forecasting malaria outbreaks and private sector insurance specialists helping farmers manage their risk. Our study team conducted a literature review and engaged with GEO Communities of Practice and other organizations such as the World Meteorological Organization to assess users’ required characteristics (e.g., spatial and temporal resolution, accuracy, and timeliness) of precipitation data. The study results identify commonalities in need across user types, and test case study scenario configurations of observing systems to meet common needs. This analysis serves as a prototype in collecting user needs for a given observation, and could be expanded to include a larger number of user group consultations and/or to focus on observation priorities other than precipitation. Submitted by: Erica Zell, Battelle, zelle@battelle.org