SolutionsUseCase VirtualObservatory neutraltemperature 1a
Return to: Use_Cases
Plain Language Description
Define the use case in plain sentences and, wherever possible, avoid specifying technical solutions or implementation choices. Concentrate on the application aspects of the intended scenario. Also note when the use case may be applicable to more than one application area.
Search for and find a specific type of data in the CEDAR database and plot the data in a way that makes sense for the data type. At present, a scientific user needs to know a lot about the types, locations, and operating modes of instruments (as well as models and indices) to be able to locate, retrieve, and use the data. This use case will demonstrate how ontologies and semantically enabled interfaces can significantly reduce the level of detail that a person has to know about the data.
Describe a scenario of expected use
A verbose (more detailed) description of one instance of a problem to be solved, what resources are generally needed (if known), and what a successful outcome and impact may be. In this case, note who might be expected to do the work or provide the resources and who might be expected to benefit from it. List any performance or metric requirements for this use case and any other considerations that a user would expect.
Definition of Success
Location of relevant data for a suitable time period and visual display of that data as a time series or a two-dimensional representation (contours), e.g. as a function of time and altitude (height) in the atmosphere.
Formal Use Case Description
Use Case Identification
- Use Case Designation
- Use Case Name
- <Insert short name and long name>
- Prepared by:
- Luca Cinquini
- CISL/NCAR as part of the VSTO team
- August 1, 2005
- Version 1.2.a
- Modified by:
- August 1, 2005 Luca Cinquini - initial document
- August 3, 2005, Deborah McGuinness - mark up of language for initial document
- August 5, 2005, Peter Fox - conversion of format
- August 15, 2005, Peter Fox - update to format and content
- August 16, 2005, Luca Cinquini - updated Description section and added Process Model section, took a first stab at VSTO tools section
- August 20-23, 2005, Peter Fox - clarified use-case description, added to vocabulary and VSTO ontology section, added references, preliminary classes (and instances) for instruments and classes for relevant parameters.
- August 25, 2005, Luca Cinquini - slight modification to use case description after phone call conversation with Peter Fox
- March 22, 2007, Peter Fox - conversion to ESIP mediawiki format
The first paragraph is a short description; the second and subsequent paragraphs may contain further details.
Through this use case, the system User locates and identifies datasets (collections of related granules) for use or processing. This process results in the User having access to the subset of datasets in the portal that meet the User's requirements. Individual datasets, and their constituent granules, may then be identified for further action or processing (e.g. visualization, analysis, download).
- 1.Operation succeeds and user obtains QQQ.
- 1.Operation fails to return any XXX. Should instead YYYY.
- 2.Illegal input of AAA. Should instead ZZZZ.
Schematic of Use case
A diagram of how the different elements and people/processes may fit together in the use case (if possible do not refer to specific technologies).
Use Case Elaboration
This section is intended to be completed with the details of the use case that are required for implementation. This section is not intended to be filled in by an application user.
Always identify the primary actors; there may be more than one. Also identify other actors, including any other systems or services that are important. Primary actors are usually the ones that invoke the use case and are beneficiaries of the result. They may be human or computer, and they are actionable. Other actors are those that support the primary actor, i.e. they would not be part of the use case without the tasks, workflow, resources, or requirements implied by the needs of the primary actor.
The actor that initiates this use case is the portal User. Providers may also initiate this use case as a precursor to use case EIE05, Manage Collections/datasets.
Security actor, who authenticates the user requests and issues authorizations for access to relevant data/resources.
- 1.Collection metadata have been entered into the system
- 2.Collection metadata have been validated
- 3.Collection metadata have been published
- 1.Datasets or granules have been identified within the system for further action
- 2.Appropriate action controls (e.g. map, download, process) have been provided to the user to initiate that action.
- 3.Controls are provided to the user to refine the criteria used to 'discover' the dataset.
Normal Flow (Process Model)
- 1)The user selects the 'dataset discovery' tool collection from the user interface
- 2)The user performs a 'simple' search using a simple interface that searches commonly queried dataset attribute fields for matching text/terms.
- 3)The results of the search are presented to the user with appropriate action controls associated with the datasets.
- 4)The user selects one of the action controls to 'use' the identified dataset(s) in a specified action (e.g. visualization, download, processing)
- 1)The user selects the 'dataset discovery' tool collection from the user interface
- 2)The user selects a control that provides access to an advanced search tool that supports spatial, temporal, and parametric search methods. Flow then extends to EIE11-EIE14.
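The 'simple search' step in the normal flow above can be sketched as a plain substring match over commonly queried dataset attribute fields. The field names and records below are hypothetical illustrations, not the actual portal schema:

```python
# Hypothetical dataset records; the actual portal metadata schema differs.
datasets = [
    {"name": "Millstone Hill FPI winds",
     "instrument": "Fabry-Perot interferometer",
     "parameters": ["neutral wind", "neutral temperature"]},
    {"name": "Millstone Hill ISR",
     "instrument": "incoherent scatter radar",
     "parameters": ["electron density"]},
]

def simple_search(query, records):
    """Case-insensitive substring match over all attribute fields."""
    q = query.lower()
    def full_text(rec):
        return " ".join(str(v) for v in rec.values()).lower()
    return [rec for rec in records if q in full_text(rec)]

hits = simple_search("neutral temperature", datasets)
# Only the FPI dataset matches this query.
```

An advanced search (spatial, temporal, parametric) would replace the substring test with structured predicates over typed fields.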
Special Functional Requirements
- <Cluster>.<SubArea>.<number>.<letter+1> something added or a variant.
E.g. AQ.Smoke.1.b something added or a variant
- <Cluster>.<SubArea>.<number>.<letter+2> something added or a variant
- <Cluster>.<SubArea>.<number>.<letter+3> something added or a variant
Use Case Diagram
Other Non-functional Requirements
Overall Technical Approach
The Millstone Hill Fabry Perot interferometer is operated by MIT in cooperation with the University of Pittsburgh. The interferometer is located near the Millstone Hill incoherent scatter radar at latitude 42 degrees 37 minutes North (42.62) and longitude 71 degrees 27 minutes West (-71.45). Mean local solar time differs from UT by -(4 hour 46 minutes). The local magnetic field has a 15 degree variation to the West and an inclination of 72 degrees.
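The quoted offset of mean local solar time from UT follows directly from the site longitude at 15 degrees per hour. A quick check of the arithmetic, using the coordinates given above:

```python
# Mean local solar time differs from UT by longitude / 15 degrees per hour.
# Site longitude taken from the text above; negative = West.
longitude_deg = -71.45  # Millstone Hill, degrees East

offset_hours = longitude_deg / 15.0              # about -4.763 hours
hours = int(offset_hours)                        # -4 (truncates toward zero)
minutes = round(abs(offset_hours - hours) * 60)  # 46

print(f"{hours} h {minutes} min")  # matches the -(4 hour 46 minutes) above
```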
Analysis of the data is based on the methods used at the University of Pittsburgh. The analysis is a three-step process. First, all the data from the frequency-stabilized laser are fit to a parameterized Airy function, producing a table of the instrumental parameters as a function of time throughout the night. Second, a parabolic numerical least-squares fitting process is applied to the nightglow data, based on the measured instrumental parameters. This method gives four parameters: a Doppler shift of the nightglow from the shift in the measured peak, a relative intensity of the nightglow from the signal integrated over the peak, an effective temperature of the neutral atmosphere from the Doppler width of the measured spectrum, and a continuum background signal from the baseline of the profile. In the third analysis step, the Doppler shift of the nightglow line is interpreted in terms of a wind.
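As an illustration of the second step, the sketch below fits a simple Gaussian line profile to a synthetic spectrum and recovers the four parameters named above. This is a stand-in for, not a reproduction of, the actual University of Pittsburgh parabolic least-squares method; all names and numbers are hypothetical.

```python
# Illustrative only: recover (integrated intensity, peak shift, Doppler
# width, continuum background) from a synthetic nightglow spectrum by
# least-squares fitting a Gaussian line on a flat background.
import numpy as np
from scipy.optimize import curve_fit

def line_profile(x, area, center, sigma, background):
    """Gaussian emission line on a flat continuum background."""
    peak = area / (sigma * np.sqrt(2 * np.pi))
    return peak * np.exp(-0.5 * ((x - center) / sigma) ** 2) + background

# Synthetic spectrum: spectral channel vs counts, with Gaussian noise.
x = np.linspace(-10, 10, 200)
true_params = (500.0, 1.2, 2.0, 30.0)   # area, center, sigma, background
rng = np.random.default_rng(0)
y = line_profile(x, *true_params) + rng.normal(0.0, 2.0, x.size)

popt, pcov = curve_fit(line_profile, x, y, p0=(400.0, 0.0, 1.5, 20.0))
area, center, sigma, background = popt
# center -> Doppler shift, area -> relative intensity,
# sigma -> Doppler width (-> neutral temperature), background -> continuum
```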
Log10 relative emission intensity (parameter 2506) is the integration of the fitted line profile over the free spectral range of the instrument. This is only a relative intensity parameter, intended for comparison of intensities during a single night, or perhaps over periods of a week or two. Changes or drifts in sensitivity are not removed from this number, so comparisons between different nights are not advised. An order-of-magnitude estimate of the calibration is 10 of these units per rayleigh; however, this is only a very rough approximation.
Typically, a 5-position scan is used: a vertical measurement and 4 measurements at an elevation of 30 degrees from the horizon, looking either at the cardinal points or at 45 degrees from the cardinal points. Winds are calculated by measuring the difference in line position between the fitted line and a zero-velocity reference. The zero-velocity reference is generated by taking all the fitted lines from vertical measurements and smoothing and interpolating them as a function of time. This assumes that the vertical velocity is small compared to the resolution of the interferometer. For nights in which the quality of the vertical measurements is poor, or in which there are not enough vertical measurements for a good smoothed reference, the vertical measurements may be supplemented by an average of measurements in opposite directions. This assumes that the wind field is uniform over the observation points, i.e. without divergence. The method used to obtain the vertical reference is flagged by KINDAT (7001 and 17001 for vertical measurements only; 7002 and 17002 for combined measurements).
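The wind step reduces to the Doppler relation: the line-of-sight velocity is the fractional shift of the fitted line from the zero-velocity reference, times the speed of light. A minimal sketch, assuming the 630.0 nm OI nightglow line (an assumption; the text above does not name the emission line):

```python
# Line-of-sight wind from the Doppler shift of the fitted line relative
# to the zero-velocity reference. Wavelength values are hypothetical.
C = 299_792_458.0  # speed of light, m/s

def los_wind(lambda_obs_nm, lambda_ref_nm):
    """Line-of-sight velocity (m/s), positive away from the instrument."""
    return C * (lambda_obs_nm - lambda_ref_nm) / lambda_ref_nm

# A 0.0002 nm shift on the 630.0 nm line corresponds to roughly 95 m/s:
print(los_wind(630.0002, 630.0))
```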
Uncertainties in the derived parameters are purely statistical and do not reflect possible systematic errors. The wind uncertainty is calculated from the data by considering the accuracy of determination of the center of gravity of a line. Temperatures and temperature errors are estimated by taking a Fourier cosine transform of the data and fitting the logarithm of the coefficients to a straight line.
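The temperature itself follows from the Doppler width of the line, which scales with the square root of the neutral temperature. A minimal sketch of that relation, assuming the emitter is atomic oxygen observed at 630.0 nm (an assumption, since the text does not name the species):

```python
# Neutral temperature from the Gaussian 1-sigma Doppler width:
#   (sigma_lambda / lambda0)^2 = k_B * T / (m * c^2)
# Constants are standard; species and line are assumptions.
K_B = 1.380649e-23                 # Boltzmann constant, J/K
C = 299_792_458.0                  # speed of light, m/s
M_O = 15.999 * 1.66053906660e-27   # mass of atomic oxygen, kg

def temperature_from_width(sigma_lambda_nm, lambda0_nm):
    """Neutral temperature (K) from the 1-sigma Doppler width of the line."""
    return M_O * C**2 * (sigma_lambda_nm / lambda0_nm) ** 2 / K_B

# A width of ~0.0015 nm on the 630.0 nm line implies roughly 1000 K:
print(temperature_from_width(1.515e-3, 630.0))
```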
When data are taken in the cardinal directions, the winds derived from the measurements are either geographic meridional or zonal. To obtain values of both horizontal wind components at the same time, the measurements are smoothed and interpolated. When data are taken at 45 degrees to the cardinal points, two measurements from orthogonal directions are used to define a vector, from which both geographic and geomagnetic winds are determined. When there are significant latitudinal gradients, the winds are often determined by combining SE with SW, and NE with NW, so as to keep the latitudinal gradients intact.
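The geometry above can be sketched as follows: at 30 degrees elevation the line-of-sight wind is the horizontal wind projected by cos(elevation), assuming zero vertical wind, and two orthogonal azimuths at 45 degrees to the cardinal points define a vector that rotates back to geographic zonal and meridional components. All wind values below are hypothetical.

```python
# Horizontal wind components from two orthogonal line-of-sight winds
# measured at 30 degrees elevation, 45 degrees off the cardinal points.
import math

ELEV = math.radians(30.0)

def horizontal_from_los(v_los):
    """Horizontal wind along the look azimuth, assuming zero vertical wind."""
    return v_los / math.cos(ELEV)

# Hypothetical line-of-sight winds toward azimuth 45 deg (NE) and 135 deg (SE)
v_ne = horizontal_from_los(50.0)
v_se = horizontal_from_los(20.0)

# Rotate the orthogonal (NE, SE) pair back to geographic components:
# component toward az 45  = s*(u + v), toward az 135 = s*(u - v), s = 1/sqrt(2)
s = math.sqrt(0.5)
zonal = s * (v_ne + v_se)       # eastward wind u
meridional = s * (v_ne - v_se)  # northward wind v
```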
References for the instrument and data processing procedures are:
- Biondi et al., Appl. Opt., 24, 232, 1985.
- Hernandez, "Fabry-Perot Interferometers", Cambridge University Press, 1986.