UsabilityCluster/MonthlyMeeting/2017-02-01 MeetingNotes
Attendees: Ruth Duerr, Bob Downs, Nancy Hoebelheinrich, Reid Boehm, Tamar Norkin, Ward Fleri, Shannon Rauch, Bruce Caron, Madison Langseth, Sophie Hou
1. Recap of ESIP Winter Meeting and Usability Cluster Session
- Three different use cases were tested.
- Data Management Training Clearinghouse was tested using a cognitive walkthrough.
- The presentation for this evaluation is available at the following link: http://wiki.esipfed.org/images/c/c6/ESIP_WinterMeeting_UsabilitySession_2017.pdf
- Feedback for this demonstration included questions about how to pair this technique with others and who should be responsible for performing the evaluation.
- Data Conservancy Packaging Tool was tested using a virtual user study.
- Feedback for this demonstration included: having an expressive user greatly enhances the effectiveness of the test; combining multiple evaluation techniques can highlight/confirm different types of usability issues; and it is helpful to explore how the user's mental model maps to the design of the system.
- BCO-DMO was tested using a moderated group user study.
- This is not an “official” technique; it is a technique that Sophie and Nancy H. (Chair of the Data Management Training Working Group) adapted by combining focus group/interview techniques with a user study technique.
- In addition to selecting a user to think aloud, a feedback form was created to ask attendees to record their own observations and reactions.
- While we were able to collect additional feedback through the moderated, group user study, the moderator noticed that some actions/observations might have been missed. As a result, if this technique were to be used again, additional observers might need to be available to record the feedback.
- Question: Was this technique effective/helpful for BCO-DMO?
- Answer: Yes; it would of course have been nice if more time had been provided, but the experience showed that many usability issues could be discovered with quick tests.
2. Wireframe/Mock-Up/Prototype Presentation
- Three different ways that results from user studies could be evaluated and prioritized for implementation or further studies were presented.
- Wireframe:
- Key characteristics: low fidelity, cheaper/easier/less time consuming to create, and can help with quick demonstrations of ideas.
- Mockup:
- Key characteristics: mid to high fidelity, still relatively cheap/easy to create, and might be visual enough to support a simple user study.
- Paper prototype:
- Key characteristics: similar to mockup, but can be more comprehensive to represent more areas of the system.
- Prototype:
- Key characteristics: mid to high fidelity, can be expensive/time consuming to create, but can be helpful for interactive user studies.
- Question: Would a test site be considered a prototype?
- Answer: Yes.
- Question: Experiences with these techniques?
- Bruce - Paper prototyping
- Nancy - Wireframing/mocking up different input forms helped clarify the different understandings of the terms used.
3. Brainstorm of Data Archive/Repository Service Areas that Could Benefit from Usability Evaluations
- Types of roles/personas:
- Producer (P)
- An entity that submits data to an archive/repository.
- User (U)
- An entity that uses data from the archive/repository for other purposes.
- Assessor (A)
- An entity that reviews data from the archive/repository to determine the performance of the archive/repository.
- Data archive/repository operator (O)
- An entity that works with data in the archive/repository to manage/sustain the archive/repository.
- Areas:
- Home page - PUAO
- Search - UAO
- Searching for data using a geospatial/map interface - UAO
- Browse - UAO
- Data ingest forms, including metadata input (initial information) - PO
- Metadata development (tools for editing, updating, managing, and curating metadata) - PO
- Adding new components to a pre-existing user interface - O(PAU) (persona assignment ended here)
- Help documents
- Registration to a site or service
- Downloading data (accessing data)
- Access to an identified resource (e.g., dataset, software, etc.)
- Dataset landing page; Collection landing page
- Contact request forms and such (i.e., asking for help on something)
4. Solicitations of possible use cases and speakers
- Usathon on a NASA data website (NASA cannot solicit this type of information from users, but we could act as users for NASA).