UsabilityCluster/MonthlyMeeting/2016-11-02 MeetingNotes


Attendees: Mike Chapman, Bob Dattore, Danie Kinkade, Annie Burgess, Ken Casey, Bob Downs, Ruth Duerr, Shannon Rauch, Tamar Norkin, Ward Fleri, Madison Langseth, Sophie Hou


1. Discussion Questions for Usability Testing (from October)

  • Bob Downs: Familiar with focus groups and usability testing, both as the administrator and as a subject. Using focus groups might be a less expensive way to go, especially if you can get together a group of people (fewer than 8) to complete tasks and discuss them. Since there would just be a one-time administration cost, it could be cheaper than engaging individuals separately to assess their behavior and the usability of the tools.
  • Question: What would be a recommended setup for a focus group?
  • Answer: Performing tasks on the spot has the advantage of immediate feedback. If participants perform the tasks in advance, they may forget things, although they might remember the most salient points because those will stick out in their minds. If you can get them to do the tasks during the focus group session, even small issues can be mentioned and can start a discussion, which could turn into suggestions for improvement. Even though focus groups are a lot cheaper, there are also disadvantages: groupthink or contagion - an individual may mention something, and others may want to jump on that issue. In doing so, other issues might be neglected. A particular member of the group may have a recommendation and others may simply agree without providing further feedback, whereas with individual user testing you will get more independent comments based on each person’s own experience.
  • Sophie: With the DMT Clearinghouse, which was created in collaboration with USGS and DataONE, we performed usability testing sessions before launching. We didn’t do individual testing due to lack of time and resources; instead, we ran two separate group usability testing sessions, one at the DataONE User Group meeting and one at the ESIP Summer Meeting. At each session, we asked people to go through the tasks as a group, but we also moderated the sessions. It was really important to give each attendee an opportunity to provide his/her own feedback in a round-robin fashion to help avoid groupthink. This also gave each user an opportunity to say whether they agreed or disagreed. We found this hybrid focus group/usability testing mode to be quite positive because we had people going back and forth discussing whether the suggestions would be beneficial to the others in the group. Sophie originally had concerns about how well the method would work, but it really did work well.


2. Heuristic Evaluation (Sophie)

  • Also known as a Usability Audit. This evaluation technique does not involve users; instead, a usability expert (someone familiar with usability issues, the interface, and usability principles) performs the inspection.
  • The principles used in the evaluation have been developed over time by usability experts. They are good practices and guidelines for interface usability.
  • 3 steps:
  • Planning - Determining what you would like to evaluate, gathering the set of principles, and finding the usability experts to apply the principles.
  • Inspection - typically 10-30 minutes, up to an hour or more for more complicated systems.
  • Reviewing the findings and reporting on improvements or actions to be taken (see the sketch at the end of this section).
  • 10 Principles (Jakob Nielsen)
  • Visibility of system status.
  • Match between system and the real world (e.g. error messages shown only as program codes that many users don’t understand would not be very user friendly).
  • User control and freedom (i.e. having interaction with system that is natural to users’ workflow).
  • Consistency and standards (i.e. making sure everything in the system is described/presented coherently; don’t use multiple labels for the same action).
  • Error prevention (e.g. if a data entry field is expecting specific values, make the expected vocabularies available).
  • Recognition rather than recall (i.e. making sure that the workflow is clear and that people don’t need to think about what the system is asking them to do).
  • Flexibility and efficiency of use (e.g. if there are ways to provide shortcuts, such as saving completed work and jumping back to a particular section, that would be helpful).
  • Aesthetic and minimalist design
  • Help users recognize, diagnose, and recover from errors (i.e. error messages should state the problem in plain language and suggest how to recover).
  • Help and documentation
  • There are a number of other sets of usability heuristic principles (Arnie Lund, Bruce Tognazzini, Ben Shneiderman).
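  • A minimal sketch (in Python, illustrative only) of how inspection findings might be recorded and summarized against the principles; the field names, severity scale, and example entries below are assumptions for illustration, not part of any standard:

    # Illustrative only: one possible way to record heuristic evaluation findings.
    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Finding:
        principle: str  # which heuristic principle the issue relates to
        location: str   # where in the interface the issue was observed
        severity: int   # assumed scale: 0 = not a problem ... 4 = blocks the task
        note: str       # evaluator's description of the issue

    # Example entries (hypothetical severities and wording).
    findings = [
        Finding("Aesthetic and minimalist design", "landing page", 3,
                "Cluttered layout adds unnecessary cognitive load"),
        Finding("Match between system and the real world", "error dialog", 2,
                "Error message shows only a numeric program code"),
    ]

    # Report: list issues from most to least severe, then count per principle.
    for f in sorted(findings, key=lambda f: f.severity, reverse=True):
        print(f"[severity {f.severity}] {f.principle}: {f.note} ({f.location})")
    print("Issues per principle:", dict(Counter(f.principle for f in findings)))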


3. Cognitive Walkthrough (Sophie)

  • Can be performed by the development team, and doesn’t need to involve users or a usability expert (although experts are always helpful).
  • Looks at a series of tasks in a workflow and asks, “Do these tasks or steps make sense?”
  • The person performing the cognitive walkthrough needs to take the perspective of the user.
  • Process developed by Wharton, Rieman, Lewis, and Polson.
  • Procedure:
  • Define the users and the goals.
  • Define the steps that the users would attempt and the criteria for success for each task.
  • Go over the tasks step-by-step through the lens of the user with the actual interface, and ask the following questions:
  • Will the user try to achieve the right effect?
  • Will they notice that the correct action is available?
  • Will the user associate the correct action with the effect to be achieved?
  • If the correct action is performed, will the user see that progress is being made?
  • Compare the success criteria with the actual steps (see the sketch below).
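  • A minimal sketch (in Python, illustrative only) of how the four walkthrough questions might be recorded per step and compared against the success criteria; the example step, criterion, and answers are hypothetical:

    # Illustrative only: recording cognitive walkthrough answers per task step.
    QUESTIONS = [
        "Will the user try to achieve the right effect?",
        "Will the user notice that the correct action is available?",
        "Will the user associate the correct action with the effect to be achieved?",
        "If the correct action is performed, will the user see that progress is being made?",
    ]

    # One record per step: description, success criterion, and a yes/no answer
    # to each question (the example step and answers are hypothetical).
    walkthrough = [
        {
            "step": "Locate the data download link on the landing page",
            "criterion": "User clicks the correct download link",
            "answers": [True, False, True, True],
            "note": "Link label is generic, so users may not notice it",
        },
    ]

    # Compare the success criteria with the recorded answers: any "no" flags a
    # potential usability problem to review for that step.
    for record in walkthrough:
        failed = [q for q, ok in zip(QUESTIONS, record["answers"]) if not ok]
        status = "OK" if not failed else "REVIEW"
        print(f"{status}: {record['step']} (criterion: {record['criterion']})")
        for q in failed:
            print(f"  - {q} -> {record['note']}")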


4. Example for USGS (Madison)

  • Presentation Title: Sciencebase Data Release Landing Pages Usability Study
  • Feedback from users and the related heuristics principles:
  • “Cluttered” screen layout = unnecessary cognitive load
  • This echoes the heuristic principle of using aesthetic and minimalist design.
  • Where is the data?
  • The label and positioning of the “related items” panel are not easily recognized by the users because users are expecting to see “attached files”. Also, in general, people are looking towards the left side of the page.
  • This shows that having a design that is consistent with users’ expectations can help with their interaction with the system.
  • Non-Descriptive, generic labeling
  • Labels that do not match the users’ natural language can make it difficult for users to understand the purpose/functions of the system.
  • In general, USGS’ study showed that:
  • Aesthetic and minimalist design could help users in comprehending the system.
  • Labels should be descriptive, consistent, and use language that is natural to users.
  • Question: What are the compromises/trade-offs that the usability team had to make with the web design/development team?
  • Answer: Technical capabilities of the system might sometimes restrict the types of improvements that we could actually implement. For example, due to the diversity of the types of resources that are in Sciencebase, a specific “dataset” label could not be applied to all the resources. Additionally, if the system is not inherently good at presenting hierarchical information, it could also be challenging to show individual items that are related to the collection.
  • Comments:
  • Ken and Ruth also echoed that the underlying system design/architecture could affect the types of usability issues/improvements that could be addressed.
  • Design decisions made early on often assume “the way that data is,” which isn’t always the case, so it can take a really long time to re-architect and change things for the better.
  • Sophie: NCAR is building a new search and discovery system, and we chose the technical system based on the features/capabilities that we need. Luckily, the team knows about the importance of usability, and we are trying to look at usability at every stage of the development. The development team is really receptive to thinking about usability and is able to prioritize the development of certain elements from a usability point of view, not just based on technical capabilities. Hopefully, at the end of it, we will be able to minimize the usability issues down the road.


5. Other Items:

  • If applicable, the Usability Cluster can assist with reviewing the test script during our December meeting (12/7).