Human Factors to Requirements

Each feature or requirement below is listed with the barrier it addresses, the enticement it offers, and notes or examples.
Support private workspace
  Barrier: Fear of being scooped; reluctance to make incomplete results public.
  Note: Might also be able to do this with a developer or test site.
Easy-as-pie sharing
  Barrier: Difficulty in using the tool.
Support private groups
  Barrier: Fear of being scooped; reluctance to make incomplete results public.
Support team collaboration
  Note: It helps for a project to adopt this as its default way of working together.
"My Contributions" NIH syndrome Ownership Bias Also a reputation builder. See View Contributions / sort by number of contributions
Follow a contributor
  Enticement: Learn from experts; social proof.
  Note: Cf. academia.edu. Users may also want to follow projects, keywords, etc.
"Killer app" Status Quo bias Any ideas for the killer app that would serve to entice the users to modify how they work now? Automatic Proposal Generator? Hmmm...include RFCs, link work to RFCs; data mgmt plan tool; use AS ANSWER to data mgmt reqt. (Note: leverage distributed nature).  Also:  All-purpose Visualization Tool
"Official" contributions
Authority e.g., YouTube "Official" videos. Use for datasets, esp. Also used in Twitter for people--might also be used as data provider, maybe for researchers.
View contributors / sort by number of contributions
  Enticement: Reputation.
  Note: Probably need to factor in reuse by others (see "horizontal" provenance) or some other ranking. Do we need a "like"/Reddit-style button, or a measure of perceived importance? Some of these work better for large populations. Could also link to published articles and/or a metric of those articles, say a lightweight "citation" (or reuse) metric; a sketch of such a metric appears after this table.
Prepopulation with content
  Enticement: Positive mimicry; network effect.
  Note: Solicit content from ESIP partners and data providers. Providers could get feedback on their data products and tools, especially from unintended users; showing more usage of their data is an added incentive. How to get researcher content prepopulated? Perhaps replicate results from the literature, which might also be good for educational material. But what if a replication gets a different result? One would probably want to discuss it with the author, so we need point-to-point sharing: the ability to share results with people outside the collaboratory.
Use-time design
  Barrier: Rigidity (prescribed functionality).
  Note: See work by G. Fischer. Possible examples: Evernote and its kin, spreadsheets, HyperCard...
"Expected Behavior" / "Netiquette" Fear of violating unwritten norms; Fear of being scooped ??? Any codes we could model? (e.g. Slashdot, Quora, myexperiment.org, wikipedia). Also, how to enforce? Do we need an "I accept" agreement? Maybe have contributors set their expectations?  See Creative Commons for this. cf. IPY Norms of expected behavior
Focus the UI on the content, not the software
  Enticement: Ownership bias.
  Note: See ESIP Commons for an example; citations also boost the importance of the content (a permalink would be the minimum).
"Horizontal" provenance Reputation; Ownership Bias That is, the system, or contributors, noting that their work is derived from other work, e.g., cloning workflows (e.g., Yahoo Pipes)
Quick-comment button (Like, Reddit, etc.)
  Enticement: Reputation.
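
To make the "horizontal" provenance idea concrete, here is a minimal sketch in Python of a derivation record and a lineage query. Everything here (the Contribution record, the derived_from field, the lineage function) is a hypothetical illustration, not an existing ESIP interface.

  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class Contribution:
      """Hypothetical record of a shared item (workflow, dataset, result)."""
      id: str
      contributor: str
      derived_from: Optional[str] = None  # id of the item this one was cloned from

  def lineage(item_id, by_id):
      """Walk the derivation chain from an item back to its original source."""
      chain, seen = [], set()
      current = by_id.get(item_id)
      while current is not None and current.id not in seen:
          seen.add(current.id)  # guard against accidental cycles
          chain.append(current)
          current = by_id.get(current.derived_from)
      return chain

  # Example: carol cloned bob's workflow, which was derived from alice's.
  items = {c.id: c for c in [
      Contribution("w1", "alice"),
      Contribution("w2", "bob", derived_from="w1"),
      Contribution("w3", "carol", derived_from="w2"),
  ]}
  for c in lineage("w3", items):
      print(c.id, c.contributor)  # w3 carol, then w2 bob, then w1 alice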
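
Similarly, the lightweight reuse metric mentioned under "View contributors / sort by number of contributions" could be computed from such derivation records. A minimal sketch, assuming each record is a dict with hypothetical keys id, contributor, and derived_from; a contributor is credited each time someone else derives work from one of their items:

  from collections import Counter

  def rank_by_reuse(records):
      """Rank contributors by how often others derive work from their items.

      records: dicts with hypothetical keys 'id', 'contributor', and
      optionally 'derived_from' (the id of the parent item).
      """
      owner = {r["id"]: r["contributor"] for r in records}
      reuse = Counter({r["contributor"]: 0 for r in records})
      for r in records:
          parent = r.get("derived_from")
          if parent in owner and owner[parent] != r["contributor"]:
              reuse[owner[parent]] += 1  # credit the original contributor
      return reuse.most_common()

  records = [
      {"id": "w1", "contributor": "alice"},
      {"id": "w2", "contributor": "bob", "derived_from": "w1"},
      {"id": "w3", "contributor": "carol", "derived_from": "w1"},
  ]
  print(rank_by_reuse(records))  # [('alice', 2), ('bob', 0), ('carol', 0)]

Self-derivation is deliberately excluded so that cloning one's own work does not inflate the ranking.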