Question Nine


Enter your discussion report-out below:

9. “Crowdsourcing” volunteer data input from citizens can extend/improve/correct data records. How can agencies best gather these data? How do agencies resolve quality issues and other concerns when using crowdsourced data?


Crowdsourcing requires some seeding for the crowd to emerge, since the initial crowd may be small. Crowdsourcing can provide better coverage in remote areas and of discontinuous events. There is an opportunity, and a need, to train some volunteers for quality control, for example by using key crowd members as editors and monitors. Human moderation is needed to reduce spam and eliminate inappropriate content.
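
As a rough illustration of that moderation path, here is a minimal Python sketch. Everything in it is invented for the example: the names, the review queue, and the keyword spam filter; an agency system would attach these steps to its actual submission pipeline.

 # Hypothetical sketch of the workflow above: contributions are
 # spam-filtered, then either accepted directly (trained volunteer
 # editors) or queued for review by crowd monitors.
 from dataclasses import dataclass, field
 from typing import List, Set
 
 SPAM_MARKERS = {"http://", "buy now", "free offer"}  # assumed heuristic
 
 @dataclass
 class Contribution:
     author: str
     text: str
 
 @dataclass
 class DataRecord:
     accepted: List[Contribution] = field(default_factory=list)
     review_queue: List[Contribution] = field(default_factory=list)
 
     def submit(self, c: Contribution, trusted_editors: Set[str]) -> str:
         # Human moderation approximated here by a keyword filter.
         if any(m in c.text.lower() for m in SPAM_MARKERS):
             return "rejected as spam"
         # Trained crowd editors bypass the queue; others wait for review.
         if c.author in trusted_editors:
             self.accepted.append(c)
             return "accepted"
         self.review_queue.append(c)
         return "queued for volunteer review"
 
     def review(self, approve: bool) -> None:
         # A trained crowd member acts as editor/monitor on queued items.
         c = self.review_queue.pop(0)
         if approve:
             self.accepted.append(c)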

There has been little budgetary support for testing this new data-collection technique.

"Crowdsourcing" requires considerable organizational efforts and discipline on the part of the agency.

To do this right, agencies should take a close look at Web 2.0 crowdsourced projects such as Wikimedia and the Smithsonian's "Encyclopedia of Life". The crowd really consists of many levels of users, some more active than others, so agency peer review of the content can be supplemented by editing from super-users.
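
To make the layered-crowd idea concrete, here is a small Python sketch in the same illustrative spirit; the promotion threshold and role names are assumptions for the example, not anything the agencies or the projects named above actually use.

 # Illustrative sketch: contributors are promoted to super-user after
 # enough approved edits, and super-user sign-off supplements (rather
 # than replaces) formal agency peer review.
 from collections import Counter
 
 PROMOTION_THRESHOLD = 25  # assumed number of approved edits
 
 class CrowdRoster:
     def __init__(self) -> None:
         self.approved_edits: Counter = Counter()
         self.super_users: set = set()
 
     def record_approval(self, author: str) -> None:
         self.approved_edits[author] += 1
         if self.approved_edits[author] >= PROMOTION_THRESHOLD:
             self.super_users.add(author)
 
     def needs_full_agency_review(self, author: str) -> bool:
         # Super-user edits go live pending spot checks; all other
         # edits go through full agency peer review first.
         return author not in self.super_users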