Question Nine
From Earth Science Information Partners (ESIP)
Revision as of 15:07, January 7, 2009 by Bcaron
Enter your discussion report-out below:
9. “Crowdsourcing” volunteer data input from citizens can extend/improve/correct data records. How can agencies best gather these data? How do agencies resolve quality issues and other concerns when using crowdsourced data?
Crowdsourcing requires some seeding before a crowd can emerge, since the initial group of contributors may be small. It can provide better coverage of discontinuous events in remote areas. There is an opportunity, and a need, to train some volunteers for quality control, for example by using key crowd members as editors and monitors. Human moderation is also needed to reduce spam and eliminate inappropriate content.
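The review workflow discussed above could be sketched as a simple moderation queue; this is a hypothetical illustration, not an agency system, and all names (`Observation`, `ModerationQueue`, the contributor IDs) are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One crowdsourced record (hypothetical schema)."""
    contributor: str
    value: str
    approved: bool = False

class ModerationQueue:
    """Route submissions through trusted-volunteer review."""

    def __init__(self, trusted_editors):
        self.trusted = set(trusted_editors)  # vetted "key crowd members"
        self.pending = []    # awaiting human moderation
        self.accepted = []   # part of the data record

    def submit(self, obs):
        # Submissions from trusted editors enter the record directly;
        # everything else waits for human review.
        if obs.contributor in self.trusted:
            obs.approved = True
            self.accepted.append(obs)
        else:
            self.pending.append(obs)

    def review(self, editor, obs, ok):
        # Only trusted editors may approve or reject pending items.
        if editor not in self.trusted:
            raise PermissionError("only trusted editors may review")
        self.pending.remove(obs)
        if ok:
            obs.approved = True
            self.accepted.append(obs)
        # Rejected items (spam, inappropriate content) are dropped.

q = ModerationQueue(trusted_editors=["editor_a"])
q.submit(Observation("editor_a", "snow depth 30 cm"))
q.submit(Observation("anon42", "snow depth 999 m"))
q.review("editor_a", q.pending[0], ok=False)
```

In this sketch the trusted-editor list is the "seed" crowd: it bootstraps quality control until enough contributors have a track record to be promoted.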
There has been little budget support for testing this new data-collection technique.