Community-based Data Validation Practices in Citizen Science

In a study of the dynamics of citizen science data curation, researchers Andrea Wiggins and Yurong He identified several interesting trends affecting the perceived quality of observational data shared on the iNaturalist social network. The paper was nominated for an Honorable Mention and appears in the proceedings of the ACM CSCW 2016 conference. The study was motivated by the huge volumes of data now being produced in citizen science, which often require some form of validation prior to use. Since professional researchers cannot keep up with quality management at this scale, Wiggins and He evaluated how well citizen science participants are able to curate the data they contribute. The researchers based their findings on observations from an educator workshop and publicly accessible data from iNaturalist, using mixed methods to ground statistical analyses in actual practice. They extended a theoretical framework focused on information assessability, linking information provenance with reliability, or the ability to evaluate changes to a record, and connecting information stewardship with informativity, or the amount of detail in a record.

While the researchers were not surprised to find that data about birds drew more attention than observations of other species, other findings were more unexpected. For example, observations for which the contributors asked for help with species identification were actually less likely to receive assistance. A more substantial concern was that observations uploaded from mobile phones were also less likely to be reviewed, suggesting that mobile devices may have some drawbacks for supporting this type of participation. Read the preprint of the article...

Privacy policy gaps in citizen science and participatory research projects

Researchers Anne Bowser and Andrea Wiggins surveyed 30 North American participatory research projects (volunteer-driven citizen science and participatory sensing efforts) to understand how well the projects handle privacy in their design and implementation. The paper appears in the August 2015 issue of the Human Computation Journal. The study used publicly available policies from the project websites and found that while most projects have policies and technical designs intended to protect users, their completeness varied, as did where and how the information was presented. Assessing project policies in greater depth uncovered gaps and discrepancies between intent and implementation. These observations resulted in a set of Ethical Guidelines for policy and technology design for such projects. Data sharing is considered essential to the operation of these projects, but it is important to ensure that users are presented with clear statements and prompts at appropriate points in the participation process. The researchers recommended designing policies and tools to support protections for those involved in the research at any level, as well as compliance with legal requirements for such protections. The paper contributes a new framework for ethical engagement of volunteers in participatory research, synthesized from the principles of the Belmont Report and guidelines for privacy protections in participatory sensing, with recommendations around Ethical Engagement, Ongoing Assessment, Informed Participation, Evolving Consent, Participant Benefit, Meaningful Choice, and Evolving Choice.

Citizen science on the Diane Rehm Show

Citizen science is clearly an idea whose time has come; I’ll be appearing on the Diane Rehm Show’s Environmental Outlook on May 5 to discuss citizen science. For those tuning in after the fact, the show will be archived online. Obviously, this is an amazing opportunity to share my passion for citizen science with (ulp!) 6-7 million listeners across the nation. I’ll be in the studio with Sharman Apt Russell, author of “Diary of a Citizen Scientist”, with call-in discussion from David Bonter of the Cornell Lab of Ornithology and Liz MacDonald of NASA and Aurorasaurus. I actually know David and Liz quite well; the citizen science world is small! Sharman and I also crossed paths at a conference a couple of years back, and I’m devouring her delightfully articulate book at the moment. It reminds me so much of my own dissertation fieldwork, studiously learning everything I could to identify birds and coming to understand the things I studied in far more detail than I had previously imagined possible. I’m looking forward to the discussion with these fantastic panelists, and excited about the chance to talk about citizen science’s recent growth, achievements, and...

Speaking at DC Science Café on May 5

On May 5, I’ll be speaking at the DC Science Café on “the power of citizen science”, appearing on a panel with author Sharman Apt Russell and Aurorasaurus project director Liz MacDonald. It’s an exciting opportunity to talk about my research with the public! The challenge, of course, is that my usual audiences are 1) other academics, 2) grad students, and 3) public sector staff with substantial backgrounds in science. In other words, not the general public! There’s so much I could say about citizen science that it’s hard to know where to start. Further, it sounds like slides aren’t the usual choice, and I can’t really recall the last time I gave a talk without slides. So this is an interesting challenge, especially given that it’s the end of the semester and I’ve got a lot of other big events on the calendar for that week! The event runs 6:30-8:30 PM at Busboys & Poets at K and 5th in DC. I’m told that audience members should arrive early to get a seat, since the venue’s capacity is 150. The DC Science Café is a free monthly event organized by the DC Science Writers Association. I’ve heard of these events before, as a form of public engagement in science communication, but I’ve never attended one, and I certainly didn’t expect to be on stage for my first science café experience. It should be fun; come join the...

Citizen Science Evaluation & Planning

In collaboration with the Smithsonian Environmental Research Center (SERC), we are testing evaluation tools designed to establish contextually appropriate ways of evaluating scientific productivity. Because goals and practices vary so widely, measuring science outcomes in citizen science projects requires a holistic approach, so we are developing evaluation and planning procedures suited to application across a variety of contexts. At SERC, we can work with a diverse range of citizen science projects to improve the robustness and generalizability of a science outcomes inventory and planning toolkit that will also be useful to the broader citizen science community.

Citizen science can be considered both a methodology and a phenomenon; in this study, we address its methodological characteristics through the contextualized evaluation and planning process, and its character as a phenomenon by examining these projects’ evolutionary patterns and the impact of key decisions on project development. The work addresses these research questions: What are the typical stages of project development and longitudinal patterns of project evolution in citizen science projects? What events or conditions influence project management decisions in citizen science projects? How does structured evaluation of project outputs support project evolution and decision making? The products of this research will include improved citizen science project management and evaluation processes. We also anticipate new insights into project dynamics and resource requirements that can be used to establish reasonable, evidence-based resource allocations and performance expectations for local-to-regional field-based citizen science projects, supporting more effective project management and improved...