Citizen Science Design: Snowpack

This project built on our earlier work in mercury investigations in a number of ways. Once again, Dr. Sarah Nelson served as the lead scientist on the project. This time, she also served as principal investigator. (I had served as PI on the earlier projects.) Her early work with mercury in ecosystems had focused on mercury deposition in the winter months. She wanted to return to that work, but recognized that data about the depth of the snowpack over the course of the winter are collected in only a few locations. She also recognized that, as climate change accelerates in the Northeast, a more fine-grained picture of snowpack data would be useful. Could students and teachers help with that? Other connections to previous work included:

  • Movement from expensive data collection (the cost of laboratory analysis of THg in macroinvertebrates) to less expensive collection (measurements of snowpack depth and mass and of new snow amounts).
  • Movement from research-driven questions (where scientists’ interest in the data that students can collect may be relatively short-term) to environmental monitoring, where interest tends to be more persistent.
  • Making use of the geographic distribution of schools.
  • Using the web to enable schools to share data both to see that they are part of something bigger and to explore questions about how things differ from place to place.

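The shift to depth and mass measurements rests on a standard relationship: the mass of a snow core divided by the cross-sectional area of the sampling tube gives the snow water equivalent (SWE), and SWE divided by depth gives snow density. A minimal sketch, in which the tube area, field names, and example values are illustrative assumptions rather than the project's actual protocol:

```python
# Hypothetical sketch of turning a student snow-core measurement into
# snow water equivalent (SWE). Tube area and example values are
# assumptions for illustration, not the project's protocol.

WATER_DENSITY_G_PER_CM3 = 1.0


def swe_cm(core_mass_g: float, tube_area_cm2: float) -> float:
    """Depth of water (cm) the snow core would melt down to."""
    return core_mass_g / (tube_area_cm2 * WATER_DENSITY_G_PER_CM3)


def snow_density_fraction(swe: float, depth_cm: float) -> float:
    """Snowpack density as a fraction of water density."""
    return swe / depth_cm


# Example: a 250 g core from a tube with a 30 cm^2 opening, 45 cm deep snow
s = swe_cm(250.0, 30.0)             # about 8.3 cm of water
d = snow_density_fraction(s, 45.0)  # about 0.19, plausible for settled snow
```

Because the arithmetic is simple, the hard part in practice is not the calculation but obtaining accurate mass and depth readings in the field, which is exactly where the data-quality issues discussed below arose.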

Some of our conjectures are implicit in the bulleted items above. Specifically, we expected that by moving toward a less expensive means of collecting data and toward a question for which there would be ongoing interest in the data, we could create a more sustainable citizen science program in schools. In addition, we hoped that:

  • By focusing on changes in snowpack, which is connected to so many other systems (stream flow, soil and water chemistry, fish migration, recreation, drinking water, flooding, and so on), we would be collecting data and supporting field experiences that could be used as part of science teaching in a variety of scientific disciplines.
  • By making teachers aware, from the outset, that the intent was to spread the program to other teachers, and by engaging them in the design of procedures to do that, we sought to create a community of teachers who would sustain the activity at some level over time.
  • By bringing together a group of scientists at the University of Maine who could make use of information about variations in snowpack across the state, we sought to create more awareness and use of the data that the students were collecting and to bring teachers and students into contact with a greater variety of uses of their data. Thinking again about sustainability, we hoped that this engagement of other scientists might help reduce the burden of communication with teachers and students on any single scientist.


Project Design

  • A three-year project in which, each year, we expanded the number of middle and high school teachers involved. The first cohort (people we had worked with before) worked with staff to identify barriers to the use of field inquiry and project-based learning by other teachers.
  • Expanded to include schools and field sites from southern Maine up to Aroostook County.
  • Summer programs focused on protocol training, snowpack background, and pedagogy, plus five or six video conferences over the course of the school year, each involving a presentation from a different scientist.
  • Students were expected to sample snowfall and snowpack over the course of the school year.
  • Teachers used the core activity in different ways within the subject matter they taught.
  • During the final year, a strong focus in meetings with teachers on how to continue the work beyond the end of funding, including a final summer meeting intended to serve as the “launch” of the next program phase, in which teachers would continue to work with each other with some support from non-profit partners (e.g., ongoing website maintenance), but without regular staff support to arrange meetings, keep in touch with everyone, and so on.

What We Learned

Data Quality

  • Teachers encountered substantial difficulty in ensuring and maintaining the quality of the data that students collected
    • Simple inaccuracies in reading rulers, weighing samples
    • Keeping students from walking on what they were trying to measure
    • Readings day after day when it was cold
    • Accommodating school schedules
    • New snow on weekends and holidays
  • Most of the teachers felt that the data were highly useful for student work (many of the teachers used the project to focus on data literacy skills), but had less confidence in their utility to scientists and others who needed to depend on the accuracy of the data.
  • But … teachers did make progress in improving data quality as they learned what students could and could not be depended on to do, and as they developed procedures to manage students’ involvement and commitment to the data collection activities.
  • Many teachers found that they ended up needing to enter the data into spreadsheets and to upload it to the shared repository for a variety of reasons, including students’ lack of familiarity with use of spreadsheet software.
  • The experience encouraged us to think in terms of using automatic, electronic data collection when possible (e.g., HOBO data loggers, but also simply a time-lapse camera that takes a daily picture of snow depth against a ruled measurement stick).
  • The experience also sensitized us to the difference between one-shot or occasional sampling and measurement (not difficult to have students do this well) and measurements that have to continue frequently over time (more difficult).
  • Finally, thinking back to the mercury work, it sharpened our thinking about the difference between having students collect samples (so that data about the samples can be collected in a lab) and making field measurements.
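
Some of the most common errors teachers described (misread rulers, implausible values) are the kind that simple automated checks could catch before data are uploaded to the shared repository. A minimal sketch, where the thresholds and record format are assumptions for illustration, not part of the project's actual procedures:

```python
# Hypothetical sketch of simple quality checks on a series of daily
# snow-depth readings (cm). Thresholds and data format are assumptions,
# not the project's actual upload protocol.

def flag_depth_readings(depths_cm, max_daily_change_cm=30.0):
    """Return (index, reason) pairs for suspect readings."""
    flags = []
    for i, d in enumerate(depths_cm):
        if d is None:
            flags.append((i, "missing"))
        elif d < 0:
            flags.append((i, "negative depth"))
        elif (i > 0 and depths_cm[i - 1] is not None
              and abs(d - depths_cm[i - 1]) > max_daily_change_cm):
            flags.append((i, "implausible day-to-day change"))
    return flags


readings = [12.0, 13.5, -2.0, 14.0, 60.0]
print(flag_depth_readings(readings))
# flags index 2 (negative value) and index 4 (jump of 46 cm in one day)
```

Checks like these would not replace the procedural fixes teachers developed, but they could shift part of the burden of data review from teachers to the upload step itself.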


Scientists’ Time

  • Visiting classrooms, working with students and teachers, answering student emails, and participating in PD require more time than most scientists can spare. Doing this work makes sense only when a scientist is personally interested in pedagogy and in involvement by teachers and students, and, even then, it can be difficult for the scientist to justify spending much time on such matters.
  • We were able to involve a number of scientists in ways that demanded less in the way of time commitment, but our experience was that these scientists were the ones whose continued participation depended most directly on the quality of the data.
  • We came away from the project thinking about how to distinguish between the costs of starting up this kind of activity (high) and the costs of maintaining the program (perhaps with much less direct engagement by scientists) once it is up and running.

Sustaining the Program

  • At the end of the project we did succeed in identifying a core group of teachers who were interested in sustaining the activity.
  • Some of these teachers have continued to collect snowpack data, and a few even continue to upload it to the shared repository, but the teachers have not continued to meet periodically as they had hoped to do.
  • This suggests to us that some level of continued staff support, to keep in touch with teachers and to periodically set up meetings, is essential, confirming Kania and Kramer’s (2011) conjectures about the importance of having a “backbone” organization to support collective work.

Other Outcomes

Our three years of work with the Snowpack project stimulated some thinking about what it was that made “authentic science learning” authentic from the students’ point of view. Since involving scientists directly is difficult and expensive, how much involvement is necessary? Was the idea of doing work that someone else needed the thing that mattered? Or was it more important, or at least sufficient, that the work was clearly useful locally?