Research to Practice, Practice to Research
Connected Science Learning September–October 2022 (Volume 4, Issue 5)
By Jessica Wardlaw, Ana Benavides-Lahnstein, Lucy Robinson, Julia Lorke, Sasha Pratt-Taweh, Maryam Ghadiri Khanaposhtani, Heidi Ballard, and Victoria Burton
Citizen science and other participatory approaches to science provide opportunities for the public to collaborate with professional scientists and make tangible contributions to authentic scientific research. Alongside their research goals, citizen science programs may also contribute to wildlife conservation (e.g., Ballard et al. 2017; Chandler et al. 2017), policy processes (e.g., Göbel et al. 2019), and science learning (e.g., Pitt and Schultz 2018). Guidance on measuring learning outcomes in citizen science exists (Phillips et al. 2018), and some studies have examined learning outcomes for adults (Kloetzer et al. 2021; Peter et al. 2021a, 2021b). However, there is an emerging but still limited understanding of what and how young people learn through citizen science and how project design can influence these outcomes (Druschke and Seltzer 2012; Roche et al. 2020). A recent examination of young people's participation patterns in citizen science identified a drop-off between data collection and data submission to the research project (Lorke et al. 2021); yet this data sharing/submission step has been shown to support young people's development of Environmental Science Agency (ESA) learning outcomes (Ballard, Dixon, and Harris 2017). The ESA framework (Table 1) encompasses a holistic learning process that citizen science may foster, as observed in formal education settings (Bird et al. 2020; Harris and Ballard 2020).
This study investigates the environmental science learning outcomes of young participants (10–15 years old) in the Big Seaweed Search (BSS) citizen science program during a period of program redesign. Using a case study design, we examine the impacts of the redesign on learning outcomes for young people and identify which design features afford or constrain that learning.
The Big Seaweed Search (BSS) is a UK-based citizen science project run in partnership between the Natural History Museum London (the Museum) and the Marine Conservation Society (MCS), a UK-wide charity. It invites members of the public to search a 5 m transect of the shore, photographing, identifying, and recording the presence/absence and abundance of 14 seaweeds that are indicators of environmental change. Participants enter their data using an online form. Seaweed researchers at the Museum verify the data and use them for climate change research. The program also supports participants’ science learning and engagement and the development of marine stewardship activities. There are three typical instructional strategies to support participation: (1) self-directed individual or family, (2) self-directed community group, and (3) MCS-led community group. We examine this third model in this article.
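To make the recording protocol concrete, the sketch below shows one way a single shore survey could be represented as a structured record before submission via the online form. It is a minimal illustration in Python; the field names, abundance categories, and example species are our assumptions for illustration, not the actual BSS recording form or database schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical abundance categories; the real BSS recording form may differ.
ABUNDANCE_LEVELS = ("absent", "present", "abundant")

@dataclass
class SeaweedObservation:
    """One of the 14 indicator seaweeds recorded along the 5 m transect."""
    species_name: str                     # illustrative species names below
    abundance: str                        # one of ABUNDANCE_LEVELS
    photo_filename: Optional[str] = None  # photo used for later verification

@dataclass
class BigSeaweedSearchRecord:
    """A single shore survey as it might be submitted through the online form."""
    beach_name: str
    survey_date: date
    transect_length_m: float = 5.0        # BSS uses a 5 m transect of the shore
    observations: List[SeaweedObservation] = field(default_factory=list)
    verified: bool = False                # set once researchers verify the photos

# Example: a family group records two indicator species on one survey.
record = BigSeaweedSearchRecord(
    beach_name="Example Bay",
    survey_date=date(2020, 8, 1),
    observations=[
        SeaweedObservation("Sargassum muticum (wireweed)", "abundant", "IMG_0001.jpg"),
        SeaweedObservation("Fucus serratus (serrated wrack)", "absent"),
    ],
)
print(len(record.observations), "species recorded")
```

Keeping presence/absence and abundance as simple categorical fields mirrors the low-barrier recording task described above, while the photo field supports the verification step performed by the Museum's seaweed researchers.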
An international learning research project, led by this study's co-authors, studied BSS participants and informed an evidence-led redesign of BSS to enhance learning outcomes for young participants (Table 1). After the majority of the redesign had been planned, the COVID-19 pandemic began, requiring additional changes to comply with social distancing restrictions. These two drivers of change occurred sequentially during the planning process, but the resulting changes were implemented together in summer 2020.
BSS was one of eight citizen science programs studied as part of an international research project (2017–2022) that investigated the science learning processes and outcomes for young people who participated in museum-led citizen science. Design-based research (DBR) is a collaborative approach used in the learning sciences to iteratively introduce design changes and study their impacts to develop more effective learning environments. Drawing upon Sandoval (2014), Bakker (2018), Penuel et al. (2011), and Fishman et al. (2013), the project team conducted a DBR process to create more opportunities for youth learning and foster the development of ESA in each citizen science context.
The research indicated that youth participants in BSS in 2018 and 2019 lacked a clear understanding of (1) the scientific purpose of BSS and (2) the onward processing and research application of the data they collected, and therefore their personal role and stake in the research. The DBR process addressed these two areas and led to the development of design changes that reframed the activity with a focus on science and enhanced opportunities for young people to submit data to the science research team. Reframing was achieved through refocusing the program’s introduction toward: (1) the authentic scientific research context, (2) the authenticity of the biological field research methods young people would use, (3) the project’s individual lead scientist, and (4) the contribution young people’s data would make to the wider scientific research. This was reinforced by the creation of an introductory video (https://youtu.be/SdHqF38dGtY), which featured the lead researcher and a previous youth participant reiterating these messages. Framing the scientific impact of youth contributions on a global scale and/or understanding of local ecosystems supports youth to develop ESA by motivating youth with broad interests and allowing them to draw on their lived experiences and existing knowledge of a place (Harris and Ballard 2018; Phillips et al. 2019). A final wrap-up session was added to the program, at which facilitators would share results and answer questions.
In common with other learning settings, the team was forced to adapt the delivery of BSS in response to social distancing requirements during the COVID-19 pandemic. Pre-pandemic, MCS staff would have led an in-person introduction and training session before youth conducted beach surveys guided by their youth leaders. The introductory session was adapted for online delivery (facilitated through Zoom), was attended by staff from the Museum and MCS as well as the youths' parents, and included the framing video described above. Beach surveys were undertaken in family groups rather than as a youth group. In a post-participation online wrap-up session, facilitators from the Museum and MCS presented the data the youth had collected, described the photo verification process, and answered questions.
Filming of the framing video was also impacted; for example, the lead researcher and youth participant each recorded their segments separately, which were later edited together in a Q&A style. The youth participant segment was filmed by their parent.
We employed a single case study design (Yin 2018) to investigate the learning outcomes of two youth groups recruited to take part in the BSS. We used pre-post surveys, interviews, and in-the-moment observations of focal youth to explore the impact of the program on science learning during two delivery periods (2019 and 2020), between which the program's design and delivery were altered. The program (case) had two embedded units of analysis: (1) the 2019 cohort, and (2) the 2020 cohort (Table 2). Groups were recruited via direct email and telephone conversations, which included apprising youth leaders of the broad aims of the BSS and sharing the instruction booklet.
In 2019 all participating youth completed a pre-survey prior to the introductory session to capture their prior knowledge of science, citizen science experience, and their understanding of the purpose of BSS. At this stage they would only understand the purpose if their group leader had introduced it. Post-participation surveys and semi-structured interviews with selected focal youth (FY) were administered after the final beach survey to examine participants’ experience, participation, and learning.
In 2019 learning researchers conducted in-the-moment observations of nine FY conducting beach surveys to capture information about the activities and the ways that youth participated. FY were selected with the group leader to represent the diversity of participants with respect to age, race and ethnicity, and prior interest in science. Researchers then tracked one FY for up to 20 minutes, capturing what the participant was doing and any relevant or important conversations with others. When it would not disrupt the activity, researchers asked FY questions such as, “What are you doing?” and “Why are you doing that?”
In 2020 minor adaptations were made to the 2019 pre- and post-participation survey and interview questions to ensure they remained relevant following the changes to the program. Observations were not possible in 2020 due to COVID-19 restrictions.
Table 3 summarizes the data collected for each cohort. This paper focuses on youth who attended an introductory session and completed both pre- and post-participation surveys (n = 28 for 2019 and n = 7 for 2020, except where youth skipped survey questions). These included any youth whose parents had consented for them to be interviewed. To accommodate the difference in sample sizes, the analysis included triangulation across the different data sources.
For analysis, the interview transcriptions and observation files were uploaded to qualitative data analysis software (Dedoose, Version 8.0.45), and survey responses were entered into Excel spreadsheets. All data sets (Table 3) were coded following the principles of Braun and Clarke (2006) for familiarization with and thematic analysis of qualitative data. The codebook included both a priori codes (based on the three ESA dimensions of learning) and new codes derived from inductive analysis. Following the inter-rater agreement exercises suggested by Campbell (2013), the analysts conducted a series of coding exercises to align their interpretations of the codebook, after which they coded data independently. The focus of the coding analysis was to identify the development of ESA, paying special attention to FY’s understanding of the scientific purpose of the program and how the shift to virtual delivery may have influenced ESA learning outcomes. To further portray how FY participated and developed science learning outcomes, we created a vignette (see Supplemental Resources) for one FY from each case (Ely et al. 1997). These are based on the coded data sets (Table 3) and notes taken from the original data sources.
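The study does not report a specific agreement statistic for the inter-rater exercises, so as a hedged illustration only, the Python sketch below computes Cohen's kappa (a common choice for two coders, and an assumption on our part) for a single code applied to the same set of excerpts; the labels are invented for the example.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same excerpts."""
    assert len(coder_a) == len(coder_b) and coder_a, "need paired labels"
    n = len(coder_a)
    # Observed agreement: proportion of excerpts given identical labels.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label proportions.
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in set(coder_a) | set(coder_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical example: did each analyst apply the a priori code
# "understands scientific purpose" to ten interview excerpts?
coder_1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
coder_2 = ["yes", "no",  "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # kappa = 0.80
```

With these invented labels, observed agreement is 0.90 and kappa is 0.80, a level often read as substantial agreement; in practice, disagreements at this stage would feed back into refining the codebook before independent coding.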
In 2019, pre- and post-participation survey free-text responses to the question about the purpose of BSS were grouped into the following broad themes: raise awareness and educate, learn about seaweed, monitor seaweed, and do the activity itself (Figure 1). Pre-participation responses to the same question from youth in 2020 (Figure 2) overlapped considerably but also included themes of understanding climate change and other environmental issues. Pre-participation, the 2020 cohort had a broader understanding of the potential impact of the research on a real-life problem, while the 2019 cohort emphasized the educational goals. The end-to-end consistency of messaging in 2020 was achieved through group leaders effectively communicating the research purpose before the BSS team delivered the introductory session (see vignettes in Supplemental Resources). Post-participation, however, both cohorts focused more on the scientific activity and understood the purpose was monitoring seaweed. Notably, the 2020 cohort articulated why they were monitoring seaweed more than the 2019 cohort did in their free-text responses (e.g., “in order to conserve species of animals and fish”).
Two post-participation survey questions asked youth to describe what happened to the seaweed information they collected: “What do you think happens to your photos after you upload them?” (Figure 3) and “How do you think that information will be used?” (Figure 4). Most survey respondents from the 2019 cohort were not aware of how their data would be used, and those who were aware did not articulate their ideas clearly or had only vague notions of what happened to the data. In comparison, responses from the 2020 cohort demonstrated an understanding that the data they uploaded would be used for monitoring seaweed and supporting research (Figure 4).
Post-participation surveys asked, “Of all the things you did during the Big Seaweed Search, did any of the activities feel like you were ‘doing science’?” In 2019 most youth answered “no,” citing reasons including that they were “just looking at seaweed” or “just counting seaweed” and that “it was fun.” This suggests they did not perceive these as valid or authentic scientific activities. The rest of this cohort reported that they did feel like they were doing science and, to explain this, used verbs that indicated they understood why they were looking for seaweed (e.g., identifying, documenting, recording different seaweeds). By contrast, the majority of the 2020 cohort answered “yes” to the same question (Figure 5).
In post-participation surveys both cohorts were also asked if they felt like they had made a real contribution to research or monitoring. The 2019 cohort did not wholly agree, with many responding neutrally; in contrast, the 2020 cohort agreed more strongly that their contribution felt real (Figure 6).
As well as feeling more involved in science and broadening their perception of what science is, some youth reported more enjoyment, confidence, and competence in science at school post-participation in BSS. Youths’ ratings of their confidence in science at school before and after their participation in BSS increased, an aspect that was explored further in interviews with the question “Do you think it’s helped you with your confidence about doing science inside school or outside of school?” (all names are pseudonyms):
Erin (2019 cohort): “It has. Because ever since we started, without realising, I just started really liking it. It’s one of my favourite subjects now although before I didn’t really enjoy it…at school we spend more time indoors rather than outdoors when it comes to science.”
Isla (2019 cohort): “I’m not really that good at science in school anyway, so I just like to be actually a part of helping people…It’s made me feel more involved with science.”
Ernest (2020 cohort): “I wasn’t really into science before the Big Seaweed Search and now I’m really into it. I enjoy my science lessons at school now…It was just some fun, but it was actually helpful at the same time so I think it was a really good combination.”
To summarize our findings and present them in context, the document in Supplemental Resources represents the experiences of one Focal Youth in 2019 and 2020 as vignettes.
This section reflects on what this study found in terms of (1) the impact of the program design on learning outcomes, (2) connections between informal science activities and formal science education, and (3) the scope for online delivery to scale up citizen science programs.
This case study demonstrates that intentional redesign of a program can enhance youth learning outcomes. Despite the core activities (beach surveys) and resources (identification guide and recording form) for both cohorts being very similar, our findings demonstrate a distinct difference between the two cohorts in young people’s understanding of the scientific purpose of their activities, and appreciation of the research context and onward path of their data. Although the 2019 cohort had an in-person introductory session with representatives from the MCS and the Museum and were able to identify different seaweeds, they had a poorer understanding of the research purpose of the program. This finding would suggest that the content, and not the format (in person or online), determines such learning outcomes.
Three design features potentially afforded this outcome: (1) the scientific framing (by staff and the video) in 2020, (2) the “community,” or division of labor, within which the activity took place (family groups rather than leader-managed youth groups), and (3) the increased opportunities for youth to actively participate in data submission in 2020, with all youth reporting that they did this with support.
As well as demonstrating an understanding of why seaweed research is important and the wider context of the research they were contributing to, the 2020 cohort also reported feeling like they “made a real contribution to research or monitoring” more than the 2019 cohort. While this research did not explore the concept of scientific contribution, post-participation survey responses from the 2019 cohort indicated that few understood that the monitoring of seaweeds was science. Participants had perceptions of their biological monitoring/recording activity as both “just looking at seaweeds,” which they did not consider to be science, and “identifying seaweed,” which was one of the top reasons cited for feeling like it was science. This implies that practitioners must frame citizen science activities as “science” intentionally because the youth audience may not think of it as science. Youth are likely to have very different ideas of what science is and is not, and what scientists do and do not do, which program messaging should address from the outset.
The 2020 implementation (with beach surveys conducted as family groups) saw youth taking on, or at least seeing firsthand, a wider variety of roles and tasks within the research process (choosing a beach, selecting a transect and marking it out, identifying, photographing, and uploading data afterward). In the 2019 cohort, much of this was handled “behind closed doors” by group leaders in advance or afterward, so the young participants experienced a narrower range of roles and tasks, limiting their exposure to the whole research process and their opportunity to develop skills and expertise in different roles.
The 2019 cohort had comparatively poor access to mobile phones, which constrained their opportunity to photograph seaweeds and submit their data at the final stage of the program. This likely affected the youths’ understanding of the project’s purpose and the onward path of their data; our findings therefore illustrate the importance of this data submission step for participants’ sense of the project’s purpose and authenticity.
Our findings support the notion, however, that participation does not automatically lead to learning (Bonney et al. 2014); our results show that participating in citizen science is not as impactful if participants do not perceive or understand that they are taking part in authentic science research. Practitioners can deliberately design for learning outcomes by developing learning supports, in conjunction with routine evaluation using tools like the DEVISE scale (Phillips et al. 2014), to maximize learning outcomes for participants.
All three design features are consistent with the National Academies of Sciences, Engineering, and Medicine’s recommendations (2018), in which “encouraging social interaction,” “supporting multiple types of participant engagement,” and “building learning supports into projects” are all cited as best practices for enhancing opportunities for learning through citizen science. Further work is needed to examine if and how these aspects of program facilitation are connected to learning outcomes, because in this case those facilitation choices were made for pragmatic program delivery reasons.
Participation in scientific research nurtures the development of scientific inquiry and scientific habits of mind among young people and provides opportunities to integrate content knowledge with inquiry-based learning. Such learning outcomes are included in both US and UK school curricula, and Bonney et al. (2014) highlight the potential of citizen science to support this. Our research identified the benefits of combining larger group learning (whole-group introductory and wrap-up sessions where youth learn and collaborate with their peers) with small group settings where youth deepen their inquiry learning and scientific habits of mind by taking on greater responsibility for fieldwork planning, data collection, and upload (in this case, in a family setting). While dependent upon the support of family or out-of-school carers, this model potentially offers an avenue for combining classroom activities with homework or other out-of-school tasks. These tasks could be designed for small groups, in which youth can take more of a leadership role and develop their identities and expertise in different roles within a scientific context. It also demonstrates the benefits of framing scientific activities (in this case fieldwork, but equally relevant to experiments in the classroom) within their real-world context, connecting young people to authentic science experiences and to the wider applications and implications of their study. This connection of classroom concepts to real-world contexts has the potential to strengthen the outcomes of both.
The increases in confidence, enjoyment, and sense of involvement in school science seen in this study are subtle but provide evidence of the potential for growing this impact through repeated exposure to similar projects and activities. Likewise, participating in real-world scientific activities broadened young people’s understanding of what science as a field encompasses—beyond the topics covered in the classroom—and challenged stereotypes about what professional scientists “do” as part of their job. It is especially beneficial for young people moving through education to develop an awareness of broader career options and understand that members of the public can participate in science research.
Our research identified a number of benefits of online training delivery (e.g., youth being able to virtually “meet” more scientists and other staff), and while there were some disadvantages (e.g., a decreased ability to deliver “hands-on” aspects of training such as touching seaweeds), on balance the content and framing of the activity had a far greater impact on young people’s learning outcomes than the mode of delivery (online or in person). This successful proof of concept creates an opportunity to scale up citizen science projects—an aspiration of many projects that is seldom realized due to limited staff time and the travel costs of delivering in-person training. Exploring online training delivery for young people, with a careful focus on ensuring scientific framing and exposure to different roles within science, provides exciting possibilities for the citizen science practitioner community to significantly scale up participation (in both numbers and geographic scope) and direct resources toward traditionally underserved audiences.
This article reported a case study of a UK-based marine citizen science project as it navigated the dual aims of adapting the program (1) for youth environmental science learning and (2) in response to the COVID-19 pandemic. We found that scientific messaging within a citizen science program is crucial, regardless of how the program is delivered. There are also multiple ways in which citizen science programs can provide learning opportunities for young people beyond the science content, including opportunities to support the development of science identity and agency; these, however, can only be achieved to their full potential through a clear focus on learning in the design and delivery of citizen science programs. In contrasting the two implementations, we have learned lessons, from which citizen science practitioners can benefit, about how the design of the program both constrained and afforded learning.
The pandemic both limited and enabled our investigation of learning outcomes, but this case study presents a clear direction for future research and practice.
This material is based upon work supported under a collaboration between the National Science Foundation (NSF), Wellcome, and the Economic and Social Research Council (ESRC) via a grant from the NSF (NSF grant no. 1647276) and a grant from Wellcome with ESRC (Wellcome grant no. 206202/Z/17/Z). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the view of NSF, Wellcome, or ESRC.
We obtained parental consent and followed an ethical protocol approved by The Open University Human Research Ethics Committee (reference number: HREC/3003/Herodotou) and by the Institutional Review Board (IRB) at the University of California, Davis (reference number: 624197-13).
The authors would like to thank the participating youth, their group leaders and their families, as well as the LEARN CitSci team members. We would also like to thank the Big Seaweed Search team, especially Prof Juliet Brodie at the Natural History Museum and our collaborators at the Marine Conservation Society who helped facilitate this study.
Jessica Wardlaw is Citizen Science Programme Developer, Ana Benavides-Lahnstein is a Postdoctoral Researcher, Lucy Robinson is Citizen Science Manager, Sasha Pratt-Taweh is Project Coordination Officer, and Victoria Burton is Project Coordinator, all at the Natural History Museum in London. Julia Lorke is a Postdoctoral Researcher at the Natural History Museum in London, formerly at IPN–Leibniz Institute for Science and Mathematics Education. Maryam Ghadiri Khanaposhtani is a Postdoctoral Researcher, and Heidi Ballard is Professor of Environmental Science Education, both at the University of California, Davis.
citation: Wardlaw, J., A. Benavides-Lahnstein, L. Robinson, J. Lorke, S. Pratt-Taweh, M. Ghadiri Khanaposhtani, H. Ballard, and V. Burton. 2022. Citizen science framing and delivery models: Impacts on young people’s environmental science learning. Connected Science Learning 4 (5). https://www.nsta.org/connected-science-learning/connected-science-learning-september-october-2022/citizen-science
Keywords: Citizen Science, Environmental Science, Learning Progression, Research, Informal Education