Research & Teaching
Journal of College Science Teaching—March/April 2022 (Volume 51, Issue 4)
By Gillian M. Puttick, Brian Drayton, and Christina Silva
The quality of undergraduate education has been a matter of concern among policy and research institutions for some time (Association of American Universities [AAU], 2018; National Research Council [NRC], 2012, 2011; Zusman, 1999; Boyer Commission on Educating Undergraduates in the Research University [Boyer Commission], 1998). According to many of the metrics used by institutions of higher education (IHEs)—such as course enrollment, number and completion of STEM majors, and student learning and engagement (National Academies of Sciences, Engineering, and Medicine [NAS], 2018b; Eagan, 2016; Ferrare & Lee, 2014)—STEM education discourages too many undergraduates from remaining in their field of interest and remains predominantly lecture-based (Stains et al., 2018).
Policy documents (e.g., President’s Council of Advisors on Science and Technology [PCAST], 2012; Beichner et al., 2007; Boyer Commission, 1998) draw on the learning sciences to advocate for “evidence-based” methods that will renew undergraduate education. Generally, these include curriculum redesign and pedagogical methods that attend to student engagement and agency. In this article, we examine the education research literature between 1995 and 2017 for insights into innovations being developed in undergraduate STEM education, particularly in the environmental sciences.
It is widely understood that STEM education needs significant transformation at the university level (American Association for the Advancement of Science [AAAS], 2011; Garrison & Vaughan, 2008). Statistics show that 40% of undergraduates change their STEM major within the first 2 years of college (Seymour & Hewitt, 1997; U.S. Department of Education, 2017). This pattern has caused concern (Eagan, 2016; NRC, 2012) and is accompanied by a sense that teaching and learning need to be reclaimed as a central faculty activity (PCAST, 2012; Biggs & Tang, 2011). This may mean structural and institutional change in university practices governing how faculty allocate their effort (Berg & Seeber, 2016); however, a major emphasis of recent recommendations has been to draw on the findings of the learning sciences to effect change.
All the recommended approaches focus on adopting “evidence-based” pedagogies (AAU, 2011; NAS, 2018b). Key research findings include the basic realization that learning is a constructive process (NAS, 2018a) and a socially embedded practice (Waldrop, 2015; Nasir et al., 2006; Popejoy & Asala, 2013; Moore, 1999). Such “active learning” (Handelsman et al., 2007) has been shown to increase students’ persistence in their studies, improve their sense of self-efficacy, and sometimes strengthen their content understanding as well (Sawyer, 2014; Moravec et al., 2010; Glynn & Koballa, 2006; Dawson, 2005).
Active learning and active teaching (Handelsman et al., 2007; Miller et al., 2008; DeHaan, 2005) aim to make education more student centered (Tran et al., 2014; Knight & Wood, 2005), build student engagement, and improve conceptual understanding and sense of efficacy in science practice (NAS, 2015; Corwin et al., 2015; Auchincloss et al., 2014; Freeman et al., 2014; Mulnix, 2013; Haak et al., 2011; Deslauriers et al., 2011; Dancy & Henderson, 2010).
The four pedagogical frameworks described in the following sections have been shown to foster student motivation and understanding in environment-related science in particular.
Studies that emphasize the role of the community (e.g., Woodhouse & Knapp, 2000; Ault, 2008) or sustainability (Gruenewald, 2003; McPhearson & Tidball, 2013) have been found to invoke the sociocultural, political, and economic aspects of science; make the relationship between sustainability and individual behavior explicit (e.g., Schweizer et al., 2013; Mueller-Worster & Abrams, 2005); and help students gain insight about conservation (Ault, 2008).
A growing body of scholarship has examined the social structures and practices of science as they affect nondominant communities (e.g., Snyder et al., 2016; Ferrare & Lee, 2014; Ong et al., 2011). This has led to the development of pedagogies that link the strengths of students from various communities with the practices of science. These pedagogies draw on funds of knowledge, cultures of collaboration, or emphasis on place to build students’ identities as learners and practitioners of science (Barton et al., 2014; Kinzie et al., 2008).
Self-identifying as a science learner or participant can foster self-efficacy. Carlone and Johnson (2007) describe three principal factors that constitute a science identity: (1) competence in understanding science content, (2) performance of relevant scientific practices, and (3) recognition by others in the scientific community. By constructing or enacting self in a meaningful learning setting, a student can build an “identity-in-practice” (Urrieta, 2007; Calabrese Barton & Tan, 2009; Caraballo, 2012).
Furthermore, engagement in meaningful settings can increase science self-efficacy. Self-efficacy (Bandura, 1997) increases persistence, for example, in a single STEM project, in a course, or across the transitions from middle school to high school to college. Persistence in turn mediates the formation of a robust science identity (Ballard et al., 2017; Rahm & Moore, 2016; Bang & Medin, 2010) and can also contribute to students’ career aspirations (Britner & Pajares, 2006).
The sense of place (Clayton & Opotow, 2003; Kudryavtsev et al., 2012; Chawla, 2007) has increasingly been seen as an effective pedagogical framing in the environmental sciences. Place-based education uses the psychology of personal involvement and identity (Clayton & Myers, 2009; Louv, 2005) to activate students’ cognitive and affective attention. By combining affective, cognitive, and social factors in an “experiential apprenticeship” (Pruneau & LaPointe, 2002), place-based education increases students’ engagement with and understanding of the environment (Smith, 2013; Orion, 2007; Koballa & Glynn, 2007; Pruneau et al., 2005).
Scholars have recently begun to recommend explicit local-global links between place-based education and anthropogenic impacts on Earth systems. Sobel (2007), for example, argues for a model of environmental education that gradually increases student responsibility to engage in environmental behavior from kindergarten through high school. Kern et al. (2015) drew on place, a critical cornerstone of a Native sense of identity and culture, when designing a climate change education program among Native communities in the midwestern United States.
Our research question was, “What innovations in curriculum design and pedagogy, which are intended to improve engagement among undergraduate students in environmental sciences, are reported in the research literature?”
To answer this question, our analysis focused on the following specific questions:
We sought English-language studies, published between 1995 and 2017, that focused on teaching or learning in Earth science, environmental science, or climate change. We conducted keyword searches of 19 journals that we judged to be likely publication venues for such studies (see appendix at https://bit.ly/3rmJ75e). Next, a search of the BioOne and EBSCO databases yielded articles from 16 additional journals.
We initially identified 149 articles that we examined for further inclusion based on three criteria:
The first two criteria resulted in the exclusion of 22 articles. The final criterion resulted in the exclusion of 37 articles. Therefore, our final corpus consisted of 90 studies.
Two researchers coded all articles; inter-rater reliability was above 75%. Where coders disagreed, they discussed the differences and reached consensus on a code. Codes were entered into a FileMaker Pro database to facilitate analysis. The search terms used are provided in the appendix (https://bit.ly/3rmJ75e).
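The reliability figure above is a simple percent-agreement statistic: the share of articles to which both coders assigned the same code. A minimal sketch of that check is below; the codes and articles are invented for illustration, not drawn from the study.

```python
# Hypothetical sketch of a percent-agreement check between two coders.
# The code labels and article lists are invented for illustration.

def percent_agreement(codes_a, codes_b):
    """Share of items for which two coders assigned the same code."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

coder1 = ["inquiry", "lecture", "field", "lab", "inquiry"]
coder2 = ["inquiry", "lecture", "field", "inquiry", "inquiry"]

agreement = percent_agreement(coder1, coder2)
print(f"{agreement:.0%}")  # prints 80% (4 of 5 codes match)
```

Percent agreement is the simplest such statistic; chance-corrected measures such as Cohen’s kappa are often preferred when some codes are far more common than others.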
To understand the context of the studies, we looked for information about intervention length, location, and instructional level, as well as student demographics.
We were interested in identifying where IHEs were directing their reform efforts. There were more than twice as many studies focused on the introductory level compared to the advanced level (27 vs. 11; Figure 1). Almost equal attention was paid to courses for majors compared to those for non-majors. Nearly one-third of the studies (28) did not specify the level; we coded these as “undergraduate.” We considered the possibility that reform efforts might differ for large introductory classes compared to advanced courses. However, this analysis was not possible since authors did not consistently describe introductory courses as such. In addition, the sample size was too small to detect any patterns in the data.
More than half of the studies (49) involved a semester-long intervention (Figure 2). Twelve of these focused on the design of a new course, while 37 focused on integrating an innovation or reform into existing courses; of the latter, 19 described the integration of new instructional elements or the implementation of new pedagogies. Of the remaining studies, 13 reported short interventions of one to three class sessions, while 22 were conducted over four or five sessions. Examples of short interventions included debates, stand-alone labs, and short service-learning projects.
Almost two thirds of the studies (57) attempted to move beyond lecture-based instruction (Stains et al., 2018) by adding field, lab, or computer lab activities to classroom-based instruction (Table 1). Fourteen studies involved outdoor field trips, nine involved lab activities, nine involved lab and field activities, and four included the computer lab. Seven studies included visits to other sites, such as museums or campus energy installations. The 19 classroom-based studies often involved homework followed by “active learning” in the classroom, such as a reading assignment followed by small-group class discussion. Two examples were a case study considering range-management trade-offs in response to climate change (Balgopal et al., 2014) and, in a course on Antarctic science and policy, students’ weekly creation of Venn diagrams to “reinforce course content and support independent thinking” (Berkman, 2006, p. 383), followed by weekly in-class discussions.
Only one quarter (25) of the studies reported gender ratio, while four noted that students were from urban areas and one noted that students were from suburban areas. None of the studies reported the presence of students with learning disabilities. Only eight studies (7%) described student racial or ethnic demographics.
When addressing curriculum and pedagogy, we first conjectured that, in response to calls for reform, the corpus would show a shift toward more active learning over time. Therefore, despite the relatively small size of the data set, we divided the studies into 5-year subsets to look for trends in variables such as the types of technology used; the implementation of inquiry-based methods, discussion, or open exploration; or a focus on the process of science. No trends emerged from the analysis.
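The trend check described above amounts to binning studies by publication year into 5-year subsets and counting a coded variable within each bin. A minimal sketch, with invented years and codes rather than the study’s actual data:

```python
from collections import Counter

# Hypothetical (year, used_active_learning) records for illustration only.
studies = [(1996, False), (1999, True), (2003, False), (2004, True),
           (2008, True), (2011, False), (2014, True), (2016, True)]

def five_year_bin(year, start=1995):
    """Map a publication year to the first year of its 5-year subset."""
    return start + 5 * ((year - start) // 5)

# Count active-learning studies per 5-year subset.
counts = Counter(five_year_bin(y) for y, active in studies if active)
print(sorted(counts.items()))
```

With a real corpus, plotting these per-bin counts (or proportions, to control for uneven publication volume) is the quickest way to see whether a trend exists.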
Teaching new concepts was the instructional purpose of the large majority (79) of the studies, and two thirds (67) were also studying a novel teaching approach (Figure 3). Just fewer than two thirds (56) of the studies focused on teaching analytic techniques (e.g., looking for patterns in data or analyzing outputs of quantitative models), while one quarter (23) involved teaching manipulative techniques such as using tools (e.g., probes, GIS) and procedures (e.g., running assays). Sixteen studies explicitly addressed misconceptions. Few studies (14) described their instructional purpose as engaging students in open-ended exploration.
Some authors explicitly described their purpose as teaching about the process of science (30 studies), the nature of science (7 studies), or both (14 studies; see Figure 4).
Overall, 64 of the studies focused on global warming. To teach the basics about causal mechanisms, student activities focused on relationships among factors (e.g., greenhouse gas emissions, the greenhouse effect, Earth’s energy balance) or on differentiating between weather and climate. Thirteen studies drew on historical data, such as identifying evidence of climate change from ice core data (Figure 5).
A subset of these studies also focused on climate change impacts. For example, students engaged with biotic impacts (i.e., the ecological drivers of change) in 20 studies: five focused on changes in phenology, six on migration patterns, and 13 on impacts on biodiversity. A larger number in the subset focused on abiotic impacts, such as polar ice melt (20 studies) and sea level rise (19 studies).
Together with the 20 studies mentioned in the previous section that focused on biotic impacts, an additional 27 studies related to some other aspect of the biosphere. These 47 studies related to two broad areas—the role of the biosphere in relation to carbon cycling, or population studies (Table 2). Five of the latter drew on historical data related to the biosphere to infer past climate—for example, paleoclimatology involving sedimentary pollen samples.
Since the impacts of climate disruption increasingly manifest in locally specific ways, we coded the studies for connections across spatial scales. Fifteen studies focused on local phenomena only, while the remaining studies included some combination of local, regional, and/or global scales (Table 3).
To understand the pedagogical framing used in the studies, we coded for the type of student activity described, the materials with which students engaged, and the degree of inquiry described.
Although only 14 studies explicitly described their instructional purpose as engaging students in open-ended exploration (see Figure 3), we found that 35 of the studies in the corpus included student activities that could be coded as such. Such explorations were typically undertaken to familiarize students with a new phenomenon or system (as distinct from observation for the purposes of data-gathering; Figure 6). For example, Ellwein’s students explored long-term data sets on daily weather events and seasonal sightings of mammals and birds to familiarize themselves with the data structure before looking for patterns (Ellwein et al., 2014).
Sixty-two studies involved structured investigations in which students made observations to gather data, such as measuring air temperature to study urban heat islands (Thiessen, 2008), and 73 studies involved students analyzing data to look for patterns (e.g., Clayton & Gautier, 2006; Comrie, 2000). Students participated in discussion in 73 studies, typically to agree on or make sense of investigation outcomes (e.g., Banschbach & Letovsky, 2010; Koretsky et al., 2012), to make sense of readings (e.g., Clary & Wanderzee, 2012), to engage in scientific argumentation (e.g., Gautier, 2012; Wyner, 2013), or to arrive at common learning outcomes (e.g., Dunnivant et al., 2000). Only five of the studies included true experimentation, in which students designed and conducted controlled studies (“fair tests”).
In 38 studies, the instructor provided a model that students evaluated, tested, or verified, while in 16 studies, students built or revised qualitative or quantitative models they designed or constructed themselves. In approximately half of these 54 studies, students worked with qualitative and quantitative data they collected themselves, while the instructor provided data in the other half.
Not surprisingly, instructors continue to provide print materials (49 studies), but almost as many studies included multimedia (47) or computer-based instructional visualizations or modeling software (36; Figure 7). Aligned with the focus on local and regional phenomena (see Table 3), 25 studies engaged students with field systems (e.g., surveying land cover or measuring biodiversity). Students worked with biological material in the lab in only five studies.
To characterize inquiry, we rated each study on a Likert scale from 1 (student) to 5 (instructor) according to who (i) formulated the investigation question, (ii) chose the methods for the investigation, and (iii) determined what sense could or should be made of lab outcomes (“knowledge construction”; Puttick et al., 2015; Chinn & Malhotra, 2002).
Student-directed inquiry was uncommon, with students in only two studies fully responsible for all three inquiry processes (Table 4). Fewer than one quarter (18) of the studies indicated that students had some degree of responsibility for an inquiry question, while instructors determined the investigation question in two thirds of the studies (63). Fifty-two studies indicated that both students and instructors contributed to the design of an investigation. More than two thirds of the studies (72) indicated that instructors determined the learning outcomes, but students took some responsibility for constructing this knowledge. Just two studies clearly prescribed what the right answer should be at the conclusion of an “investigation.”
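Summaries like those above can be produced by counting, for each inquiry dimension, how many studies fall on the student side of the scale (ratings 1–2), at the midpoint (3, shared responsibility), or on the instructor side (4–5). The records and thresholds below are hypothetical, meant only to illustrate the tallying, not to reproduce the study’s coding:

```python
# Each study is rated 1 (student-directed) to 5 (instructor-directed)
# on three inquiry dimensions; these records are invented for illustration.
studies = [
    {"question": 5, "methods": 3, "knowledge": 2},
    {"question": 4, "methods": 4, "knowledge": 3},
    {"question": 1, "methods": 2, "knowledge": 1},
]

def side_counts(studies, dim):
    """Count studies on the student (1-2), shared (3), and instructor (4-5) side."""
    student = sum(1 for s in studies if s[dim] <= 2)
    shared = sum(1 for s in studies if s[dim] == 3)
    instructor = sum(1 for s in studies if s[dim] >= 4)
    return student, shared, instructor

for dim in ("question", "methods", "knowledge"):
    print(dim, side_counts(studies, dim))
```

Tallying the three dimensions separately, as here, is what makes it possible to say that instructors dominated question-posing while responsibility for sense-making was more often shared.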
Six of 18 place-based studies focused on the local impacts of climate change, while five connected local to regional and global impacts. Three studies focused on some aspect of biological species loss and mitigation, while one focused specifically on the social impacts of climate change. Finally, three did not focus on climate change, but instead focused on the water cycle, biodiversity, and natural history and biological conservation, respectively.
The 23 sustainability studies coded for “energy consumption” used this topic as the “hook” for teaching about some aspect of climate change (e.g., emissions, impacts on human society, mitigation, or adaptation). In the 11 studies coded for “carbon footprint,” students tracked their own activities and calculated their carbon output online; instructors in these studies similarly used the activity to teach about climate change.
The 36 studies related to human impacts and actions with respect to climate change (Figure 8) included some combination of mitigation, adaptation, and social impacts.
Given the emphasis on student learning as an instructional purpose (Figure 3), student knowledge was, not surprisingly, the most common outcome reported (74 studies; Figure 9). Scientific reasoning was reported in more than half (51) of the studies, while 69 also focused on affective outcomes, reporting on student interest, curiosity, or motivation. In addition, 26 reported changes in students’ attitudes toward science, and 14 noted changes in students’ attitudes toward the environment. Given the studies’ prominent emphasis on trying “novel teaching approaches” (see Figure 3), it was surprising that only 26 reported on students’ perceptions of their learning environment.
We were struck by the large number of studies (49) that reported revamping semester-long courses or designing new courses entirely (see Figure 2). Since active learning is intended to build student engagement, we were interested in the extent to which the studies paid attention to this dimension. Students’ confidence in science and students’ interest and curiosity were the outcomes reported most frequently (Figure 10), while agency and self-efficacy and persistence and motivation were each reported in fewer than 10 studies. Career interest and engagement were only reported in three studies each.
Figure 10. Student affective outcomes reported in semester-long courses (n = 49). Note. Eleven studies did not report affective outcomes.
In general, our evidence suggests that IHE instructors are attempting to move beyond didactic practices by incorporating new pedagogical methods into existing courses they teach or by revamping whole courses at the departmental level. However, the extent to which study authors explicitly link engagement and learning outcomes is mixed. While authors express the goal of adopting active-learning pedagogies, their theories of change are mostly unexamined.
Many authors mention inquiry as a general stance toward instruction in their study introductions, but there is little evidence that inquiry pedagogies are used. The instructor sets the investigation question in the large majority of cases (see Table 4). Students participate in determining investigation methods and in “sense-making” in a higher proportion of studies—but the instructor still co-determines these facets of inquiry. Moreover, the frequency of both “fair test” designs and “student-constructed models” is extremely low (see Figure 6). Therefore, it is reasonable to conjecture that student freedom to inquire—to make mistakes in investigations and to debug them—is largely constrained.
Relatedly, although about one third of studies describe teaching process of science as a goal, the evidence suggests that such processes are encountered in targeted and constrained contexts, such as in learning specific manipulative or analytic techniques (see Figure 3).
Moreover, although more than one third of the studies (35) engaged students in activities that we could code as open exploration, the actual curricular role of activities such as familiarizing students with phenomena was not clear. These activities presumably provide value in engaging students with the phenomena, but students did not then move on to asking questions or to collecting and analyzing data in a productive way.
However, there is promising evidence on two other counts: It is striking to note the extent to which student discussion is reported (81% of studies) and the high proportion of cases (63%) that included field work, laboratory activities, or computer modeling (see Figures 6 and 7). These data are consistent with a shift from lecture-centered instruction to a more multimodal approach in which more scope is provided for students’ intellectual and affective engagement.
The overwhelming majority of studies examined the introduction of new concepts or new teaching approaches. However, there are interesting questions that we cannot answer. For example, are there cases in which the introduction of new content is best served by changes in pedagogy as well? This might be the case when, as in about 30% of the studies, an important aim of the lesson was teaching science practices. Unfortunately, the research findings do not let us explore this avenue in any depth, as most studies were not designed to examine relationships between co-occurring innovations.
There was more focus on abiotic impacts (e.g., Earth’s energy balance, sea-level rise) than biotic ones. This may reflect the fact that the largest proportion of the studies were situated in geoscience and Earth science, but it is noteworthy that relatively few of the studies also addressed ecological or biological impacts, which potentially have more significance for humans. How might this lack of emphasis affect student engagement in introductory courses or the engagement of students interested in issues of environmental policy? An answer would require authors to report on the context (e.g., other content in the course or course progressions) within which the reported innovations take place.
Our data indicate that while authors generally frame their innovations in terms of addressing student engagement, the engagement measures they used were very “qualitative,” even anecdotal, and sometimes just impressions. Few authors used validated instruments, such as the attitude surveys and observational rubrics developed and tested for K–12 education. The generalizability of findings might be improved by collaborating with education researchers to design studies and research instruments.
We conclude by pointing out three gaps that we found surprising, but that have potential for future research. First, relatively few authors reported results relating directly to their impact on specific populations of interest or even provided information about student demographics at all. This finding aligns with those reported in a synthesis of the literature on race and ethnicity in science education (Parsons, 2014), which emphasizes the “invisibility” of race and ethnicity in science education scholarship.
Innovations will be better evaluated and understood when researchers focus on these dimensions. We recommend that our fellow researchers in the environmental sciences describe their student demographics and disaggregate their findings by race to consider how participants’ racial identities, as well as the persistence strategies they utilize, contribute to the interpretation of a study’s results. These would be small steps toward addressing the pressing issue that scholarship “should not only generate knowledge but should also transform science education to become more equitable and socially just” (Parsons, 2014, p. 182).
Second, and of particular importance for future progress in this field, we observed that few negative results are reported. Negative outcomes were mentioned in 14 of the 90 studies, all of which reported positive outcomes in greater depth. This is not uncommon in the research literature, but it is not desirable. As one of us has noted elsewhere, “While it is central to scientific progress to publish the results of successful attempts at establishment, it is also important for a field to recognize the frontiers of ignorance and failure, even in relatively mature areas of investigation” (Kotze et al., 2004, cited in Drayton & Primack, 2012, p. 299).
Third, in general, authors are not yet taking advantage of the extensive research methodologies in K–12 science education developed either for studying affective outcomes such as engagement and motivation or for understanding the nature and processes of science. By collaborating with educational researchers, IHE faculty could benefit in their research from good scholarship in science education research and, in the process, strengthen their own pedagogical repertoires.
Our study’s limitation included that it
Examining the state of our understanding about what works, for whom, and under what conditions is needed if STEM education in institutions of higher education is to be transformed rapidly and thoughtfully. Despite its limitations, this study makes an early contribution toward addressing that need. We believe that future research efforts will achieve greater impact if they attend to the gaps identified here, such as by specifying student demographics, reporting both positive and negative student outcomes, and pursuing cross-discipline collaborations between science faculty and educational researchers.
Gillian Puttick (gilly_puttick@terc.edu) is a senior scientist and Brian Drayton is co-director in the Center for School Reform, and Christina Silva is a research associate in the Education Research Collaborative, all at TERC in Cambridge, Massachusetts.