feature
A Multi-Class Evaluation
Journal of College Science Teaching—July/August 2020 (Volume 49, Issue 6)
By Gary D. Grossman and Troy N. Simon
Active learning is a pedagogical method focusing on student discovery and synthesis. Although active learning comprises a variety of techniques, all require use of the higher-order learning processes detailed in Bloom’s taxonomy of learning (McKeachie et al., 1987; Krathwohl, 2002), including: (1) formulating original hypotheses, (2) developing a structure for hypothesis testing, (3) testing the hypotheses, and (4) synthesizing from test results. Classroom exercises that follow this specific pattern may be considered “inquiry-based” (Wilke & Straits, 2005; Lord & Orkwiszewski, 2006; Gormally et al., 2009; Levy & Petrulis, 2012) and are particularly appropriate for organismal biology, ecology, evolution, and natural resource management (OBEENR) courses, where they provide examples of authentic learning (i.e., occupational tasks performed by professionals) (Bromham & Oprandi, 2006; Quardokus et al., 2012; Krämer et al., 2015; Grossman & Simon, 2018).
There is much evidence that active learning has a positive impact on students’ perceptions of a class, as well as on learning and synthetic thought (Minner et al., 2010; Freeman et al., 2014; Grossman & Simon, 2018; Styers et al., 2018), although there are few evaluations of active learning exercises in life science classes (for exceptions see Habron & Dann, 2002; Campo & Garcia-Vazquez, 2008; Derting & Ebert-May, 2010; Prins et al., 2011; Cleveland et al., 2017), especially across different classes or class formats (seminar versus lecture; Drinkwater et al., 2017) or at multiple academic levels (first-year university versus fourth-year versus graduate; Beck et al., 2014; Grossman & Simon, 2018). Although active learning exercises exist for a number of Science, Technology, Engineering, and Math (STEM) disciplines, they appear to be much more frequent in physics/chemistry/engineering classes than in OBEENR courses (Lai, 2018; Grossman & Simon, 2018). In addition, active learning exercises in STEM are much more common at the secondary rather than the university level, and for students in advanced courses within a STEM major rather than in general education STEM classes (Weasel & Finkel, 2016; Cleveland et al., 2017; Wiggins et al., 2017; Grossman & Simon, 2018). This is unfortunate, because there is substantial evidence that students are avoiding STEM fields and careers both nationally (Olson & Riordan, 2012) and internationally (Han, 2017).
One of the main problems inhibiting the development of many authentic, active learning exercises for OBEENR courses is the frequent need for live study subjects, especially animals. The use of live animals in university courses typically requires substantial husbandry resources, as well as approval by an Institutional Animal Care and Use Committee (IACUC). However, such constraints may be obviated via the use of video footage of animals behaving naturally in their typical (or atypical) habitats. Such footage is becoming more commonly available on free internet platforms such as YouTube and represents Open Educational Resources (OERs) that are useful for the development of authentic, active learning exercises for university and graduate students (Parisky & Boulay, 2013; Grossman & Chernoff, 2018).
In this paper we describe students’ attitudinal responses to several authentic, active learning exercises in two OBEENR courses, both designed for nonscience majors. Active learning exercises for STEM courses designed for nonscience majors are even less common than those developed for science-major classes (i.e., third- and fourth-year OBEENR courses with accompanying laboratories; Wiggins et al., 2017; Grossman & Simon, 2018), which is unfortunate, given that nonmajor courses may act to attract new students to the sciences. Using the design of Grossman and Simon (2018), we quantified the characteristics of students within and among classes, as well as their attitudinal responses to an authentic, active learning exercise, in two annual sections of a nonscience-major course, “The Natural History of Georgia” (FANR, Fall 2015 and Fall 2016), and three first-year seminar classes, “The Natural Environment of Athens and Georgia” (FYO, Spring 2015, Spring 2016, and Fall 2016); henceforth FANR 2015, FANR 2016, FYO Spring 2015, FYO Spring 2016, and FYO Fall 2016. We used Likert-scale questionnaires and structured triangulation interviews to quantify the positive and negative responses of students to the various aspects of the exercise. In addition, we determined whether the distribution of students among undergraduate majors differed within classes and whether year in school influenced student responses (FANR only). Finally, we recorded students’ preferences for various parts of the exercise. This study was conducted under the approval of the Institutional Review Board of the University of Georgia.
We used five OERs as the basis for the active learning exercises: two of feeding birds and three of fishes behaving naturally in a stream environment. The first two were 15- and 22-minute videos of undisturbed birds feeding at a bird feeder in Athens, Georgia (garydg29, 2017a, 2017b), and the remaining three were: (1) a 10-minute video of a group of juvenile Chinook salmon interacting in the Chena River, Alaska (naturalhistorygeorgia, 2015), (2) a 15-minute video of a group of Arctic grayling (Thymallus arcticus) in the Richardson Clearwater River in central Alaska (naturalhistorygeorgia, 2016), and (3) a six-minute video depicting two male gilt darters battling over a potential reproductive territory in a North Carolina stream (Crail, 2010). All videos exemplified natural behavior of animals in situ. The rubric used for the gilt darter video (the most commonly used video; FANR 2016, FYO Spring 2016, and FYO Fall 2016) gave students guidance on what was expected in the exercise and included grading criteria (Table 1).
Table 1. Active learning rubric for Natural History of Georgia, 2016.
The learning objectives for the exercises were based on activities described in the rubric (Table 1), including: (1) perform quantitative and qualitative observations on biological subjects, (2) generate scientific hypotheses from observations, (3) perform exploratory data analysis, (4) evaluate hypotheses, (5) reach conclusions based on findings, and (6) represent findings in writing (a paper with 2–10 pages of text, not including tables, figures, or references). These objectives were satisfied by having students: (1) observe and classify behaviors performed by the fish, including a description of how each behavior started and ended, (2) tabulate the frequencies with which specific behaviors were performed, (3) identify behavioral sequences (where three or more behaviors were performed in a sequence), (4) present and analyze the data in simple form, and (5) synthesize, via discussion, what was observed and its adaptive significance.
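To make the tabulation and sequence-identification steps concrete, the following Python sketch shows one way a student might summarize a timed observation log from a video; the behavior names and timestamps are hypothetical examples, not data from the study.

```python
# A minimal sketch (not from the paper) of tabulating behavior frequencies
# and three-behavior sequences from a timed observation log of a video.
# Behavior names and times are hypothetical.
from collections import Counter

# Each entry: (minutes:seconds into the video, behavior observed)
observations = [
    ("0:12", "forage"), ("0:40", "chase"), ("1:05", "display"),
    ("1:30", "forage"), ("2:02", "chase"), ("2:15", "display"),
    ("2:50", "retreat"),
]

behaviors = [b for _, b in observations]

# (2) Tabulate how often each behavior was performed
frequency = Counter(behaviors)
print(frequency)  # e.g., Counter({'forage': 2, 'chase': 2, 'display': 2, 'retreat': 1})

# (3) Identify behavioral sequences: runs of three consecutive behaviors
triplets = Counter(zip(behaviors, behaviors[1:], behaviors[2:]))
for seq, n in triplets.most_common(3):
    print(" -> ".join(seq), n)
```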
In the first trial of the exercise in a large lecture class (FANR 2015), students were provided with a rubric that did not include possible behaviors to observe. However, given that the majority of students had little familiarity with biological research, there was some confusion regarding what constituted a “behavior” and other aspects of the exercise (e.g., how data should be presented and analyzed); consequently, a more detailed rubric was developed that contained a list and descriptions of specific behaviors that might be observed, along with grading expectations (Table 1). This rubric was used in FANR 2016 and in FYO Spring and Fall 2016.
We assessed student responses to the exercises via Likert-scale questionnaires and triangulation interviews. Student questionnaires ranged from seven to 12 questions per class and encompassed questions regarding students’ attitudes, comprehension, and preferences regarding the assignment. The Likert scale for questionnaires consisted of four potential answers: (1) completely true, (2) somewhat true, (3) completely false, and (4) somewhat false. For ease of analysis and interpretation, we classified both “true” responses as positive and both “false” responses as negative, unless the question was phrased in a negative manner, in which case the classifications were reversed. For example, responses to the question “The active learning assignment was a waste of my time” were interpreted as positive if students answered either completely or somewhat false, whereas they were considered negative if students responded completely or somewhat true. We also gathered demographic/attitudinal information on students: (1) major or prospective major (life sciences including agriculture, business/political science, journalism, education, family/consumer science), (2) year in university (1st–2nd versus 3rd–5th), (3) previous experience with science classes (experience levels were: advanced [AP classes], good [multiple science classes], average [one science class], and weak [no science classes]), and (4) whether they had a favorite part of the exercise (observing animals in nature, collecting and analyzing data, researching background material, report writing, and no response). Statistical analysis involved simple frequentist statistics for differences in frequencies (i.e., χ² or Fisher exact tests) and one-tailed t-tests for two-category tests (more positive than negative responses). In all cases the null hypothesis was one of equivalence, and analyses were conducted on data from each class separately.
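As a rough illustration of these analyses (not the authors’ original analysis code), the Python sketch below recodes hypothetical Likert answers as positive or negative, runs a χ² goodness-of-fit test on counts of majors against equal expected frequencies, and performs a one-tailed t-test on per-student differences in positive versus negative responses. All data values and the exact form of the t-test are assumptions for illustration only.

```python
# A minimal sketch, assuming hypothetical data, of the kinds of tests described above.
import numpy as np
from scipy import stats

# Recode Likert answers: "true" responses are positive unless the question
# was negatively phrased, in which case the coding is reversed.
def recode(answer, negatively_phrased=False):
    positive = answer in ("completely true", "somewhat true")
    return (not positive) if negatively_phrased else positive

print(recode("completely false", negatively_phrased=True))  # True: counted as positive

# Hypothetical per-student counts of positive and negative questionnaire responses
positives = np.array([8, 7, 9, 6, 8, 7])
negatives = np.array([2, 3, 1, 4, 2, 3])

# One-tailed t-test on the per-student difference: more positive than negative responses?
t, p = stats.ttest_1samp(positives - negatives, 0.0, alternative="greater")
print(f"t = {t:.2f}, one-tailed p = {p:.3f}")

# χ² goodness-of-fit: do observed counts of majors differ from equal expected counts?
observed_majors = np.array([17, 7, 5, 6, 3, 2])  # hypothetical counts per major category
chi2, p_chi = stats.chisquare(observed_majors)   # equal expected frequencies by default
print(f"chi-square = {chi2:.2f}, p = {p_chi:.3f}")
```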
After administration of the active learning exercise and Likert questionnaires in a class, we asked volunteers to participate in a short triangulation interview via telephone. These interviews were conducted by laboratory technicians rather than by Professor Grossman to reduce potential bias, and consisted of five questions that varied slightly among classes (i.e., the specific video used in a class was mentioned): (1) What was your favorite part of the video assignment?, (2) What was your least favorite part of the video assignment?, (3) What was the most important thing you learned from the video assignment?, (4) How could the video assignment be improved?, and (5) Any other comments? Representative comments from triangulation interviews are presented for each class, as well as a simplified trend analysis based on these comments. The trend analysis involved scoring comments using the method of Grossman and Simon (2018) for the presence of statements indicating that the exercise was: (1) enjoyable, (2) new (i.e., a type of assignment they had not previously encountered in course work), or (3) a form of deeper learning (see Online Supplemental Materials). We then calculated the percentage of interviewees from each class who responded with these comments, as well as the number who responded more than once with one of these comments.
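As a simple illustration of this tally (not the authors’ scoring code), the sketch below computes the per-category percentages and the count of interviewees with more than one scored comment, using hypothetical coded interview data.

```python
# A minimal sketch of the trend-analysis tally described above, using hypothetical
# coded interview data (the actual coding followed Grossman & Simon, 2018).
coded_comments = {
    "student_1": ["new", "deeper"],
    "student_2": ["enjoyable"],
    "student_3": ["new", "new", "deeper"],
    "student_4": [],
}

n = len(coded_comments)
for category in ("enjoyable", "new", "deeper"):
    pct = 100 * sum(category in codes for codes in coded_comments.values()) / n
    print(f"{category}: {pct:.0f}% of interviewees")

# Interviewees who made more than one enjoyable/new/deeper comment overall
multi = sum(len(codes) > 1 for codes in coded_comments.values())
print(f"More than one such comment: {multi} of {n} interviewees")
```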
The five classes were similar in student composition, and in four of five classes there were no differences in the frequency of students in different majors (Table 2): FANR 2016: χ² = 7.06, df = 5, p = 0.216; FYO Spring 2015: χ² = 11.00, df = 5, p = 0.051; FYO Spring 2016: χ² = 1.49, df = 4, p = 0.828; FYO Fall 2016: χ² = 4.22, df = 4, p = 0.38. This characteristic differed significantly in FANR 2015 (χ² = 11.40, df = 5, p = 0.044), with far more business/engineering majors observed (n = 17) than expected (n = 7). FYO Spring 2015 also was significant at the 0.051 level, with the class composed entirely of life science and business/engineering majors. We obtained identical results for most classes when these categories were collapsed into science majors versus nonscience majors: FANR 2015: χ² = 17.63, df = 1, p < 0.001; FANR 2016: χ² = 3.20, df = 1, p = 0.073; FYO Spring 2015: χ² = 0.04, df = 1, p = 0.831; FYO Spring 2016: χ² = 0, df = 1, p = 1.0; FYO Fall 2016: χ² = 0.58, df = 1, p = 0.445. Only one class (FANR 2015) had students who reported differing levels of experience in science courses. However, we did not observe a significant difference in the number of positive responses to the questionnaire by students with these differing levels of experience (t = -0.485, df = 29, p = 0.631).
Table 2. Students’ majors or prospective majors.
Students responded very positively to the video active learning exercise, and in four of five classes there were significantly more positive than negative responses to questions on the exercise (Table 3): FANR 2016: t = 4.06, df = 23, p << 0.001; FYO Spring 2015: t = 17.75, df = 10, p << 0.001; FYO Spring 2016: t = 4.66, df = 13, p << 0.001; and FYO Fall 2016: t = 2.53, df = 13, p < 0.050. The sole exception was FANR 2015, the first lecture class in which the exercise was used (t = 1.07, df = 39, p = 0.291). Nonetheless, a majority of students (55%) in FANR 2015 indicated that they felt having a chance to do scientific research in class was interesting. Year in school had a slight but significant effect on the number of positive responses by students in the two classes with enrollees from different levels (FANR 2015 and 2016), although these results were the opposite of one another. In FANR 2015, years 1–2 gave significantly more positive answers than years 3+ (years 1–2 mean [SE] = 5.12 [0.69]; years 3+ mean [SE] = 3.31 [0.56]; t = 2.044, df = 38, p = 0.048), whereas in FANR 2016, years 1–2 gave significantly fewer positive answers than years 3+ (years 1–2 mean [SE] = 6.53 [0.71]; years 3+ mean [SE] = 8.71 [0.71]; t = -2.17, df = 17, p = 0.045).
Table 3. Student perceptions of the OER-based active learning video assignment.
Students in four of five classes did not display a preference for different aspects of the exercise (Table 4): (1) observing animals in nature, (2) collecting and analyzing data, (3) researching background material, (4) report writing, and (5) no response. FANR 2015: χ² = 2.578, df = 3, p = 0.461; FYO Spring 2015: χ² = 6.294, df = 4, p = 0.178; FYO Spring 2016: χ² = 5.466, df = 4, p = 0.243; FYO Fall 2016: χ² = 5.236, df = 3, p = 0.155. Students in FANR 2016 did, however, show a preference for specific aspects of the project (χ² = 10.279, df = 4, p = 0.036), with far more students preferring “observing animals in nature” (13 observed versus 4.8 expected) and far fewer students preferring “report writing” (none observed versus 4.8 expected).
Table 4. Students’ preferences for specific aspects of the active learning exercise.
Student responses to the five triangulation interview questions indicated substantial satisfaction and interest in the active learning exercises (see Online Supplemental Materials). Due to an oversight, we neglected to conduct triangulation interviews in the first class in which the exercise was used (a seminar class, FYO Spring 2015); however, students in the next class, a lecture class (FANR 2015), made multiple comments on the need for more direction from both the rubric and the professor (see Online Supplemental Materials). Consequently, we added substantial detail to the rubric (i.e., potential behaviors to observe, grading expectations, and linking observations to natural selection; Table 1). Scoring answers from triangulation interviews substantiated that a majority of students found the active learning exercise to be new and to represent deeper learning (Table 5), although a majority felt it was enjoyable only in FYO Spring 2016. The high percentage (41–67%; Table 5) of students who found the exercise new or representative of deeper learning demonstrates the value of such exercises in undergraduate OBEENR classes. Finally, the percentage of students making more than one comment of enjoyable, new, or deeper learning ranged from 47% to 100%, again indicating the exercise was highly satisfactory.
Table 5. Trend analysis from triangulation interviews.
Active learning exercises may result in greater retention and learning in university STEM classes (Freeman et al., 2014; Styers et al., 2018), yet there are few specific exercises developed for university-level OBEENR classes for either majors or nonmajors, and even fewer evaluated over multiple courses and student levels. We have shown that student responses to video OER-based, authentic (i.e., observing animals interacting with conspecifics and their environment), active learning exercises were strongly positive in four of five classes representing multiple academic levels, and that the exercises provided new and deeper learning experiences for most students (i.e., enhancing creativity, formulating hypotheses, collecting and analyzing data, synthesizing information, evaluating hypotheses and conclusions). These conclusions are supported by both triangulation interview and Likert-scale questionnaire responses, in which a majority of students in three of four classes answered positively to questions involving creativity and synthesis. Even students in FANR 2015, the one class where students failed to report significantly more positive than negative answers to the questionnaire, reported many positive experiences from the exercise in triangulation interviews, commenting on both its newness and deeper learning. In addition, 55% of students in FANR 2015 answered positively to the question “Having a chance to do scientific research was interesting” and 51% responded positively to the question “The skills I learned in completing the active learning project are useful.” Consequently, these exercises should provide very positive learning experiences for students in similar classes, and may also serve to attract new students to STEM majors. It also is clear that the exercises represented “authentic” learning, given that a majority of students in all five classes responded that the skills learned in the assignment were “useful.” Finally, although our data only show a strong positive attitudinal response to the active learning exercises, Han (2017) has shown that such responses elicit increased student interest in pursuing STEM education at the university level.
Our data elucidated several additional aspects of student responses. Although the exercise was tested in five classes, the distribution of students among majors did not differ within (hence among) classes, with two slight exceptions (FANR 2015, p = 0.044; FYO Spring 2015, p = 0.051), indicating a reasonable level of homogeneity within the test populations. In addition, demonstrating a strong positive student response to an exercise in classes with distinct student populations supports the contention that the exercise has broad utility and appeal. In general, students had no preference for different aspects of the exercise, and students with differing levels of previous scientific experience also did not respond differentially to the questionnaire. Not surprisingly, we did observe several contradictory patterns in students’ questionnaire responses. Seventy-one percent of students in FYO Fall 2016 felt the assignment was too difficult, yet most responses to other questions were strongly positive and 82% responded that the exercise was not a waste of their time. The effect of year in school on the frequency of positive questionnaire responses was also contradictory, with first- and second-year students in FANR 2015 showing significantly more positive responses to the questionnaire than third-year (plus) students, whereas the opposite was true for FANR 2016. We originally hypothesized that more experienced students would show a higher frequency of positive responses to an authentic, active learning exercise, given their greater familiarity with university-level course work, especially writing research papers, but such was not the case. Of course, our sample size (n = 2) was low.
There is some controversy over whether the positive student responses to active learning exercises are a function of instructor quality (i.e., instructors interested in innovative pedagogical techniques likely also are better instructors as a whole; Andrews et al., 2011), rather than responses to the exercises themselves, and it is clear that multiple factors including instructor training likely affect student outcomes in active learning situations (Andrews & Lemons, 2015; Cleveland et al., 2017; Tomkin et al., 2019). In addition, faculty who use active learning exercises at the university level do not fall into a single mold of instructional approaches (Smith et al., 2014). In contrast to Andrews et al. (2011), Cleveland et al. (2017) found that instructor training/experience did not affect student outcomes in a comparison of two instructors’ performance in an introductory cellular and molecular biology course. Nonetheless, Andrews and Lemons (2015) demonstrated that instructor experience and collegial support affected instructor performance when employing active learning case study exercises. Tomkin et al. (2019) also found that peer support positively impacted instructor performance in the successful application of active learning techniques. In our study, only one instructor (GDG) taught all five courses; hence, it is impossible to separate the impact of the instructor from the effect of the exercise itself. In conclusion, these findings all suggest that more work is needed before the effects of class composition and level, instructor ability and training, personal beliefs, interpersonal skills, and workplace support on active learning outcomes may be disentangled.
Given that the active learning exercises were designed for non-STEM-major courses, three issues arise. First, given that the exercises were designed to represent “authentic” learning experiences (i.e., those that a professional biologist would perform during their career), they should serve to pique, and hopefully increase, non-STEM majors’ interest in biological or natural resource management careers. Second, although there were several biology and natural resource management students in both FANR and FYO courses (GDG personal observation), it is unclear how majors in these fields will respond to the exercises. The fact that the exercises represent authentic learning should increase the probability that they will be viewed positively by biology, ecology, and natural resource management majors in future coursework. Finally, we considered student comments that the exercises were “new” to be positive responses, and a high percentage of students volunteering for triangulation interviews responded in this manner (i.e., 60–100%). Certainly, some of these responses were a consequence of students’ lack of experience with the scientific process, and it is possible that these percentages would decrease in courses specifically designed for biological science and natural resource management majors.
An additional advantage of the active learning exercises is that they are based on video OERs, and hence do not require additional training and logistical support that would be necessary if live animals were used. Grossman and Chernoff (2018) have argued that video OERs provide excellent potential resources for active or passive learning exercises in OBEENR classes. Such resources may become especially important in the near future, as species decline due to external factors such as anthropogenic habitat degradation and climate change. Video OERs have not been commonly used in OBEENR classes, but see Pal and Panigrahi (2013) and Scanlon (2012) for earlier examples.
Our goal for the active learning exercises was to stimulate scientific thinking and the performance of a scientific study/exercise by nonscience majors; consequently, little emphasis was placed on the biological reality or scientific accuracy of the conclusions. Instead, we focused on eliciting higher-order learning behaviors from students, as per Bloom’s Pyramid (Krathwohl, 2002). Some students, especially those with little experience in science or active learning, were uncomfortable with the assignment (GDG personal observation), and this was represented both by the number of students who indicated they would rather have an exam than the active learning assignment and by several comments in triangulation interviews. It is difficult to determine whether this was just a reaction to an unfamiliar, and admittedly difficult, task or a real preference for passive versus active learning evaluative instruments.
The development of more effective active learning exercises should act to improve student learning outcomes in both STEM and OBEENR. The use of video OERs should facilitate development of new exercises at a time when natural habitats are increasingly affected by anthropogenic change. It is our hope that the exercises proposed and evaluated in this paper will aid in the development of active learning exercises in the sciences.
Gary D. Grossman (grossman@uga.edu) is a professor of animal ecology in the D. B. Warnell School of Forestry and Natural Resources at the University of Georgia in Athens, Georgia, and a fellow of the American Fisheries Society and the Linnean Society of London. Troy N. Simon is a research professional in the D. B. Warnell School of Forestry and Natural Resources at the University of Georgia in Athens, Georgia.
Keywords: Biology, Instructional Materials, Teacher Preparation, Postsecondary