A guest post by Cindy Hoisington (choisington@edc.org), an early childhood science educator and researcher at Education Development Center Inc. in Waltham MA; Regan Vidiksis, a researcher at Education Development Center with a focus on STEM teaching and learning in early education environments; and Sarah Nixon Gerard, an education researcher at SRI Education with a focus on early learning. Welcome Cindy, Regan, and Sarah!
Chances are, since you are reading this blog, you know how important early science experiences are for children’s future learning and achievement. You know they promote children’s critical thinking, collaboration, communication, and creativity; nurture children’s interests in science; and fuel their developing science identities (Center for Childhood Creativity, 2018). But are you sometimes challenged to figure out what children, especially children between the ages of 3 and 8, are actually gaining from the science experiences you provide in your early childhood (EC) setting? Assessing young children’s learning in science can be a complex process, especially in physical science—a domain that many teachers are less familiar with than life science.
The Cat in the Hat Study
As part of a team of Early Childhood researchers at Education Development Center (EDC) and SRI Education, we had the opportunity to take a deep dive into EC science assessment during a study of physical science and engineering resources associated with the PBS KIDS multi-platform media property, The Cat in the Hat Knows a Lot About That!™ (Cat in the Hat). PBS, in partnership with the Corporation for Public Broadcasting (CPB), developed the resources (including videos, digital games, and offline activities) under the 2015–2020 Ready To Learn Initiative, which is funded through the U.S. Department of Education. Our study (Grindal et al., 2019) was designed to explore how providing families with access to Cat in the Hat videos and digital games, along with hands-on activities, might support 4- and 5-year-old children's learning in the areas of physical science and engineering. We think that what we learned as we developed the Cat in the Hat assessment tasks can help you think about how and why you assess children's science learning in your EC setting.
The Cat in the Hat Assessment Tasks
We developed three individual assessment tasks—referred to here as Bridges, Slides, and Sorting, but more formally called “The Hands-On Preschool Assessments of Physical Science and Engineering”—to measure children’s learning in relation to three distinct sets of Cat in the Hat resources. Since all of the Cat in the Hat resources were developed to align with the Next Generation Science Standards (NGSS Lead States, 2013), we first analyzed the resources. We wanted to ensure that the assessment tasks would be aligned with the same NGSS Disciplinary Core Ideas (DCIs) and Science and Engineering Practices (SEPs) emphasized in the videos, games, and activities that children would engage with during the study.
All three types of resources included in the study emphasize physical science concepts related to the DCIs PS1.A Different types of matter exist and Different properties are suited to different purposes and PS2.A Pushes and pulls have different strengths and directions, as well as a range of NGSS SEPs. The NGSS SEPs are the activities that scientists and engineers use as they do their work and that all children can engage in at developmentally appropriate levels. Our tasks emphasized Planning and Carrying Out Investigations, Using Mathematics and Computational Thinking, Analyzing and Interpreting Data, and Constructing Explanations and Designing Solutions. See Table 1 for a brief description of the assessment tasks and the DCIs and SEPs associated with each one. (To access a fuller description of each task, see the link at the end of this post.)
Table 1: Assessment Task Alignment to NGSS
The study included over 450 children and families in five locations across the United States. We found that access to the Cat in the Hat resources over an eight-week period substantively improved children’s understanding of:
how the properties of objects and materials (strength and length) and forces (pushes and pulls) contribute to the stability of structures (large effect size = 0.40); and
how the properties of materials (texture) and forces (friction) influence how objects move (large effect size = 0.33).
Effect sizes for the Cat in the Hat resources across the four assessments ranged from small to large (0.11 to 0.40). You can view a summary of the study report here.
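For readers curious about what numbers like 0.40 and 0.33 represent, effect sizes of this kind are typically standardized mean differences: the gap between the treatment and comparison groups' scores divided by a pooled standard deviation. The sketch below shows the basic calculation in Python under that assumption; the scores are made up, and the study's actual analysis was more sophisticated than this.

```python
import statistics

def cohens_d(treatment_scores, control_scores):
    """Standardized mean difference (Cohen's d) using a pooled standard deviation.

    Illustrative only: the published study used a more sophisticated analytic
    model, but effect sizes like those reported are on this general scale.
    """
    n1, n2 = len(treatment_scores), len(control_scores)
    m1, m2 = statistics.mean(treatment_scores), statistics.mean(control_scores)
    v1, v2 = statistics.variance(treatment_scores), statistics.variance(control_scores)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled_sd

# Hypothetical post-assessment scores (0-10 scale), NOT data from the study
cat_in_the_hat_group = [7, 8, 6, 9, 7, 8, 6, 7]
comparison_group = [6, 7, 5, 7, 6, 6, 5, 7]
print(f"Effect size (Cohen's d): {cohens_d(cat_in_the_hat_group, comparison_group):.2f}")
```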
In this study we used performance-based assessment to measure children's science knowledge and skills in order to test the effectiveness of the Cat in the Hat resources. Performance-based assessment can also help educators uncover children's knowledge and skills at different points in time and in a way that enables them to plan relevant and responsive curriculum. What we learned in the Cat in the Hat study may be helpful to you in developing assessments in physical science for the preschool children at your setting.
1. Start with the Next Generation Science Standards (NGSS)
It is not by chance that the Cat in the Hat resources and our assessment both align with the NGSS. These standards represent the most current and comprehensive science standards available. They make it clear that, in order to learn science, children need many opportunities to DO science and to communicate what they are doing, noticing, and thinking about with interested adults. Children need teachers who facilitate their direct explorations, draw out their emerging science ideas (whether or not they are scientifically correct), and use those ideas as launching pads for further exploration. The NGSS also emphasize the close relationships between and among the STEM disciplines (science, technology, engineering, and mathematics). As an early childhood teacher in PreK-Grade 2, you can use the NGSS to identify the important science "big ideas" appropriate for your children to be working toward. Although it is important not to use performance expectations beyond children's current grade levels, all young children are developing an understanding of physical science concepts from an early age. For example, Structure and Properties of Matter does not come up in the NGSS until Grade 2, but all teachers know that children are building their understanding of properties of objects (color, size, shape) and materials (texture, hardness, flexibility) from a very young age. PreK teachers can view state or local standards aligned to NGSS or use other resources that make NGSS connections such as the NSTA Position Statement on Early Childhood Science Education, the NSTA Position Statement on Elementary School Science, or the "Early Years" column in NSTA's Science and Children journal. What the NGSS don't tell you is exactly what to teach.
2. Choose a Topic of Study
During our study, families and children engaged with Cat in the Hat resources—including videos, interactive digital games, and hands-on activities—centered on specific topics. When you engage children in topics of study (for example, light, sound, structures, bridges, ramps, water) in your EC setting over a period of weeks, they have opportunities to gain deeper conceptual understanding of relevant concepts and practice doing science. In a study of bridges, for example, you can support children as they ask investigable questions (“What building materials make the strongest, most stable bridge?”); plan and carry out investigations (such as building bridges out of a variety of blocks and other materials and testing them by adding toy cars and other vehicles); and construct evidence-based explanations (“Heavy, hard blocks work best at the bottom of a bridge because they are strong enough to hold the bridge up.”). Questions to consider when choosing topics of study include: Are there important science concepts connected to the topic and are they developmentally appropriate? Does the topic provide many opportunities for direct investigation? Can children explore this topic from a variety of perspectives? Does the topic connect to children’s everyday lives and experiences? Is the topic interesting and engaging to children and teachers? (Worth, 2010).
The topic of Bridges, for example, meets all of these criteria in the following ways:
It connects to physical science core ideas related to Structure and Properties of Matter and Forces and Motion. Children experience how pushes and pulls act on the different types of bridges they build using different objects, materials, and designs.
It incorporates investigations of many different types, styles, and designs of bridges.
It connects to children’s experiences with bridges in the real world.
It includes many different types of objects and materials, and opportunities to design, build, create, and problem-solve.
The topic of Ramps also meets these criteria. It connects to physical science core ideas related to Structure and Properties of Matter and Forces and Motion. Children experience how objects roll, slide, or stop on ramps with different inclines and textures and engineer ramp systems. As an added bonus, physical science topics—including Bridges and Ramps—lend themselves well to engineering challenges. Questions such as, "How can you make a bridge as long as your arm that will hold six cars?" or "How can you make a ramp system that will get a ball into a bucket on the other side of the room?" challenge children to apply their learning about properties of matter and pushes and pulls to develop a solution to a problem.
3. Plan Learning Goals and Connected Activities
As we developed assessment tasks for the Cat in the Hat study, we had to come up with indicators—what we expected children to do in each task that would represent an understanding of the relevant NGSS core ideas and practices. For example, for the Slides task, we had to figure out how 4- to 5-year-old children might show and talk about their early understanding of the NGSS DCIs PS1.A Different properties of objects and materials make them suitable for different purposes and PS2.A Pushes and pulls can have different strengths and directions. We also had to decide how we thought children might demonstrate their ability to plan and carry out an investigation. We decided we would ask children to identify differences in how the slides felt to the touch (smooth, sticky, and rough); to predict which slide a toy figure would move down the fastest (the smooth slide); and to do a test to figure out which slide was “fastest” (test the toy figure on each slide).
Once you have decided on a topic of study that incorporates big science ideas, it is time to think about your specific learning goals, the learning activities you will provide to address them, and how you will assess children’s learning. It can be tempting to focus your goals and assessments on what children know. However, it is critical to incorporate goals and assessment tasks that enable children to show what they can do—how they go about looking for answers to questions and solutions to problems. With that in mind, sometimes teachers are tempted to use NGSS performance expectations (for instance, Kindergarten: K-PS2-1 Plan and conduct an investigation to compare the effects of different strengths or different directions of pushes and pulls on the motion of an object) as both the learning goal and the assessment indicator for single activities, but this represents a misunderstanding. The performance expectations tell you what you can expect children to know and be able to do by the end of a grade or grade band when they have reached mastery. They don’t elucidate the complex and often subtle progression of understanding and skills that occurs before children reach mastery. In the first activity of a Ramps study, for example, you might choose a goal such as “children will notice that changing the incline of the ramp changes how fast a ball rolls down it,” and you might expect them to try rolling balls on different ramps and noticing which ones go faster and farther. As the unit progresses, and children have more experience with investigating ramps, your goals will change to reflect their growing understanding of forces and motion and their use of NGSS Practices.
4. Use Performance-based Assessments
Once we had aligned the Cat in the Hat resources and the associated assessment tasks, we began thinking about what type of assessment would be most effective. We did a review of available EC science assessments and decided that a performance-based assessment would best meet our needs. This type of assessment is more developmentally appropriate than other types because it enables children to demonstrate what they know and what they can do in the context of play or exploration. An educational focus on 21st-century science skills favors performance-based tasks because they provide opportunities for children to apply and demonstrate critical thinking, problem solving, and innovation in action (Fadel et al., 2007). Other currently available science assessments for young children tend to prioritize content over practice, which can be problematic because knowing science concepts and knowing how to investigate science phenomena are inextricably linked. A benefit to using performance-based assessments at your setting is that they can be administered as part of the curriculum rather than separate from it. This is more developmentally appropriate for young children, who may be anxious in a test-like situation.
During a unit on ramps, for example, you could set up an activity in which children are asked to test a set of objects on a ramp, sort them into groups based on whether they roll or slide, and record their results on a chart. As children rotate through the activity, you or another adult can observe and record what they do and say. This allows you to collect data about the following (one simple way to capture these notes is sketched after the list):
What they understand about the concepts/words “roll” and “slide” (Which objects do they identify as rollers and sliders?)
What they understand about sorting objects (Do they sort all/some/none of the objects based on whether they roll and slide? Do they sort based on another characteristic?)
How they investigate the objects (Do they test all/some/none of the objects? Do they try placing the objects in different ways on the ramp before making a decision? Do they let objects go or push them down the ramp?)
How they record their results (Do they record all/some/none of the objects correctly? Does their recording match with how they grouped the objects?)
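If you keep anecdotal records digitally, the four kinds of evidence above can be captured in a very simple record like the sketch below. This is purely illustrative; the field names and example are hypothetical, not part of any published instrument or of the Cat in the Hat study.

```python
# A hypothetical observation record for one child at the ramp activity.
# Field names are illustrative; they are not part of any published instrument.
observation = {
    "child": "Child A",
    "identified_as_rollers": ["ball", "marker"],
    "identified_as_sliders": ["block", "book"],
    "objects_tested": ["ball", "block", "marker", "book"],
    "tried_different_orientations": True,   # e.g., laid the marker on its side first
    "released_or_pushed": "released",       # let objects go vs. pushed them down the ramp
    "recording_matches_sort": True,         # chart agrees with how the child grouped objects
    "notes": "Predicted the marker would slide, tested it, then moved it to the rollers.",
}

def quick_summary(obs):
    """Print a one-line summary a teacher could scan after the activity."""
    print(f'{obs["child"]}: tested {len(obs["objects_tested"])} objects; '
          f'tried different orientations: {obs["tried_different_orientations"]}; '
          f'recording matched sort: {obs["recording_matches_sort"]}')

quick_summary(observation)
```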
Performance-based assessments work particularly well to assess children’s learning in an ongoing way because they can be set up fairly easily at different points during a topic of study.
5. Construct Assessments that Work for All Children
As we developed the assessment tasks, we aimed to meet three additional criteria. We wanted to minimize the language load, measure children’s learning at different levels of mastery, and break tasks down to put the focus on NGSS Practices and to uncover increments of learning.
Minimize the Language Load
Minimizing how language-dependent the assessment was meant designing prompts that would enable children to demonstrate, point, and use body language as well as oral language to communicate. For example, in one part of the Sorting task, children are asked to show how to fix an incorrect sort. Instead of having to explain their response, they are asked to point to a picture card. In EC classrooms, there is sometimes a tendency to confuse language skills with cognitive skills and to focus on oral language as the primary way for children to communicate what they know. Performance-based assessments level the playing field somewhat for less-verbal children and Dual Language Learners, but it is important to create intentional opportunities for children to demonstrate knowledge nonverbally.
Measure Learning at Different Levels of Mastery
Measuring children’s knowledge and skills at different levels meant incorporating follow-up prompts when children either did not respond or gave an incomplete response to the question. For example, in one part of the Sorting task, children were asked to sort objects into three containers by shape. If the child did not respond, the assessor would place one round, one square, and one triangle-shaped object into each of the correct containers and repeat the prompt. In the classroom, you can make tasks more or less complex as needed. You can increase the complexity of the Ramps assessment task described above, for example, by including objects that both roll and slide, depending on how they are placed on the ramp, or by setting up the ramp so that sliding objects stop on the ramp. This would enable you to collect additional data about children’s understanding that some shapes roll or slide depending on how they are placed on the ramp, or that changing the incline can change an object’s motion. You could simplify the task by fixing the ramp in place or removing the recording component.
As we began to develop the assessment used in our Cat in the Hat study, we also decided that we would use objects, materials, and storylines that were similar to, but not the same as, the ones used in the Cat in the Hat resources. We wanted to avoid giving children the opportunity to merely copy what they had done in the games or observed in the videos, but instead to apply what they had learned to a new situation. Consider doing this in the classroom when you think children are ready. If children have mastered sorting, for example, consider asking them to sort a different set of objects than the ones they typically use. You might introduce buttons for sorting by color, shape, or size, or a set of play, art, and eating objects to sort by use. If children have difficulty sorting the new objects, you can infer they need more practice with familiar ones.
Break Tasks Down to Put the Focus on NGSS Practices and Uncover Increments of Learning
In the Cat in the Hat study's assessment tasks, we broke tasks down so we would be sure to uncover what children were able to do, even if they didn't know the content. For example, in the Slides task, children were asked to (1) predict which slide a toy character would slide down the fastest, (2) explain why they thought so, and (3) state or show how they might test which slide the character would slide down the fastest. After testing, children were asked how their prediction compared to what happened ("Is that what you thought would happen?"). Breaking down the components of the NGSS practice Planning and Carrying Out Investigations enabled us to discern which aspects of the practice children were able to engage with. Although a performance-based assessment in the classroom will likely not be as structured as the study's assessment, you can employ a similar strategy by probing children before and after they investigate. As a unit on Bridges progresses, for example, you might ask children to make and explain their predictions before they investigate. Their responses to these questions can give you a lot of information about what they understand about the properties of materials.
6. Go Forth and Assess!
The assessment tasks we developed were useful to us because they enabled children to talk about and show what they understood and were able to do in relation to the NGSS DCIs and SEPs emphasized in the Cat in the Hat resources. They allowed us to determine that the Cat in the Hat videos, games, and offline activities were effective in supporting children’s understanding of physical science core ideas and their ability to do science. The process of developing the tasks also extended our own understanding of the variety of ways in which children express their learning and the importance of creating intentional opportunities for children to demonstrate, as well as talk about, what they know and can do. If our assessment tasks are useful in relation to topics children are exploring at your EC setting, consider using them. Or better yet, use them as models for developing your own performance-based assessments! Remember that our assessments are not meant to be used to evaluate children’s learning in EC settings. Rather they are meant to be used formatively, to give you the information you need to plan curriculum that is relevant and responsive to what children already know and can do. There is a brief description of each of the tasks in the section that follows and a complete description of our assessments can be found here: http://cct.edc.org/rtl/data-collection-tools.
Hands-On Preschool Assessments of Physical Science and Engineering
Bridges (Length, Strength, and Stability): designed to assess a child’s understanding of how the properties of objects (such as size and shape) and materials (for instance, hardness and flexibility) make them suitable for building a bridge that can hold weight. Children were provided with a group of objects of different lengths and strengths, including an 8” piece of aluminum foil, 8” composition notebook cover (made of cardboard), 8” piece of laminated paper, 6” ruler, 6” piece of cardstock, 6” lasagna noodle, and a small toy car with a Duplo character driver. They were asked to investigate the bridge-building materials and figure out the most suitable object for building a bridge that could support the weight of the car and driver.
Slides (Surfaces and Friction): designed to assess a child's understanding of how the properties of materials and forces—friction in particular—influence movement on a slide. Children were provided with a toy Duplo character and three 12" wooden ramps, each with the same incline but with a different material stapled to its surface: rubber, felt, or steel wool. They were asked to describe the three different slide textures, and then to predict and justify which of these three slides would allow the toy character to slide down the fastest.
Sorting (Colors, Shapes, and Uses): designed to assess a child’s understanding of how different objects can be described and categorized based on their observable properties and common uses. Children were provided with 21 common childhood objects that incorporated a variety of different colors, shapes, and uses, including a square blue napkin, a round red plate, a plastic orange, a yellow felt triangle, a red triangle block, and a blue Cookie Monster character. Children were asked to identify similarities and differences among the color of the objects, sort the objects on the basis of shape (with picture cues), complete a sort based on use, and fix a sort based on color.
Fadel, C., Honey, M., & Pasnik, S. (2007). Assessment in the age of innovation. Education Week, 26(38), 34-40.
Grindal, T., Silander, M., Gerard, S., Maxon, T., Garcia, E., Hupert, N., Vahey, P., & Pasnik, S. (2019). Early science and engineering: The impact of The Cat in the Hat Knows a Lot About That! on learning. Education Development Center, Inc., & SRI International.
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. The National Academies Press.
High School Blog
The Vernier Go Direct EKG Sensor: The Heart in Action
The human heart has hidden treasures, In secret kept, in silence sealed; The thoughts, the hopes, the dreams, the pleasures, Whose charms were broken if revealed.
Or so wrote Charlotte Brontë in the poem Evening Solace. The warning of charms being broken if revealed, I'm guessing, is about the emotional content of the metaphorical heart, not its electrical activity measured in millivolts over time. So with that out of the way, let's jump in.
The EKG, or electrocardiogram, measures electrical activity during a heartbeat. Basically, the EKG (or ECG) provides a visual display of the heart's activity, somewhat the way an electrician uses a multimeter to diagnose a circuit. The EKG is part of the broader field of electrocardiography, which also includes the EKG sensor, the electrocardiogram itself, and of course the study of the EKG data.
Vernier Software and Technology makes two education-class EKG sensors. While the operation and form factor of both are the same, one is hardwired only, while the other transmits over Bluetooth or a wired connection. The added feature of running wireless is presumably why the Vernier Go Direct EKG sensor costs one dollar more than the standard EKG sensor.
Electrodes connecting the Vernier Go Direct EKG sensor to the student measure electrical voltage over time as the heart muscle beats, or goes through depolarization and repolarization, which are fancy words for when heart muscle cells shift their electrical charge distribution. The resulting electrocardiogram is a picture of the heart's electrical activity.
Three electrodes are needed to give the Vernier Go Direct EKG sensor enough information to make a picture of the heartbeat. The simplicity of the Vernier EKG Sensor allows for quick and minimally invasive connection between sensor and student. The sticky electrodes are attached one to each upper arm and one to the left wrist. That's it! Obviously, a formal medical EKG uses many more electrodes, often up to a dozen, but for educational purposes, the Vernier EKG Sensor does an amazing job.
There are news reports of people learning they have serious heart problems because their Apple Watch said so. But it is the software of the watch (or perhaps a connected computer somewhere else) that detected the arrhythmia and sent a warning that was displayed on the Apple Watch. It's amazing to think just how far our technology has come since the early days of EKGs. But it's not perfect, as cardiologists would like to remind us. Either way, it's a fascinating story how the Apple Watch ECG function actually works.
A wonderfully detailed history of events in the development of the EKG can be found at The ECG Library. Beginning in the year 1600, the list connects the EKG dots over the last 400 years. One of the years in particular caught my eye. According to the website, in 1949:
"Montana physician Norman Jeff Holter develops a 75 pound backpack that can record the ECG of the wearer and transmit the signal. His system, the Holter Monitor, is later greatly reduced in size, combined with tape / digital recording and used to record ambulatory ECGs. Holter NJ, Generelli JA. Remote recording of physiologic data by radio. Rocky Mountain Med J. 1949;747-751."
Vernier Go Direct EKG sensor
At the lower end of the Vernier Go Direct EKG sensor's skillset is its ability to measure and display the heart rate in beats per minute. This feature alone is useful in sports science and physiology classes, as well as athletic training. Because of the Vernier Go Direct EKG sensor's wireless Bluetooth connection to a phone, tablet, or computer, it's possible to use the EKG sensor in many more environments than traditional EKG machines. With a little care, the Vernier Go Direct EKG sensor can be worn during activity as long as it remains within Bluetooth range of the device running Vernier's software.
But the capabilities of the Vernier Go Direct EKG sensor go well beyond simply counting heartbeats; advanced work is possible as well, including electromyography. Vernier provides some sample lessons for the Vernier Go Direct EKG sensor, including an introduction to electromyography.
Basic electrocardiography is also well within the wheelhouse of the Vernier Go Direct EKG sensor. Vernier’s suggested list of Vernier Go Direct EKG sensor capabilities and labs includes:
Comparing and measuring students' electrocardiogram (EKG/ECG) waveforms.
Determining heart rate by examining the number of QRS waveforms in a series of electrocardiograms (EKG/ECGs); see the sketch after this list.
Studying the contractions of muscles (EMG) in the arm, leg, or jaw.
Correlating measurements of grip strength and electrical activity with muscle fatigue.
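Since determining heart rate from QRS waveforms is essentially a counting problem, here is a minimal sketch of the idea in Python. This is not Vernier's algorithm (their software handles the analysis for you); the function, threshold value, and sample trace below are all hypothetical, and real EKG software uses far more robust peak detection.

```python
# A minimal sketch of one way to estimate beats per minute from raw EKG
# voltages: count upward threshold crossings (R-peaks), ignoring crossings
# that fall inside a short refractory window. Hypothetical, not Vernier's code.

def heart_rate_bpm(samples_mv, sample_rate_hz, threshold_mv=0.6, refractory_s=0.25):
    """Estimate heart rate from a list of EKG voltages (millivolts)."""
    refractory_samples = int(refractory_s * sample_rate_hz)
    beat_count = 0
    last_beat_index = -refractory_samples
    for i in range(1, len(samples_mv)):
        crossed_upward = samples_mv[i - 1] < threshold_mv <= samples_mv[i]
        if crossed_upward and (i - last_beat_index) >= refractory_samples:
            beat_count += 1
            last_beat_index = i
    duration_s = len(samples_mv) / sample_rate_hz
    return 60.0 * beat_count / duration_s if duration_s > 0 else 0.0

# Hypothetical 8-second trace sampled at 200 Hz with a spike every 0.8 s (75 bpm)
rate_hz = 200
trace = [1.0 if i % 160 == 80 else 0.1 for i in range(8 * rate_hz)]
print(f"Estimated heart rate: {heart_rate_bpm(trace, rate_hz):.0f} bpm")
```

The same basic idea, done far more carefully, is what sits behind the beats-per-minute readout in classroom software and consumer devices alike.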
The front side of the Vernier Go Direct EKG sensor has a graphic showing the placement and color of the locations for the three leads. After several students missed the meaning of the graphic, I wrote the colors and lead placement on the back of the unit. Problem solved.
The Vernier Go Direct EKG sensor is pretty amazing given how fast this simple device can capture such a complex and magical action as the beating heart. So, unable to avoid it any longer, I just have to say that using the Vernier Go Direct EKG sensor really shows science is alive and well. And no charms were broken in the making of this blog post.
I want to know if there are ways to incorporate [science, technology, engineering, and mathematics (STEM)] into more or all subjects? How would a teacher begin to integrate English or social studies with STEM? —M, Arkansas
Children do not come to school with brains divided by subjects—we compartmentalize the subjects for administrative reasons. To help students become well-rounded I strongly believe that we should teach all subjects in an integrated manner. STEM attempts to bring together similar subjects that should rely on each other. However, we can’t even begin to teach these subjects without communication, also known as language arts. When you add social context, societal issues, ethics, and geography to STEM lessons, you have incorporated social studies. And don’t forget about the arts!
Here are some concrete ideas you may want to consider:
Foster written communication by incorporating reports and journaling activities in place of fill-in-the-blank worksheets. Reinforce verbal communication through discussion groups where students can use new terminology, brainstorm ideas, and share conclusions about data they collected. Teach students the dos and don’ts of slide show presentations and have them present research projects, lab results, pictorial essays, and more to the class.
To incorporate social studies, students can overlay data on maps, plan and discuss environmentally friendly development, and debate issues and ethics related to science and technology (e.g., where to place wind turbines, the use of pesticides or genetic manipulation, terraforming other planets, and more).
Throughout all of this you can encourage creativity by integrating art, design, music, and movement as methods of demonstrating understanding.
When you think about diversity, how does it show itself? When you stand before your students, do the faces looking back at you look like your own? Most likely, your answer is “no.” Classrooms and afterschool programs are becoming more culturally, ethnically, and linguistically diverse, which is leading to both challenges and opportunities for educators.
Often students and educators do not have the same cultural, ethnic, or social background. Why does it matter, and how can you bridge this disconnect? The cultural divide between educators and students can seriously impede teaching and learning. Educators use their own cultural and experiential filters to communicate instructional messages to students; in many instances, those filters are incompatible with the students’ cultural filters. Have you considered how your culture influences your students?
Culture plays a huge role in education and in everything we do. Culture shapes our values, beliefs, social interactions, worldview, and what we consider important. It also influences how we see ourselves, how we see our students, how we relate to them, what we teach, and how we teach. It is important to recognize that culture is central to teaching and learning, as it plays a key role in communicating and receiving information, and is vital in sparking interest and effectively engaging students in science, technology, engineering, and mathematics (STEM).
A way for educators to span the cultural divide between them and their students is to build bridges using culturally responsive practices and creating inclusive learning environments. Educators who approach science teaching and learning with a culturally responsive pedagogy (CRP) are effective in bridging that cultural divide. Culturally responsive educators challenge the stereotypical deficit thinking of diverse students (e.g. “culturally deficient,” “at risk,” “low-performing”) by considering cultural differences as assets, valuing students’ strengths and skills and acknowledging each student’s potential.
So how do we engage diverse students in culturally responsive and appropriate ways? Cultural responsiveness is about developing genuine and trusting relationships with students and validating their strengths and interests. A first step to develop these relationships is getting to know your students as individuals and learning about their culture. Find out their interests and the way they operate at home and in their community. This also will help you better understand how to connect STEM to their lives and make learning experiences relevant for them.
To learn about students’ culture, we first must understand our own culture and how it affects the way we relate to students. Educators often believe that they can be neutral and objective, but their life experiences and cultures (e.g., values, assumptions, and beliefs) impact how they relate to students. Sometimes these assumptions and beliefs manifest as implicit biases, which could negatively affect students. We all have implicit biases: attitudes or stereotypes that influence our understanding, actions, and decisions in an unconscious way. We need to examine our sociocultural identities and become aware of and challenge our unconscious biases to better understand ourselves and effectively communicate and work with students.
Another key aspect of CRP is creating an inviting, inclusive learning environment. In such an environment, students believe their contributions and perspectives are valued and respected. They feel that they belong. This positively impacts students’ interest and motivation in STEM.
The need to belong is a basic human need that influences our behavior and motivations. When students feel that they belong in the learning environment, they feel connected to and accepted by their peers and teachers. They feel validated in a way that is accepting and positive, which increases their engagement and motivation. In an inclusive learning environment, educators use teaching strategies that accommodate students’ needs in terms of learning styles, abilities, backgrounds, and experiences.
Analyzing the type of environment you created for your students is a great way to begin transitioning to an inclusive learning environment. For example, do your students feel empowered and capable of discussing concerns or challenges with you or their peers? How do your teaching practices foster a learning community in which each student is valued and considered? What strategies do you use to support diverse learners?
Culture is a key element of every learning environment. It shapes both the learning environment and the experience of each student. Cultural responsiveness is a powerful approach that allows educators to improve STEM engagement and equity. Whichever strategies you use to make the learning environment more inclusive, remember to remain sensitive to and flexible about the ways diverse students think, behave, and communicate. This will help create a supportive learning environment in which students are motivated to learn, and allows them to grow intellectually, socially, and emotionally.
Alicia Santiago, PhD, is a neurobiologist and a cultural and diversity consultant with more than 10 years of experience in informal science education. She collaborates with Twin Cities PBS to develop and implement innovative direct and mass media science and health education national-level programs for the Latinx community. She can be reached by e-mail at santiago554@gmail.com.