
Web Seminar: NSTA Science Update: Pulse of the Planet: The State of the Climate in 2019, September 18, 2019

Join us on Wednesday, September 18, from 7:00–8:00 p.m. Eastern Time to learn about the state of the climate.


 

Encouraging Students to Engage in Argument With Evidence

By Michelle Monk

Posted on 2019-09-17

When I first began to shift my curriculum to support the Next Generation Science Standards, I was a bit overwhelmed! And frankly, I am still overwhelmed on some days when I work with my students to support their productive struggle and allow them to “figure things out!” I used to hear requests from my students like “It would be so much easier if you would just tell us the answers!” or “Would you please just lecture to us?” They quickly learned that the answer to both of those requests was “You will gain so much more if you work to figure it out!” We (my students and I) all know now that the shift to incorporate the science practices as identified by the NGSS into our classroom has transformed our teaching and learning space.

One of the practices I initially struggled to include was arguing from evidence. What exactly does that mean? How do you get students to productively argue their findings? How do I ensure that all students are learning and sharing their ideas?

In my opinion, the practice of arguing from evidence simply means that students must use evidence (data) found in their experiments, a video, an article, or another learning activity to support a scientific claim through reasoning. In my classroom, students develop the Claim, Evidence, Reasoning (CER) framework after much discussion with a partner, then a group, then a whole class.

I never really felt successful leading a classroom discussion using evidence until I was introduced to the process of “productive talk” to support the learning and ideas developed by students. Productive talk uses leading questions to allow students to show what they understand about a concept. After I was introduced to the idea of Productive Science Talk through the Talk Activities Flow Chart and Talk Moves while participating in an IMSP I-STEM initiative created to fully investigate and support the shift to the NGSS in Illinois, I became much more confident. This program was funded by a Math and Science Partnership (MSP) grant in Illinois from July 2015-July 2017. Our facilitator, Nicole Vick, introduced us to these tools, and they provided the support I needed to determine which talking strategy to use in class activities. I determine my end goal and purpose within the lesson, and I match the activity to the strategy that will best support the learning objective. I try to use multiple strategies to support the students’ discussions and learning styles.

When students start talking, they start learning! I have seen a dramatic increase in learning and retention now that I have incorporated productive science talk and talk moves into our learning activities.

In the beginning of the year, I must scaffold the learning activities so that the data to support the claim are easy to identify. I routinely remind my students that their claim must be supported by sufficient and applicable evidence as they work to figure out the lesson.

We work hard in the beginning of the year to develop this skill in small groups. My go-to talk move to encourage students to find the appropriate information and share is “Time to think” followed by “Partner talk.” I use this to allow students to first develop their ideas solo, then with one other person, then discuss them with their group of four. I tell the students to talk with one of “their people.”

The next steps in the discussion are organic. You will “feel” the right time to move on to table discussions, and finally, a class discussion. Developing this practice as a team is essential to ensure students are confident going forward. As I move around the classroom, I hear students asking one another which pieces of data would best support their ideas. If it seems that a student is on the cusp of the right idea, I will ask them to explain themselves, and I will use another talk move, “Say more,” in which I ask them to expand on their idea. Often when prompted, the students will figure out the right information.

We routinely use a claim, evidence, reasoning pattern to ensure students have their ideas ready to present before a class discussion ensues. After students have reached consensus within their table groups, we hold classroom discussions. Students are more motivated to share their ideas if they are confident in their responses.

When I first tried this, I was most worried about buy-in and participation from my reluctant learners. I also worried about meeting the needs of my special-needs students while challenging the highest-achieving members of the classroom! I wondered how in the world could arguing from evidence support all learners.

To ensure participation by and support for everyone, we establish classroom norms during the first week of school. I have students participate in two or three learning activities before we take a break from biological concepts and figure out how to best learn in this way. To have a productive discussion before developing group norms, I have each student independently identify 1) what they did to support the group, 2) what someone else did to support the group, and 3) what they could have done better while in the group. Each table group of four students then shares their ideas and develops “group norms.”

I then lead a discussion to establish a set of class group norms. When doing this, we find that everyone wants equal say and everyone also wants the whole class to contribute! Some groups also identify the need to stay on task and respect the ideas of everyone. I have found that my special education students are able to contribute to the smaller groups with more confidence after they hear that their peers trust them—and need them—to share their ideas. Once they share in a small group, it becomes easier to share with a classroom full of peers!

Encouraging all students to identify the pieces of data that then support their claim from a learning activity has actually increased the engagement of students of all levels in my classroom! It is exciting to hear my upper-level students interact with the students who struggle. They work together, they begin to ask each other questions, and they are able to develop appropriate and accurate conclusions when they verbalize the claim. Hosting class discussions to encourage students to “take a stand” has made my classroom more engaging. When students learn to support their claim with evidence and share their ideas, I hope they eventually use this skill in other areas of their lives for the rest of their lives!

If you have any suggestions for other activities we can use to encourage class discussions so students can learn to argue with evidence, I would love to hear them!

Michelle Monk has taught high school biology for 22 years and is currently teaching Biology 1 and Anatomy and Physiology at Eureka High School in Eureka, Illinois. She previously taught at Spring Valley Hall High School and Tremont High School, where she also taught AP Biology in addition to Biology 1 and Anatomy. Monk earned a bachelor’s degree in biology education from Illinois State University in 1998 and a master’s degree in Educational Leadership from St. Xavier University in 2004. Her passion for helping students learn complex scientific concepts through collaboration is grounded in the idea that building relationships, followed by creating lessons with relevance, is essential to having rigor in the classroom.

 

 

 

Note: This article is featured in the September 2019 issue of Next Gen Navigator, a monthly e-newsletter from NSTA delivering information, insights, resources, and professional learning opportunities for science educators by science educators on the Next Generation Science Standards and three-dimensional instruction.  Click here to sign up to receive the Navigator every month.


The mission of NSTA is to promote excellence and innovation in science teaching and learning for all.


 

Arguing From Evidence to Discover the ‘Why’

By Rebecca Schumacher

Posted on 2019-09-17

In my science classroom, students look at evidence all the time. Sometimes it is in photos or videos; sometimes in charts and graphs; and sometimes we generate our own data through investigations. A more traditional approach previously used is asking for concrete answers, as in giving students a graph and asking, “How many deer were found in Cook County, Illinois, in 1967 versus 2017?” Now we are shifting our thinking and asking different, more open-ended questions.

As we transition our practices toward three-dimensional teaching and learning, we also adjust our student expectations. The practice of engaging in argument from evidence makes sense; however, once you explore the K–12 Framework for Science Education in depth, it starts to appear more intimidating. One of the best resources for keeping this practice attainable is having another person to work with, whether it is another science teacher, the ELA teacher, or even another teacher you connect with through social media such as a Facebook group member or Twitter follower. Keep reaching out to others to help share the process.

Luckily, middle school students are still eager to share their ideas. Once one of them starts talking, others can’t wait to chime in. At this age, though, it is very important to remind them to listen to one another, or more than one student will say the exact same thing another just said. Keep in mind that it takes practice; stick with it, even when it seems like the students are not understanding it, and keep pushing them to look deeper. Students are full of great insight, and we should give them a chance to show it.

Engaging in argument from evidence might also sound like a traditional debate team situation in which evidence is presented to win an argument. This isn’t quite the case in science classrooms. We aren’t trying to win; we are trying to learn. When the students can connect what we have done in more than one investigation to another related phenomenon or something from their everyday lives, then we have a successful argument using relevant evidence. We are trying to look at evidence and make sense of it, and share that understanding with others.

Let’s return to our earlier question about deer in Cook County. We can easily look at a graph and answer how many more deer were found in 2017 versus 1967. But now we want to know why: What could be happening? How do you know? These types of questions will drive the lesson in a more meaningful direction. Often we don’t know the initial reason why, but as the students begin to express ideas and have small-group discussions, they create new questions for us to investigate. This is also a prime time, however, to get off track if the groundwork hasn’t been laid.

One way to stay on track is to start the year by making a list of discussion norms. Many students bring their ELA discussion cues into the science classroom, which is a huge win for any teacher. They tend to generate norms like listening to one another and not telling one another that their ideas are wrong. We also make sure to reference the actual phenomena we are investigating because it steers the discussion in the right direction.

This is the perfect moment to employ talk moves. When we want the students to go a little deeper, we could ask, “Can you tell me more about that?” or “Who else can add to what was just said?” “Can someone else say this differently?” “Where does our evidence support this idea?” Many great resources for talk moves are available and can help generate a deep discussion.

Through my work with NGSS and Next Gen storylines, I was introduced to the Talk Science Primer from the TERC Inquiry Project. Its suggestions are so useful for getting students to delve deeper when arguing with evidence. For example, when a student makes an initial observation, the teacher could ask them to say more about it, or ask if someone else can say it differently. This keeps the students accountable. Are they really listening to one another behind those blank looks? Sometimes they give you the 100% right answer, but you don’t want to discourage others from responding, so you have to put on your best “that was very interesting” face and keep going. That strategy works for the most bizarre answers as well: “Oh wow, that is one thing I hadn’t thought of; anyone else?”

As we progress in our understanding and implementation of NGSS and storylines, it is important to remember that no one best way exists to handle every situation. Every teacher is different; every student is different. Some will get there faster; some will get there slower; some might not even get there until next year.

Constant communication is important. Talk to the other teachers at your grade level, talk to the other teachers on your team, in your department, at a conference. Have any relatives who are teachers? Talk to them, too; I know I do.

Don’t feel like you are in this alone, even if you are the only grades 6–12 science teacher in your tiny rural school, or the only science teacher in your building who has ever had three-dimensional science training. Maybe you are the teacher who just heard of NGSS or the Framework for the first time; that’s okay! Remember to keep moving forward, keep asking your students questions, and keep making your students examine the data to use as evidence when developing arguments to try to discover the “Why?”

Rebecca Schumacher is a sixth-grade science teacher at Hickory Creek Middle School in Frankfort, Illinois, where she is fortunate to work with a great team of teachers. She wants to acknowledge Erin Nemeth, the other sixth-grade science teacher, because their collaborative efforts have made this all so much easier. Schumacher received her master’s degree in science education from Montana State University and is National Board–Certified. She has been working with NGSS since 2014, from writing storylines to facilitating training for other teachers throughout Illinois.

 

 


 

Pairing Literacy and Science to Effectively Teach Argumentation

By Judine Keplar and Carrie Launius

Posted on 2019-09-17

Most elementary teachers have many opportunities to learn best practices in English Language Arts (ELA), but few in science. Three years ago, NSTA past president (2016–17) David Crowther said in his conference keynote, “Of the eight practices of science and engineering, four of them are language intensive and thus require students to use multiple domains of language, including nonverbal modalities, and a range of language registers.” This challenged our thinking about how the Common Core State Standards (CCSS) in ELA taught argumentation to elementary students. Argumentation is not mentioned anywhere until middle school, yet the Science and Engineering Practices (SEPs) require students not only to think about argumentation, but also to begin to practice it as young as kindergarten.

Lee (2017)[1] notes another key difference: in contrast to the CCSS, in which the term argument is withheld until grade 6, the NGSS expect children from as early as kindergarten to construct an argument with evidence. Thus, “the practice of arguing from evidence is expected consistently throughout K–12” (p. 97). Since the NGSS indicate that argumentation can be taught and used in the elementary grades, and that asking our young scientists to engage in argumentation isn’t an impossible task, the question becomes: How?

After examining the NGSS progressions for the SEPs, it is quite evident that argumentation needs to be introduced at the start. For most elementary teachers, ELA is what they know, so we decided to use a book to help students identify argumentation and see if they would be able to make claims about the evidence presented in the book. We knew that integrating ELA and science was going to be a challenge.

One book that helps to illustrate argumentation and collection of evidence is the classic Starry Messenger by Peter Sis (ISBN 978-0374470272). Written for grades 1–6, this book tells the story of Galileo Galilei and how he used the findings of Copernicus to reiterate that the universe as he knew it was heliocentric. Copernicus realized he did not have enough evidence to proclaim his findings, but he took copious notes so that someone else could use them to not only make the claim that the Earth revolves around the Sun, but also have the evidence to prove the claim’s validity.

The book presents three opportunities for students to identify types of arguments. Although we know from history that religion played a major part in the belief that the Earth was the center of the universe, the book only briefly touches on the point, saying, “It’s tradition.” This gives us the first opportunity to discuss arguments. At this point, we ask our students questions to help them begin to identify differences among arguments. For younger students, ask, “Just because it is a tradition, does that make it a fact or true?” Ask them to further explain what they believe. For students in grades 3–5, questions should be more open-ended.

The second opportunity to discuss argumentation comes at the point in the book when Copernicus has gathered many findings, but says he cannot publish them. Ask students, “Why do you think Copernicus is not able to make a claim and argue about his findings? What might he be missing?” Students in grades K–5 will be able to discuss what they think. During the discussion, students will be presenting their own arguments. Ask them to give examples from the book supporting their thinking.

Galileo had evidence to support that the Sun was the center of the solar system, but he was persecuted for his findings. Students have the opportunity to not only argue whether he was treated fairly, but more importantly, why his discoveries were accurate and what evidence he had to support his findings.

Asking high-level questions like “What are the differences among the various arguments presented in the book? Provide evidence to support your claim” and “If you had been born at the time of Galileo, which side would you take, and what evidence would you provide?” allows students to engage in the SEPs at the elementary level.

K–2 grade band

  • Identify arguments that are supported by evidence.
  • Distinguish between opinions and evidence in one’s own explanations.

3–5 grade band

  • Distinguish among facts, reasoned judgment based on research findings, and speculation in an explanation.

By using trade books to engage students in the SEP of engaging in argument from evidence, students will have a plethora of opportunities to use argumentation well before they are asked to do it in an ELA classroom. Christine Royce, NSTA retiring president (2019–2020), offered the thought that “Explicitly modeling for students and providing them opportunities to learn about and compare/contrast the use of evidence is one such opportunity to utilize a cross-discipline practice. The increased connections between experiences in which the students participate assists them in transferring their learning. Finding the overlap between argumentation in the science classroom and literacy studies will provide students opportunities to develop, consider, and respond to ideas in writing, verbally and scientifically.”


Reference

[1] Lee, O. (2017). Common Core State Standards for ELA/Literacy and Next Generation Science Standards: Convergences and discrepancies using argument as an example. Educational Researcher. https://journals.sagepub.com/doi/abs/10.3102/0013189X17699172

Judine Keplar is English Language Arts Curriculum Specialist for St. Louis Public Schools in St. Louis, Missouri, and also serves as the current president of the Board of Education in Belleville Public School District 118 in Belleville, Illinois. She is a passionate promoter of cross-curricular literacy at all grade levels and enjoys her work writing curriculum and providing professional development around all things literacy-related. She resides in Swansea, Illinois, with her husband and two children.

 

 

Carrie Launius is Science Curriculum Specialist for St. Louis Public Schools in St. Louis, Missouri. She previously was the NSTA District XI Director and president of Science Teachers of Missouri (STOM). She believes using trade books to support science learning is essential for students. She was instrumental in developing and implementing the Best STEM Book Award for NSTA-Children’s Book Council. Her passion is supporting teachers and helping them grow professionally. She resides in St. Louis near her two grown children and with her son and four dogs.

 

 


 

Next Gen Navigator

Helping Students to Argue from Evidence

Posted on 2019-09-17

 

Get students hooked on real-world science

By Claire Reinburg

Posted on 2019-09-16

When students see the real-world applications of science, their interest levels soar. Teachers are experts at guiding students through the wonders of how science helps us answer questions and engineering helps us solve problems. This month, stock up on new lessons from NSTA Press books that will spark student interest. You’ll be smiling as they marvel at the everyday applications that take science lessons beyond the classroom.

Explore Natural Hazards

Book cover image of "Natural Hazards"

The just-published STEM Road Map series volume Natural Hazards, Grade 2, will help elementary students learn about the effects of natural hazards on people, communities, and the environment and consider how threats to human safety from natural hazards can be minimized. Download the lesson “Let’s Explore Natural Hazards” and lead your students on an exploration of the types of natural hazards that occur around the world. Your students will learn that natural hazards can be classified as those with weather-related causes and those caused by Earth’s movements. With hurricane and tornado images and information in the news, students are curious about what causes these types of hazards and how their effects can be minimized. Check out all the topics covered in the growing STEM Road Map series, edited by Carla C. Johnson, Janet B. Walton, and Erin Peters-Burton.

Ask Students to Study What Makes a Great Playground

Book cover image of "It's Still Debatable"

At recess time, students can bring their studies of physical science down to earth with an engaging lesson from Sami Kahn’s new book It’s Still Debatable! Using Socioscientific Issues to Develop Scientific Literacy, K–5. Download the lesson “Swingy Thingy: What Makes a Great Playground?” to launch your elementary students on a study of playground swings. They’ll get to model swings as they apply their knowledge of forces to support arguments for safe and enjoyable playground design. Along the way, students develop blueprints and work collaboratively to design a dream playground. Other lessons in this new book prompt students to investigate questions like “Do we need zoos?”, “Which alternative energies are best?”, and “Is football too dangerous for kids?” Explore even more key questions using lessons from the first book in this series, It’s Debatable! Using Socioscientific Issues to Develop Scientific Literacy, K–12, by Dana Zeidler and Sami Kahn.

Explore Phenomena That Sparked Engineering Innovations

Focus on innovations sparked by accidental observations with the 22 easy-to-use investigations in Discovery Engineering in Physical Science: Case Studies for Grades 6–12, by M. Gail Jones, Elysa Corin, Megan Ennes, Emily Cayton, and Gina Childers. Your middle and high school students will use real-world case studies to embark on investigations of actual scientific discoveries that began with observations of phenomena and led to inspired inventions and applications. Download the lesson “A Sticky Situation: Gecko Feet Adhesives” to have your students explore how studying the mystery of geckos’ ability to hang by even one toe from a piece of glass led scientists to make an adhesive that mimics the properties of a gecko foot. Other case studies in the book focus on topics from shark skin and bacteria to the history of the Slinky. Check out the next volume in the series that’s coming soon, Discovery Engineering in Biology: Case Studies for Grades 6–12.

Explore NSTA Press’s Newest Books and Save on Shipping

Through October 31, 2019, get free shipping on your order of $75 or more of NSTA Press or NSTA Kids books when you use promo code SHIP19 in the online Science Store. Stock up on books that cover all grade ranges and span key topics like climate change, engineering, physical science, and life science. Browse the catalog online or view the new and forthcoming books. Offer applies to new purchases made online through the NSTA Science Store between now and October 31, 2019.


Picture-Perfect Science Online Course, Jan 15

A Picture-Perfect Science Online Course includes:
  • 10 hours of live and/or pre-recorded training using Zoom Video Conferencing
  • 3 two-hour sessions with the authors and 2 two-hour sessions with a trained facilitator
  • 1 ebook of choice from either Picture-Perfect Science STEM Lessons K–2 or Picture-Perfect Science STEM Lessons 3–5
  • A digital learning packet containing the first 5 chapters of Picture-Perfect Science Lessons, lessons modeled during the webinars, and relevant articles
  • Graduate credit if purchased separately - information will
 

Launching the PocketLab Voyager

By Edwin P. Christmann

Posted on 2019-09-13

Intro

Exploring motion, light, temperature, altitude, and magnetic fields can be taken to new interactive heights with the PocketLab Voyager by Myriad Sensors. Using a wireless sensor, the PocketLab Voyager records and stores data that can be shared with the free PocketLab app.

Once paired, the app and the PocketLab device work together to support a variety of experiments. The PocketLab Voyager is designed for users as young as fourth grade, yet it is sophisticated enough for engineers, mathematicians, and scientists.

The PocketLab Voyager operates over a wireless Bluetooth 4.0 connection, so users can collect data with their smartphones, tablets, and computers, which makes the device easy to access. Additionally, the PocketLab Voyager can integrate its data with Scratch, Google Drive, and Microsoft Excel for recording, storing, and analyzing measurements.
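The Google Drive and Excel integration is handled inside the PocketLab app itself, but for readers who want to see what exported sensor data looks like, here is a minimal sketch of writing readings to a CSV file that Excel or Google Sheets can open. The column names and sample values are hypothetical, not the app's actual export format:

```python
import csv

# Hypothetical readings: (timestamp in seconds, z-axis acceleration in m/s^2).
readings = [(0.0, 9.81), (0.1, 9.79), (0.2, 9.84)]

with open("pocketlab_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "accel_z_m_s2"])  # header row for the spreadsheet
    writer.writerows(readings)                   # one row per measurement
```

Once in spreadsheet form, the data can be graphed or combined with other class results the same way any CSV can.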

Image 1: The PocketLab Voyager 

How to Use

Before using the PocketLab Voyager for the first time, make sure the battery is fully charged. To charge the battery, use the orange micro USB cord included with the purchase of the PocketLab Voyager; a full charge takes approximately 60 minutes. Once the device is fully charged, the red light on the front of the device stops blinking. After verifying that the device is charged, users can follow the PocketLab Voyager Getting Started Guide to download the free app and connect the PocketLab sensor to their chosen device. A hard copy of the instructions is included with the purchase of a sensor, or it can be accessed on PocketLab’s website at https://drive.google.com/file/d/1Ds4fzhNVG1RRf4xrKrzQeVW45ktxCto6/view.

We found the instructions easy to navigate, and the device was ready to collect data within 10 to 15 seconds of pairing. After setup with a compatible device, users are prompted to grant access to the camera and microphone. Once this is done, PocketLab can record a video or capture a graph, combining the collected data in real time. The app can also “record up to 30,000 measurements to the on-board memory,” and it lets users toggle between sensors, change the points/second setting, move between units of measurement, and compare up to three sensors at a time.
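As a rough back-of-the-envelope check, the 30,000-measurement buffer translates to a recording time that depends on the points/second setting. This sketch assumes a single sensor stream; the sampling rates shown are illustrative examples, not an exhaustive list of the app's settings:

```python
# How long the on-board buffer lasts at a given sampling rate.
# 30,000 measurements is the figure quoted above.
BUFFER_SIZE = 30_000  # measurements

def recording_minutes(points_per_second):
    """Minutes of data the buffer holds at this sampling rate."""
    return BUFFER_SIZE / points_per_second / 60

for rate in (1, 10, 50):
    print(f"{rate:>2} points/s -> {recording_minutes(rate):g} minutes")
```

So a slow 1 point/second logging run can span a whole school day, while a fast 50 points/second motion capture fills the buffer in about ten minutes.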

Video 1: Getting Started with the PocketLab Mobile App

Once comfortable with the app and its settings, it is time to begin experimenting! With every purchase, PocketLab includes a series of nine getting-started activity cards covering topics such as barometric pressure, the gyroscope, and velocity. These activity cards outline how the PocketLab sensor can help users record meaningful experimental data. For assistance, users can always refer to the instruction manual tab at https://www.thepocketlab.com/educators/resources.

Image 2: PocketLab Voyager Instruction Manuals

The Rangefinder

An interesting feature of the PocketLab Voyager is the Rangefinder, which gathers data through infrared technology. To use it, open the PocketLab app and pair your device with the sensor; once paired, turn on the Rangefinder to display the position and velocity graph.

As the PocketLab Voyager moves, the infrared sensor calculates the Voyager’s “distance from the surface which is then used to calculate its velocity.” The PocketLab device can also be attached to a cart (purchased separately), which enables the user to calculate kinetic energy, explore energy transfer, and understand how changes in momentum affect the results of the study. Check out the tutorial below for more details!
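The velocity calculation the Rangefinder performs internally can be sketched by hand: take successive position readings and divide the change in position by the time between samples, then use the result for kinetic energy. The sample positions, time step, and cart mass below are made-up illustration values, not PocketLab output.

```python
# Sketch: estimating velocity from position readings by finite differences,
# then kinetic energy for a cart. All numbers here are hypothetical.
positions = [0.00, 0.05, 0.12, 0.21, 0.32]  # meters from the surface
dt = 0.1                                     # seconds between readings

# v = (change in position) / (change in time) for each pair of readings
velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]

cart_mass = 0.25  # kg, hypothetical cart
# KE = 1/2 * m * v^2 for each velocity estimate
kinetic = [0.5 * cart_mass * v ** 2 for v in velocities]

print("velocities (m/s):", [round(v, 2) for v in velocities])
print("kinetic energy (J):", [round(k, 4) for k in kinetic])
```

Comparing the hand-computed values against the app’s velocity graph is itself a nice classroom exercise in how sensors turn raw position data into derived quantities.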

Video 2: Using PocketLab Voyager’s Rangefinder

What’s Included

· 1 PocketLab Voyager

· 1 Protective Carrying Case

· 1 Set of Getting Started Activity Cards

· 100+ Lessons and Activities

· Micro USB Charging Cable

What Needs to Be Purchased Separately

· Temperature Probe ($9)

· Tactile Pressure Sensor ($27)

· Silicone Protective Case ($10)

· Classroom Set Case and Charger ($58)

· Five Port USB Charger ($19)

Classroom Uses

For planning units and lessons, the PocketLab website provides a myriad of resources for educators to use in their classrooms. Educators will quickly find that Myriad Sensors offers four different sensors. In addition, various classroom kits, sensor accessories, and advanced STEM kits are available for use in conjunction with any PocketLab device.

Teachers also have access to a lesson plan directory listing hundreds of lessons for students of all ages. Additionally, educators can post their own ideas and classroom experiences with their PocketLab devices on a community blog which connects educators from around the globe.

Check out the links below for immediate access to hundreds of classroom resources!

· https://www.thepocketlab.com/educators/lesson-plan-directory

· https://www.thepocketlab.com/educators

Tips for Getting Started

Before working with a new PocketLab device, be sure to check out the PocketLab website and review the plethora of resources available. Once there, you will see multiple video resources that will help you get the most out of your sensor. Follow this link to gain access to PocketLab’s how-to videos: https://www.thepocketlab.com/educators/resources.

From our experience, we found the PocketLab Voyager to be an ideal tool for engaging students, transforming the way they view science into an interactive means of observing the world around them.

Specifications

· Wireless Connection: Bluetooth 4.0

· Battery: Rechargeable via micro USB

· Battery Life: 8 hours (wireless, full data rate) 12 hours (low power, logging mode)

· Wireless Range: 250 feet line-of-sight

· Memory: 30,000 data readings

· Durability: 2 m (6 ft) drop protection

· Dimensions: 3.8 x 3.8 x 1.5 cm (1.5 x 1.5 x 0.6 in)

· Weight: 17 g (0.6 oz)

* For specific sensor specifications, check out the following link! https://www.thepocketlab.com/specs

Cost

$148.00

About the Authors

Edwin P. Christmann is a professor and chairman of the secondary education department and graduate coordinator of the mathematics and science teaching program at Slippery Rock University in Slippery Rock, Pennsylvania. Marie Ellis is a graduate student at Slippery Rock University in Slippery Rock, Pennsylvania.

Intro

Exploring motion, light, temperature, altitude, and magnetic fields can be taken to new interactive heights with the PocketLab Voyager by Myriad Sensors. Using a wireless sensor, the PocketLab Voyager records and stores data that can be shared through the free PocketLab app.

 

Labs in Life Sciences

By Gabe Kraljevic

Posted on 2019-09-13

I am a preservice biology teacher and was hoping to get some insight on labs. What are some of your favorite labs that you have done with your class and what made them a success? How do you typically assess labs?

—D., Virginia

I believe biology becomes much more intriguing for students by incorporating as many hands-on activities as possible.

Not all labs require formal lab reports or quantitative analysis. I feel it is good to vary your assessments to suit the activity. Quick observation labs can be assessed with worksheets. Long-term projects are well-suited to journaling. Paragraph answers foster higher-order thinking skills, promote literacy, and challenge students to reach conclusions and connect topics. Presentations are good assessments.

My favorite hands-on biology activities are:

DNA extraction: Isolate DNA from crushed strawberries or other fruit. Students are amazed at how much DNA can be extracted.

Pop bottle ecosystems: Students are motivated to observe the terraria they created themselves. Peruse my collection in the Learning Center at http://bit.ly/PopBottleEcosystems

Observing living organisms: Start seeds to explore life-cycles, requirements for growth, tropisms, and more. Insects, worms, and pond aquariums are fun. (I discussed this in more detail previously. You can read the blog post at http://bit.ly/2KTDA22.)

Dissections
Dissections can be fascinating and useful for discussing the ethical, scientific use of animals. Molluscs, fish, and crabs can be purchased from grocery stores. Fetal pigs can be particularly impactful. Don’t forget to include plants, flowers, fruits and vegetables. (Check with your department head or administration and be aware of cultural issues.)

Ecological studies
Students can conduct field surveys of the school yard with small quadrats made from soda-straws. Run transects, identify species, estimate biomass, and write reports on what was discovered.
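The quadrat arithmetic behind such a survey is simple enough to sketch, and older students can check their field estimates against it. All counts and areas below are hypothetical illustration values.

```python
# Sketch of estimating population density from quadrat counts, the standard
# field-survey arithmetic. Counts and areas here are hypothetical.
quadrat_counts = [4, 7, 3, 5, 6]   # dandelions counted in each quadrat
quadrat_area = 0.25                # m^2 per quadrat (e.g., 50 cm x 50 cm)
survey_area = 120.0                # m^2, total school-yard plot

# Mean count per quadrat, scaled to density per square meter,
# then extrapolated to the whole surveyed area.
mean_per_quadrat = sum(quadrat_counts) / len(quadrat_counts)
density = mean_per_quadrat / quadrat_area          # individuals per m^2
estimated_total = density * survey_area

print(f"Density: {density:.1f} per m^2; estimated total: {estimated_total:.0f}")
```

Having students compare estimates from different numbers of quadrats also opens a good discussion about sampling error.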

Hope this helps!

Image by OpenClipart-Vectors from Pixabay


 

Feature

Evaluating Nature Museum Field Trip Workshops, an Out-of-School STEM Education Program

Using a Participatory Approach to Engage Museum Educators in Evaluation for Program Development and Professional Reflection

Connected Science Learning July-September 2019 (Volume 1, Issue 11)

By Caroline Freitag and Melissa Siska

Out-of-school learning opportunities for students in science, technology, engineering, and mathematics (STEM) represent an important context in which students learn about and do STEM. Studies have shown that in- and out-of-school experiences work synergistically to impact learning. New education guidelines, particularly the Next Generation Science Standards (NGSS), place a significant focus on increasing students’ competence and confidence in STEM fields, and note that “STEM in out-of-school programs can be an important lever for implementing comprehensive and lasting improvements in STEM education” (NRC 2015, p. 8).

Figure 1

A museum educator holds a seed specimen for students to observe during A Seed’s Journey. In this workshop, students discover different methods of seed dispersal and participate in an investigation to determine the seed dispersal methods employed by a variety of local plants.

 

Effective out-of-school STEM programs meet specific criteria, such as engaging young people intellectually, socially, and emotionally; responding to young people’s interests, experiences, and culture; and connecting STEM learning across contexts. Effective programs also employ evaluation as an important strategy for program improvement and understanding how a program promotes STEM learning (NRC 2015). In the past decade, the National Research Council (2010) and the National Science Foundation (Friedman 2008) have published materials citing the importance of rigorous evaluation and outlining strategies for implementing evaluations of informal science education programs. The Center for Advancement of Informal Science Education specifically targeted their 2011 guide toward program leadership to empower them to manage evaluations and work with professional evaluators.

In this article, a team of collaborators from the education department of the Chicago Academy of Sciences/Peggy Notebaert Nature Museum (CAS/PNNM) share the outcomes of a recent evaluation of Field Trip Workshops, a STEM education program in which school groups visit the Nature Museum. The purpose of the evaluation was to collect data about the program’s impacts and audience experiences as a foundation for an iterative process of reflection and program development. We hope this case study provides ideas for other institutions.

Field Trip Workshops program

School field trips to cultural institutions provide rich opportunities for out-of-classroom learning. At CAS/PNNM, school groups can register for Field Trip Workshops to enhance their museum visit. During the 2017–2018 academic year, educators at the Nature Museum taught 900 Field Trip Workshops totaling over 23,500 instructional contact hours with classroom teachers and students. Field Trip Workshops account for approximately one-fifth of the total contact hours between the education department and Chicago schools and communities, making this program the department’s largest.

Program structure

Field Trip Workshops are one-time, 45-minute programs taught by the museum’s full-time, trained educators, who use museum resources to enrich student learning. These hands-on, inquiry-based programs engage students with the museum’s collections and specially cultivated habitats on the museum grounds. Workshops connect to NGSS science content and practices while remaining true to the museum’s mission: to create positive relationships between people and nature through local connections to the natural world. Field Trip Workshops are designed for students in prekindergarten through 12th grade, with both indoor and outdoor workshops offered at each level. These workshops are an opportunity for students to learn science content, engage in science practices, and create personal connections to local nature through interactions with museum resources. Workshops highlight the museum’s living animals, preserved specimens, and outdoor habitats, and often discuss the conservation work done by the museum’s scientists.

All workshops emphasize students’ interactions with the museum’s living and preserved collections or outdoor habitats. Workshops are offered for different grade bands and correlate to the Next Generation Science Standards’ disciplinary core ideas (DCIs) and science and engineering practices (SEPs).

Education team

The education staff of the Nature Museum come from a variety of professional backgrounds. Several members hold advanced degrees in science, instruction, or museum education. Staff have experience in informal education settings and/or are former classroom teachers. Their diverse training and skills create a team with science content and pedagogical expertise.

Figure 2

Field Trip Workshop students in Exploring Butterfly Conservation inspect a case of mounted butterfly specimens while discussing their prior knowledge of common Illinois butterflies. Students are then introduced to two rare butterflies for deeper investigation.

 

Planning the evaluation

Setting the stage

Basic evaluation is a regular part of growing and developing the Field Trip Workshops program. Museum educators participate in team reflections, revise curriculum, and review workshop alignment with overall program goals on an ongoing basis. Because Field Trip Workshops are a fee-based program without a grant budget or a funder to report to, there had never been sufficient time or resources to support a comprehensive, mixed-methods evaluation. Still, the desire to more fully evaluate the program was present in the minds of the Nature Museum’s education department leadership and program staff. Thus, the in-depth evaluation of the Field Trip Workshops program reported here was an unprecedented opportunity.

In 2014 the Nature Museum developed and implemented an organization-wide 10-year plan. Evaluation of all programs is a key component of the education department’s plan. During a retreat in spring 2017, the staff reflected on their department’s work in achieving these goals and agreed that Field Trip Workshops needed detailed evaluation to grow and refine the program. Team members observed that, although staff worked diligently to align each workshop with the program’s goals, they had never obtained large-scale feedback from program participants to see whether those goals were truly being met.

An evaluation opportunity and its challenges

In the spring of 2019, a new museum volunteer with experience in research expressed an interest in out-of-school STEM experiences and learning more about the museum’s education programs. After careful consideration, education department leadership and the student programs manager chose to seize the opportunity, and an evaluation team including the student programs manager, the volunteer, and two members of department leadership was formed.

The education staff was interested in contributing to an evaluation, but it was clear that staff time would be limited. Thus, the team realized that achievable evaluation objectives (i.e., choosing only the most popular workshops to evaluate rather than the entire suite, setting a six-week data collection time limit) and staff buy-in (i.e., focusing on staff questions/priorities) would be required for the evaluation to succeed. The evaluation team kept a careful eye on the evaluation’s impact on education staff, asking them to be upfront about any concerns (e.g., disruptions to the normal schedule or difficulties with the evaluation/evaluator) and standing ready to make adjustments as necessary (e.g., rescope or end the evaluation).

Collaborating with experienced volunteers to facilitate the evaluation

Because a museum volunteer served as the primary evaluation facilitator, the program evaluation was greater in scope than would otherwise have been possible while imposing minimally on staff members and programming. The volunteer had research experience in biology and psychology but was making a career change toward STEM education, and was therefore particularly interested in learning more about informal education and program evaluation. The collaboration gave the volunteer an opportunity to learn and develop new skills, while the museum benefited from the volunteer’s research experience.

Practical consideration: Finding volunteers

Finding volunteers with enough background to support evaluation can be challenging, and all volunteers’ skills, interests, and outlooks should fit the organization and the project. Volunteers may want to contribute to evaluation efforts due to interest in the project, support for the institution or program, or to gain experience. For example, graduate students or senior undergraduates may be interested in facilitating evaluation or collaborating on a larger project. Additionally, retired professionals or part-time workers may have the necessary expertise and flexibility to donate their time to a project. Depending on background, skills, and time commitment, a volunteer may be able to help recruit participants, conduct interviews, distribute surveys, follow up with participants, and enter and visualize data. Volunteers may also help with tasks such as keeping records, scheduling meetings, and providing support for staff members.

Choosing a participatory evaluation approach

Participatory evaluation is defined as a collaboration between evaluators and those who are responsible for a program and will use the results of the evaluation. Participatory evaluation involves these people in the fundamental design, implementation, and interpretation of the evaluation. While the trained evaluator facilitates the project, the evaluation is conducted together with program staff (Cousins and Earl 1992). Participatory evaluation’s theoretical underpinnings rely on the idea that evaluation is only effective if results are used (Brisolara 1998). Utilization-focused evaluation further argues that evaluation should be useful to its intended users (Patton 1997), and usefulness can be limited if the results are not suited to those users (e.g., evaluation questions are not relevant or reports are not tailored to the audience). Participatory evaluation seeks to encourage use of evaluation results by involving those who are most interested in the outcomes in conducting the evaluation itself.

The Nature Museum’s education department was the primary intended user of this evaluation, as they implement the program and play a vital role in its development. The intent of the evaluation was to generate information that would aid in improving the program. Although this evaluation project did not use a professional evaluator, the volunteer’s research background was useful in designing the study. Education department leadership was also familiar with evaluation basics from collaborating with and training under the research and evaluation consultancy that conducts evaluations for the department’s grant-funded programs.

The evaluation team took inspiration from the participatory evaluation model specifically because it was a good fit for the education department culture. Museum educators already contribute to defining directions for programs and recommend curricular improvements. Thus, it made sense for these educators to contribute their insights to evaluation as well. Most importantly, educators’ involvement in evaluation gave them information about their highest priorities: identifying patterns of success, challenges in program implementation, and key characteristics of effective interactions with students.

Participatory approach: Museum educators’ experiences were an evaluation focus

At the start of the evaluation, the student programs manager held a meeting for all museum educators to discuss priorities for evaluating Field Trip Workshops. Together, they identified questions about student experiences and questions to ask of their classroom teachers. The educators also described factors that made them “feel like I taught a good workshop” and those that made a workshop feel less successful.

The evaluation team realized educator viewpoints would provide a critical third perspective alongside student and classroom teacher experiences, and that key insights might lie in the similarities and differences between how museum educators, students, and their classroom teachers perceived the same workshop.

Figure 3

Students gather at the sides of the walking path in the prairie to make scientific drawings of their observations of the habitat, while a museum educator discusses local plants and animals.

 
Participatory approach: Museum educators’ questions steered the evaluation direction

After the initial staff meeting, the evaluation team reviewed the educators’ evaluation questions and comments. They then organized the questions, reviewed possible methods for answering the questions, and considered the practicality of collecting data. Ultimately, the questions of highest priority that also met the practical limitations of data collection were selected. The remaining questions were set aside for a future opportunity.

The education staff and evaluation team articulated three evaluation questions:

  1. To what extent were classroom teachers and museum educators satisfied with their Field Trip Workshop, and to what degree do they think student needs were met?
  2. In what ways do Field Trip Workshops impact students during and after their experience, specifically (a) student engagement with and attitudes toward science and nature, (b) their science content and practice learning, and (c) the types of personal connections students made to science and nature (e.g., “I have a pet turtle like that!” or “We’re raising butterflies in class”)?
  3. In what ways do Field Trip Workshops connect to in-school science (e.g., the school science curriculum, the workshop’s relevance to in-school work, similarities and differences in the types of learning experiences in the workshop compared to classroom)?

Conducting the evaluation

There were several practical considerations that affected the scope of the evaluation plan.

Practical considerations: Museum educators’ schedule impacted the evaluation scope

The evaluation team decided to include only the five most popular Field Trip Workshops in the evaluation. These workshop topics are taught frequently enough that sufficient data could be collected: 10–12 workshop sessions in each topic during the six-week data collection period, which avoided the complication of carrying the project over into the next academic year. In total, the evaluation included 13 museum educators, approximately 60 classroom teachers, and about 1,500 students.

The evaluation team chose data collection methods to fit not only the evaluation questions, but also the program schedule, staff responsibilities, and evaluation time frame. First, the team decided to have the evaluation volunteer sit in and observe workshops. Together, the evaluation team developed an observation protocol so that the volunteer could take notes about student learning and record student comments about personal connections to science and nature.

Then, the team wrote a short series of follow-up questions for museum educators to answer after each observed workshop. These included rating and free-response questions about their satisfaction, perception of student learning, and personal connections students shared with staff.

Practical considerations: Museum educators’ responsibilities affected the evaluation plan

The 45-minute workshops are scheduled back-to-back with a 15-minute “passing period,” during which museum educators reset classrooms for the next workshop. In the end, it was decided that the volunteer would ask educators follow-up questions in an interview. This way, educators could prepare for their next group and participate in the evaluation. After the interview, the volunteer helped educators prepare for the next workshop.

Finally, the evaluation team developed a follow-up survey for classroom teachers. To streamline the process, the team put together clipboards with a brief request-for-feedback statement and printed sign-up sheets for classroom teacher contact information for education staff to use when approaching teachers. The evaluation team made sure that clipboards were prepared each day, rather than asking museum educators to organize survey materials in addition to their regular workshop preparation.

Program evaluation findings

The Field Trip Workshop evaluation explored three questions (listed above) by collecting data from students, their classroom teachers, and museum educators.

Overall, classroom teachers reported they were “very satisfied” (75%) with their workshops, largely because they thought the workshops and the museum educators who led them did a good job of meeting students’ needs. Classroom teachers attributed their sense of satisfaction to their assessment of the staff’s teaching, the workshop curriculum, students’ learning and engagement, and their overall experience at the Nature Museum.

Museum educators were usually less satisfied than classroom teachers. Staff linked their satisfaction to perception of student experience, student learning outcomes, and reflections on lesson implementation. Staff often attributed their modest satisfaction to perceived imperfections in implementation and to a high standard for student experience and learning. Positive teacher feedback encouraged museum educators to view their own work in an optimistic light, and they found it valuable to learn more about the factors underlying teacher satisfaction.

The second evaluation question asked in what ways students were impacted by their workshop, both on the day itself and at school in the days and weeks following. Nearly three-quarters of classroom teachers reported that their students’ engagement was “very high” during the workshop, and teacher feedback, staff reports, and the volunteer’s observations all noted that engaging with the museum’s collections and habitats was a novel type of learning experience for many students.

After their Field Trip Workshop, teachers observed that in class and during free time, their students demonstrated improved awareness of living things and increased interest and curiosity. Classroom teachers also reported that student attitudes toward science and nature improved, a view shared by most museum educators. The education staff credited this change to reduced anxiety with living things/the outdoors, increased connection to living things, and growing confidence in new environments. The education team found it validating to see that improvements in student attitudes toward science and nature were attributed to object-based learning with the museum’s unique resources.

The evaluation also explored workshop impacts on student learning and practicing of science. Although classroom teachers said their students typically learned the science content and practices explored in their workshop “excellently” or “well,” museum staff held a slightly different view. Museum educators reported reduced student learning in many Bugs Alive! workshops and some outdoor workshops (Habitat Explorers, Ecology Rangers). A deeper exploration of these workshops revealed (1) the difficulties staff experienced in promoting effective chaperone behavior and (2) the challenges of teaching outdoors under variable conditions. These findings served as the basis for immediate reflection, group discussion, and program improvements.

The evaluation also asked classroom teachers and museum education staff what kinds of personal connections to nature and science students made during their workshops. Students made many personal connections as evidenced by their comments to classroom teachers, classmates, museum educators, and the evaluation volunteer. Analysis showed that students make a variety of internal and external connections to science and nature through workshops. Internal connections are those that link workshop experiences to students’ internal worlds, including prior emotional and sensory experiences or things the student knows and can do (such as: “I’m thinking of being a zoologist when I grow up” “It [wild bergamot] smells like tea!” “I had a goose. It got caught, and we rescued it”). External connections are those that students make between the workshop and people, places, and things (such as: “We had a lizard and two turtles in our class” “My group [project] at school is [on] woodlands” “I have one of those [flowering] trees, it’s similar, at my house”). Although museum educators occasionally hear personal connections from students, they were delighted by the variety and depth of the student statements captured by the volunteer and reported in the evaluation.

The last question focused on connections between in-school science and Field Trip Workshops. Classroom teachers reported contextualizing their field trip within their current science curriculum (unit kickoff, in-unit, unit-capping activity) most of the time. Only one in six workshops was a stand-alone experience (i.e., not explicitly connected to the science curriculum). Teachers also said they valued their workshop experience because it differed from classroom work by using resources they usually could not access: living things, preserved specimens, and outdoor habitats. Further, classroom teachers were glad their students were engaged in doing science: exploring, observing, and recording. Teachers were also pleased with workshop elements’ similarities to the classroom, including familiar teaching methods used by the trained museum educators. The education staff was happy that their prior anecdotal observations regarding close connections between their workshops and what students experience in school were confirmed by evidence.

Figure 4

On the Nature Walk during a stormy day, Habitat Explorers students make and record observations of the woodland habitat with guidance from a museum educator. Students take these “nature journals” home to share or to school for further development.

 

Engaging with the evaluation results to make recommendations for change

After data collection was complete, the evaluation volunteer entered and organized data for the evaluation team and education staff to review. The student programs manager and department leadership decided that the best strategy for engaging with the evaluation data was to discuss results and recommendations with museum educators over a series of meetings. An unintended benefit to this was that the evaluation stayed in the minds of leadership and staff, and continues to be a guide for program growth.

The first evaluation discussion began with an overview of findings presented by the evaluation team. After the “big-picture” presentation, the entire group discussed the results the educators found most interesting and questions about the evaluation outcomes, and began to articulate recommendations for program changes and professional development.

Participatory approach: Museum educators’ views shaped the evaluation recommendations and changes in program implementation

About a month later, at the fall education department meeting, the student programs manager chose two topics for deeper consideration. In the earlier meeting, museum educators focused on challenges with chaperone behavior in workshops. With the evaluation volunteer’s help, the student programs manager assembled observation and free-response data about effective and ineffective chaperone behavior. During the meeting, educators reviewed this evidence, discussed classroom management, and brainstormed ways to address chaperone challenges. Afterward, the student programs manager selected recommendations for managing chaperones and included those in the Field Trip Workshops lesson plans.

The education team also proposed a deeper dive into the challenges of teaching outdoor workshops, such as the variable conditions introduced by inclement weather or seasonal changes to habitats. In fact, evaluation findings showed a greater variability of experience in outdoor workshops compared to indoor ones. Together with the evaluation volunteer, the student programs manager compiled rating, observation, and free-response data from Habitat Explorers and Ecology Rangers. The education staff discussed the data and their experiences teaching outdoors. As a result, museum educators and student programs manager updated the outdoor workshops curriculum to include options for teaching in inclement weather and specific guidance about habitat content to observe and teach in different seasons.

Figure 5

Field Trip Workshop students use magnifying glasses to examine a turtle and record their observations in Animals Up Close.

 

Museum educators reflect on the impact of evaluation on their practice

As the evaluation drew to a close, the team asked museum educators to comment on their experiences participating in the evaluation and the ways it impacted their thinking and professional practice. Two educators’ insights are presented below.

Example 1

One museum educator, a two-year member of the education team with an M.A. in curriculum and instruction and an endorsement in museum education, wrote:

Having the opportunity to reflect directly following my implementation of a Field Trip Workshop with the evaluator was enriching to my teaching practice and allowed me to feel a sense of measurable growth. Especially when there was the opportunity to compare and contrast my various implementations of the same program, I felt growth in that I could identify goals with the evaluator and immediately assess if I had reached those goals.

For example, after teaching the workshop Metamorphosing Monarchs, I shared with the [evaluation volunteer] that in the next workshop, I wanted to limit my narration of a life cycle video to about five minutes in order to increase student engagement and [allow] more time for exploration of specimens. Following the completion of the next workshop, we were able to directly assess my progress towards that goal and more holistically discuss larger goals for my teaching practice.

Example 2

Another educator, a team member of over three years with a background in art history and an M.A. in museum education, explained:

I really enjoyed the overall experience with the evaluation and have found the results to be extremely helpful in informing my overall practice. [The evaluation volunteer] always made the process extremely easy for us—even during some of our busiest programming times. As an extremely self-reflective person, it was nice to spend a few minutes after each lesson that [the volunteer] observed reflecting on it … It allowed me to critically reflect on the lesson immediately after the program concluded while my thoughts were still fresh in my mind. I also feel like I benefited from rating each lesson on a scale since it pushed me to view each lesson [in a larger context], rather than getting lost in specific or smaller details or moments within [a workshop].

And now that the results, as it were, are in, it’s been especially helpful in informing my practice. I’ve found the direct feedback from [classroom] teachers, in particular, helpful as we often have very little, if any, direct contact with these teachers after the program is over. To be able to step back and see the big picture of our programs’ impact on our audiences has been very impactful and has made me begin to really think about how to help mindfully train our new hires with this all in mind.

Insights

This evaluation project was beneficial to the program for several reasons. First, the feedback received from the evaluation demonstrated how program goals were being met from a participant perspective. It also illuminated several areas where the team felt the experiences of both participants (students and teachers) and educators could be improved. These areas aligned with changes already on the minds of department leaders, provided additional evidence for specific improvements, and fostered support for those changes.

A valuable, if unintended, outcome of this evaluation’s participatory approach was the opportunity for staff to reflect on their teaching practice with the volunteer during data collection. Many team members found it helpful to reflect on their instruction immediately after a workshop and then make adjustments in the workshop that followed. This not only helped the staff holistically improve their teaching practice, but it also helped them become better supports for each other and more effective mentors to new team members.

Figure 6

As the rain clears, a group of Field Trip Workshop students make their way onto the fishing pier to observe the living and nonliving things in the wetland habitat.

 

Conclusions

Evaluation can be a powerful and important tool for building and maintaining effective out-of-school STEM programs. Evaluations need the support of both leadership and staff to be useful, but when done well they can yield insights and promote positive change for both programs and staff. Including staff in an evaluation—taking advantage of their knowledge of the program, instincts, questions, and contributions—yields a stronger project and a positive evaluation experience. Further, involving education staff in the design and direction of the evaluation means they place a higher value on the results.

Evaluation is often not a final destination, but rather a process of evolution for programs and their staff. Thus, creating positive evaluation experiences can help build a culture of evaluation. This is key to creating and sustaining excellent out-of-school STEM experiences.

 

Caroline Freitag (caroline.freitag@northwestern.edu) is an evaluation specialist at Northwestern University in Evanston, Illinois. Melissa Siska (msiska@naturemuseum.org) is the student programs manager at the Chicago Academy of Sciences/Peggy Notebaert Nature Museum in Chicago, Illinois.


Resources

Clavijo, K., M.L. Fleming, E.F. Hoermann, S.A. Toal, and K. Johnson. 2005. Evaluation use in nonformal education settings. New Directions for Evaluation (108): 47–55.

Hennigar Shue, J. 1982. Teaching yourself to teach with objects. Journal of Education 7 (4): 8–15.

Paris, S.G., and S.E. Hapgood. 2002. Children learning with objects in informal learning environments. In Perspectives on object-centered learning in museums, ed. S.G. Paris, 37–54. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.

Learn how you can include museum educators in their own program evaluations to increase staff buy-in and engagement.