
Next Gen Navigator

Helping Students to Argue from Evidence

Posted on 2019-09-17

 

Get students hooked on real-world science

By Claire Reinburg

Posted on 2019-09-16

When students see the real-world applications of science, their interest levels soar. Teachers are experts at guiding students through the wonders of how science helps us answer questions and engineering helps us solve problems. This month, stock up on new lessons from NSTA Press books that will spark student interest. You’ll be smiling as they marvel at the everyday applications that take science lessons beyond the classroom.

Explore Natural Hazards

Book cover image of "Natural Hazards"

The just-published STEM Road Map series volume Natural Hazards, Grade 2, will help elementary students learn about the effects of natural hazards on people, communities, and the environment and consider how threats to human safety from natural hazards can be minimized. Download the lesson “Let’s Explore Natural Hazards” and lead your students on an exploration of the types of natural hazards that occur around the world. Your students will learn that natural hazards can be classified as those with weather-related causes and those caused by Earth’s movements. With hurricane and tornado images and information in the news, students are curious about what causes these types of hazards and how their effects can be minimized. Check out all the topics covered in the growing STEM Road Map series, edited by Carla C. Johnson, Janet B. Walton, and Erin Peters-Burton.

Ask Students to Study What Makes a Great Playground

Book cover image of "It's Still Debatable"

At recess time, students can bring their studies of physical science down to earth with an engaging lesson from Sami Kahn’s new book It’s Still Debatable! Using Socioscientific Issues to Develop Scientific Literacy, K–5. Download the lesson “Swingy Thingy: What Makes a Great Playground?” to launch your elementary students on a study of playground swings. They’ll get to model swings as they apply their knowledge of forces to support arguments for safe and enjoyable playground design. Along the way, students develop blueprints and work collaboratively to design a dream playground. Other lessons in this new book prompt students to investigate questions like “Do we need zoos?”, “Which alternative energies are best?”, and “Is football too dangerous for kids?” Explore even more key questions using lessons from the first book in this series, It’s Debatable! Using Socioscientific Issues to Develop Scientific Literacy, K–12, by Dana Zeidler and Sami Kahn.

Explore Phenomena That Sparked Engineering Innovations

Focus on innovations sparked by accidental observations with the 22 easy-to-use investigations in Discovery Engineering in Physical Science: Case Studies for Grades 6–12, by M. Gail Jones, Elysa Corin, Megan Ennes, Emily Cayton, and Gina Childers. Your middle and high school students will use real-world case studies to embark on investigations of actual scientific discoveries that began with observations of phenomena and led to inspired inventions and applications. Download the lesson “A Sticky Situation: Gecko Feet Adhesives” to have your students explore how studying the mystery of geckos’ ability to hang by even one toe from a piece of glass led scientists to make an adhesive that mimics the properties of a gecko foot. Other case studies in the book focus on topics from shark skin and bacteria to the history of the Slinky. Check out the next volume in the series that’s coming soon, Discovery Engineering in Biology: Case Studies for Grades 6–12.

Explore NSTA Press’s Newest Books and Save on Shipping

Through October 31, 2019, get free shipping on your order of $75 or more of NSTA Press or NSTA Kids books when you use promo code SHIP19 in the online Science Store. Stock up on books that cover all grade ranges and span key topics like climate change, engineering, physical science, and life science. Browse the catalog online or view the new and forthcoming books. Offer applies to new purchases made online through the NSTA Science Store between now and October 31, 2019.


Picture-Perfect Science Online Course, Jan 15

A Picture-Perfect Science Online Course includes:
  • 10 hours of live and/or pre-recorded training using Zoom Video Conferencing
  • 3 two-hour sessions with the authors and 2 two-hour sessions with a trained facilitator
  • 1 ebook of choice from either Picture-Perfect Science STEM Lessons K–2 or Picture-Perfect Science STEM Lessons 3–5
  • A digital learning packet containing the first 5 chapters of Picture-Perfect Science Lessons, lessons modeled during the webinars, and relevant articles
  • Graduate credit if purchased separately - information will
 

Launching the PocketLab Voyager

By Edwin P. Christmann

Posted on 2019-09-13

Intro

The PocketLab Voyager by Myriad Sensors takes exploring motion, light, temperature, altitude, and magnetic fields to new interactive heights. Using a wireless sensor, the Voyager records and stores data that can be shared with the free PocketLab app.

Once paired, the app and the PocketLab device work together to support a variety of experiments. The PocketLab Voyager is designed for users as young as fourth grade, yet it is sophisticated enough for engineers, mathematicians, and scientists.

The PocketLab Voyager operates over a wireless Bluetooth 4.0 connection, so users can collect data with their smartphones, tablets, and computers, which makes the device easy to access. The Voyager can also send its data to Scratch, Google Drive, and Microsoft Excel for recording, storage, and further analysis.

Image 1: The PocketLab Voyager 

How to Use

Before using the PocketLab Voyager for the first time, make sure the battery is fully charged. To charge it, use the orange micro USB cord included with the device; a full charge takes approximately 60 minutes, and the red light on the front of the device stops blinking once charging is complete. After verifying that the device is charged, users can follow the PocketLab Voyager Getting Started Guide to download the free app and connect the PocketLab sensor to their chosen device. A hard copy of the instructions is included with the purchase of a sensor, or it can be accessed on PocketLab’s website at https://drive.google.com/file/d/1Ds4fzhNVG1RRf4xrKrzQeVW45ktxCto6/view.

We found that the instructions were easy to navigate, and the device was ready to collect data within 10 to 15 seconds of pairing. After setup with a compatible device, users are prompted to grant access to the camera and microphone. Once this is done, PocketLab can record a video or capture a graph, combining the collected data in real time. The device can also “record up to 30,000 measurements to the on-board memory,” and users can toggle between sensors, change the points/second setting, switch between units of measurement, and compare up to three sensors at a time.
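To give a rough sense of scale, the 30,000-reading on-board memory figure above can be translated into recording time. The sketch below is illustrative only; the sampling rates shown are examples, not an official list of PocketLab settings.

```python
# Rough estimate of how long the Voyager's 30,000-reading on-board
# memory lasts at a given sampling rate. The rates below are
# illustrative examples, not confirmed PocketLab options.
MEMORY_READINGS = 30_000

def recording_minutes(points_per_second):
    """Minutes of logging before the on-board memory fills."""
    return MEMORY_READINGS / points_per_second / 60

for rate in (1, 10, 50):
    print(f"{rate:>3} points/s -> {recording_minutes(rate):.0f} min")
```

At one reading per second the memory holds over eight hours of data, while at 50 points per second it fills in about ten minutes, so the points/second setting is worth matching to the length of the experiment.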

Video 1: Getting Started with the PocketLab Mobile App

Once comfortable with the app and its settings, it’s time to begin experimentation! With every purchase, PocketLab includes a set of nine getting-started activity cards covering topics such as barometric pressure, the gyroscope, and velocity. These cards outline how the PocketLab sensor can help users record meaningful data for experimentation. For assistance, users can always refer to the instruction manual tab located at https://www.thepocketlab.com/educators/resources.

Image 2: PocketLab Voyager Instruction Manuals

The Rangefinder

An interesting feature of the PocketLab Voyager is the Rangefinder, which gathers data through infrared technology. To use it, open the PocketLab app and pair the device with the sensor. Once the devices are paired, users can turn on the Rangefinder to display position and velocity graphs.

As the PocketLab Voyager moves, the infrared sensor measures the Voyager’s “distance from the surface which is then used to calculate its velocity.” With a cart (purchased separately), users can attach the PocketLab device to calculate kinetic energy, explore energy transfer, and understand how changes in momentum affect the results of a study. Check out the tutorial below for more details!
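To make concrete how velocity can be derived from successive distance readings, here is a minimal sketch of the underlying idea, a finite difference between samples. The readings and the 0.1-second sampling interval are hypothetical, not actual PocketLab output.

```python
def velocities(distances, dt):
    """Approximate velocity (m/s) from successive distance samples
    taken dt seconds apart, using a simple finite difference."""
    return [(b - a) / dt for a, b in zip(distances, distances[1:])]

# Hypothetical readings from a cart rolling away from the sensor
# at a steady 0.5 m/s, sampled every 0.1 s:
d = [0.10, 0.15, 0.20, 0.25, 0.30]
print([round(v, 2) for v in velocities(d, 0.1)])  # each step is ~0.5 m/s
```

A real device applies smoothing to noisy readings, but the principle is the same: the change in distance divided by the time between samples gives the speed.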

Video 2: Using PocketLab Voyager’s Rangefinder

What’s Included

· 1 PocketLab Voyager

· 1 Protective Carrying Case

· 1 Set of Getting Started Activity Cards

· 100+ Lessons and Activities

· Micro USB Charging Cable

What Needs to Be Purchased Separately

· Temperature Probe ($9)

· Tactile Pressure Sensor ($27)

· Silicone Protective Case ($10)

· Classroom Set Case and Charger ($58)

· Five Port USB Charger ($19)

Classroom Uses

For planning units and lessons, the PocketLab website provides a myriad of resources for educators to use in their classrooms. Educators will quickly find that Myriad Sensors offers four different sensors. In addition, various classroom kits, sensor accessories, and advanced STEM kits are available for use with any PocketLab device.

Teachers also have access to a lesson plan directory listing hundreds of lessons for students of all ages. Additionally, educators can post their own ideas and classroom experiences with their PocketLab devices on a community blog which connects educators from around the globe.

Check out the links below for immediate access to hundreds of classroom resources!

· https://www.thepocketlab.com/educators/lesson-plan-directory

· https://www.thepocketlab.com/educators

Tips for Getting Started

Before working with a new PocketLab device, be sure to check out the PocketLab website and review the plethora of resources available. Once there, you will see multiple video resources that will help you get the most out of your sensor. Follow this link to gain access to PocketLab’s how-to videos: https://www.thepocketlab.com/educators/resources.

In our experience, the PocketLab Voyager is an ideal tool for engaging students, transforming how they view science by making it an interactive way to observe the world around them.

Specifications

· Wireless Connection: Bluetooth 4.0

· Battery: Rechargeable via micro USB

· Battery Life: 8 hours (wireless, full data rate) 12 hours (low power, logging mode)

· Wireless Range: 250 feet line-of-sight

· Memory: 30,000 data readings

· Durability: 2 m (6 ft) drop protection

· Dimensions: 3.8 x 3.8 x 1.5 cm (1.5 x 1.5 x 0.6 in)

· Weight: 17 g (0.6 oz)

* For specific sensor specifications, check out the following link! https://www.thepocketlab.com/specs

Cost

$148.00

About the Authors

Edwin P. Christmann is a professor and chairman of the secondary education department and graduate coordinator of the mathematics and science teaching program at Slippery Rock University in Slippery Rock, Pennsylvania. Marie Ellis is a graduate student at Slippery Rock University in Slippery Rock, Pennsylvania.


 

Labs in Life Sciences

By Gabe Kraljevic

Posted on 2019-09-13

I am a preservice biology teacher and was hoping to get some insight on labs. What are some of your favorite labs that you have done with your class and what made them a success? How do you typically assess labs?

—D., Virginia

I believe biology becomes much more intriguing for students by incorporating as many hands-on activities as possible.

Not all labs require formal lab reports or quantitative analysis. I feel it is good to vary your assessments to suit the activity. Quick observation labs can be assessed with worksheets. Long-term projects are well-suited to journaling. Paragraph answers foster higher-order thinking skills, promote literacy, and challenge students to reach conclusions and connect topics. Presentations are good assessments.

My favorite hands-on biology activities are:

DNA extraction: Isolate DNA from crushed strawberries or other fruit. Students are amazed at how much DNA can be extracted.

Pop bottle ecosystems: Students are motivated to observe the terraria they created themselves. Peruse my collection in the Learning Center at http://bit.ly/PopBottleEcosystems

Observing living organisms: Start seeds to explore life-cycles, requirements for growth, tropisms, and more. Insects, worms, and pond aquariums are fun. (I discussed this in more detail previously. You can read the blog post at http://bit.ly/2KTDA22.)

Dissections
Dissections can be fascinating and useful for discussing the ethical, scientific use of animals. Molluscs, fish, and crabs can be purchased from grocery stores. Fetal pigs can be particularly impactful. Don’t forget to include plants, flowers, fruits and vegetables. (Check with your department head or administration and be aware of cultural issues.)

Ecological studies
Students can conduct field surveys of the school yard with small quadrats made from soda-straws. Run transects, identify species, estimate biomass, and write reports on what was discovered.
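The scaling step in a quadrat survey, from counts in small sample plots to an estimate for the whole school yard, can be sketched as a quick calculation students might do after collecting data. The counts and areas below are made up for illustration.

```python
def estimate_population(counts, quadrat_area_m2, total_area_m2):
    """Scale per-quadrat counts up to a whole study area:
    mean density (individuals per m^2) times total area."""
    mean_density = sum(counts) / len(counts) / quadrat_area_m2
    return mean_density * total_area_m2

# Hypothetical dandelion counts from five 0.25 m^2 straw quadrats
# in a 400 m^2 school yard:
counts = [3, 5, 2, 4, 6]
print(round(estimate_population(counts, 0.25, 400)))  # -> 6400
```

Comparing estimates from different numbers of quadrats is itself a worthwhile discussion about sampling and uncertainty.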

Hope this helps!

Image by OpenClipart-Vectors from Pixabay


 

Feature

Evaluating Nature Museum Field Trip Workshops, an Out-of-School STEM Education Program

Using a Participatory Approach to Engage Museum Educators in Evaluation for Program Development and Professional Reflection

Connected Science Learning July-September 2019 (Volume 1, Issue 11)

By Caroline Freitag and Melissa Siska


Out-of-school learning opportunities for students in science, technology, engineering, and mathematics (STEM) represent an important context in which students learn about and do STEM. Studies have shown that in- and out-of-school experiences work synergistically to impact learning. New education guidelines, particularly the Next Generation Science Standards (NGSS), place a significant focus on increasing students’ competence and confidence in STEM fields, and note that “STEM in out-of-school programs can be an important lever for implementing comprehensive and lasting improvements in STEM education” (NRC 2015, p. 8).

Figure 1

A museum educator holds a seed specimen for students to observe during A Seed’s Journey. In this workshop, students discover different methods of seed dispersal and participate in an investigation to determine the seed dispersal methods employed by a variety of local plants.

 

Effective out-of-school STEM programs meet specific criteria, such as engaging young people intellectually, socially, and emotionally; responding to young people’s interests, experiences, and culture; and connecting STEM learning across contexts. Effective programs also employ evaluation as an important strategy for program improvement and understanding how a program promotes STEM learning (NRC 2015). In the past decade, the National Research Council (2010) and the National Science Foundation (Friedman 2008) have published materials citing the importance of rigorous evaluation and outlining strategies for implementing evaluations of informal science education programs. The Center for Advancement of Informal Science Education specifically targeted their 2011 guide toward program leadership to empower them to manage evaluations and work with professional evaluators.

In this article, a team of collaborators from the education department of the Chicago Academy of Sciences/Peggy Notebaert Nature Museum (CAS/PNNM) share the outcomes of a recent evaluation of Field Trip Workshops, a STEM education program in which school groups visit the Nature Museum. The purpose of the evaluation was to collect data about the program’s impacts and audience experiences as a foundation for an iterative process of reflection and program development. We hope this case study provides ideas for other institutions.

Field Trip Workshops program

School field trips to cultural institutions provide rich opportunities for out-of-classroom learning. At CAS/PNNM, school groups can register for Field Trip Workshops to enhance their museum visit. During the 2017–2018 academic year, educators at the Nature Museum taught 900 Field Trip Workshops totaling over 23,500 instructional contact hours with classroom teachers and students. Field Trip Workshops account for approximately one-fifth of the total contact hours between the education department and Chicago schools and communities, making this program the department’s largest.

Program structure

Field Trip Workshops are one-time, 45-minute programs taught by the museum’s full-time, trained educators, who use museum resources to enrich student learning. These hands-on, inquiry-based programs engage students with the museum’s collections and specially cultivated habitats on the museum grounds. Workshops connect to NGSS science content and practices while remaining true to the museum’s mission: to create positive relationships between people and nature through local connections to the natural world. Field Trip Workshops are designed for students in prekindergarten through 12th grade, with both indoor and outdoor workshops offered at each level. These workshops are an opportunity for students to learn science content, engage in science practices, and create personal connections to local nature through interactions with museum resources. Workshops highlight the museum’s living animals, preserved specimens, and outdoor habitats, and often discuss the conservation work done by the museum’s scientists.

All workshops emphasize students’ interactions with the museum’s living and preserved collections or outdoor habitats. Workshops are offered for different grade bands and correlate to the Next Generation Science Standards’ disciplinary core ideas (DCIs) and science and engineering practices (SEPs).

Education team

The education staff of the Nature Museum come from a variety of professional backgrounds. Several members hold advanced degrees in science, instruction, or museum education. Staff have experience in informal education settings and/or are former classroom teachers. Their diverse training and skills create a team with science content and pedagogical expertise.

Figure 2

Field Trip Workshop students in Exploring Butterfly Conservation inspect a case of mounted butterfly specimens while discussing their prior knowledge of common Illinois butterflies. Students are then introduced to two rare butterflies for deeper investigation.

 

Planning the evaluation

Setting the stage

Basic evaluation is a regular part of growing and developing the Field Trip Workshops program. Museum educators participate in team reflections, revise curriculum, and review workshop alignment with overall program goals on an ongoing basis. Because Field Trip Workshops are a fee-based program without a grant budget or a funder to report to, there had never been sufficient time or resources to support a comprehensive, mixed-methods evaluation. Still, the desire to more fully evaluate the program was present in the minds of the Nature Museum’s education department leadership and program staff. Thus, the in-depth evaluation of the Field Trip Workshops program reported here was an unprecedented opportunity.

In 2014 the Nature Museum developed and implemented an organization-wide 10-year plan. Evaluation of all programs is a key component of the education department’s plan. During a retreat in spring 2017, the staff reflected on their department’s work in achieving these goals and agreed that Field Trip Workshops needed detailed evaluation to grow and refine the program. Team members observed that, although staff worked diligently to align each workshop with the program’s goals, they had never obtained large-scale feedback from program participants to see whether those goals were truly being met.

An evaluation opportunity and its challenges

In the spring of 2019, a new museum volunteer with experience in research expressed an interest in out-of-school STEM experiences and learning more about the museum’s education programs. After careful consideration, education department leadership and the student programs manager chose to seize the opportunity, and an evaluation team including the student programs manager, the volunteer, and two members of department leadership was formed.

The education staff was interested in contributing to an evaluation, but it was clear that staff time would be limited. Thus, the team realized that achievable evaluation objectives (i.e., choosing only the most popular workshops to evaluate rather than the entire suite, setting a six-week data collection time limit) and staff buy-in (i.e., focusing on staff questions/priorities) would be required for the evaluation to succeed. The evaluation team kept a careful eye on the evaluation’s impact on education staff, asking them to be upfront about any concerns (e.g., disruptions to the normal schedule or difficulties with the evaluation/evaluator) and standing ready to make adjustments as necessary (e.g., rescope or end the evaluation).

Collaborating with experienced volunteers to facilitate the evaluation

Because a museum volunteer served as the primary evaluation facilitator, the program evaluation was greater in scope than would otherwise have been possible while imposing minimally on staff members and programming. The volunteer had research experience in biology and psychology but was making a career change toward STEM education, and was particularly interested in learning more about informal education and program evaluation. The collaboration thus gave the volunteer an opportunity to learn and develop new skills, while the museum benefited from the volunteer’s research experience.

Practical consideration: Finding volunteers

Finding volunteers with enough background to support evaluation can be challenging, and all volunteers’ skills, interests, and outlooks should fit the organization and the project. Volunteers may want to contribute to evaluation efforts due to interest in the project, support for the institution or program, or to gain experience. For example, graduate students or senior undergraduates may be interested in facilitating evaluation or collaborating on a larger project. Additionally, retired professionals or part-time workers may have the necessary expertise and flexibility to donate their time to a project. Depending on background, skills, and time commitment, a volunteer may be able to help recruit participants, conduct interviews, distribute surveys, follow up with participants, and enter and visualize data. Volunteers may also help with tasks such as keeping records, scheduling meetings, and providing support for staff members.

Choosing a participatory evaluation approach

Participatory evaluation is defined as a collaboration between evaluators and those who are responsible for a program and will use the results of the evaluation. Participatory evaluation involves these people in the fundamental design, implementation, and interpretation of the evaluation. While the trained evaluator facilitates the project, the evaluation is conducted together with program staff (Cousins and Earl 1992). Participatory evaluation’s theoretical underpinnings rely on the idea that evaluation is only effective if results are used (Brisolara 1998). Utilization-focused evaluation further argues that evaluation should be useful to its intended users (Patton 1997), and usefulness can be limited if the results are not suited to those users (e.g., evaluation questions are not relevant or reports are not tailored to the audience). Participatory evaluation seeks to encourage use of evaluation results by involving those who are most interested in the outcomes in conducting the evaluation itself.

The Nature Museum’s education department was the primary intended user of this evaluation, as they implement the program and play a vital role in its development. The intent of the evaluation was to generate information that would aid in improving the program. Although this evaluation project did not use a professional evaluator, the volunteer’s research background was useful in designing the study. Education department leadership was also familiar with evaluation basics from collaborating with and training under the research and evaluation consultancy that conducts evaluations for the department’s grant-funded programs.

The evaluation team took inspiration from the participatory evaluation model specifically because it was a good fit for the education department culture. Museum educators already contribute to defining directions for programs and recommend curricular improvements. Thus, it made sense for these educators to contribute their insights to evaluation as well. Most importantly, educators’ involvement in evaluation gave them information about their highest priorities: identifying patterns of success, challenges in program implementation, and key characteristics of effective interactions with students.

Participatory approach: Museum educators’ experiences were an evaluation focus

At the start of the evaluation, the student programs manager held a meeting for all museum educators to discuss priorities for evaluating Field Trip Workshops. Together, they identified questions about student experiences and questions to ask of their classroom teachers. The educators also described factors that made them “feel like I taught a good workshop” and those that made a workshop feel less successful.

The evaluation team realized educator viewpoints would provide a critical third perspective alongside student and classroom teacher experiences, and that key insights might lie in the similarities and differences between how museum educators, students, and their classroom teachers perceived the same workshop.

Figure 3

Students gather at the sides of the walking path in the prairie to make scientific drawings of their observations of the habitat, while a museum educator discusses local plants and animals.

 
Participatory approach: Museum educators’ questions steered the evaluation direction

After the initial staff meeting, the evaluation team reviewed the educators’ evaluation questions and comments. They then organized the questions, reviewed possible methods for answering the questions, and considered the practicality of collecting data. Ultimately, the questions of highest priority that also met the practical limitations of data collection were selected. The remaining questions were set aside for a future opportunity.

The education staff and evaluation team articulated three evaluation questions:

  1. To what extent were classroom teachers and museum educators satisfied with their Field Trip Workshop, and to what degree do they think student needs were met?
  2. In what ways do Field Trip Workshops impact students during and after their experience, specifically (a) student engagement with and attitudes toward science and nature, (b) their science content and practice learning, and (c) the types of personal connections students made to science and nature (e.g., “I have a pet turtle like that!” or “We’re raising butterflies in class”)?
  3. In what ways do Field Trip Workshops connect to in-school science (e.g., the school science curriculum, the workshop’s relevance to in-school work, similarities and differences in the types of learning experiences in the workshop compared to classroom)?

Conducting the evaluation

There were several practical considerations that affected the scope of the evaluation plan.

Practical considerations: Museum educators’ schedule impacted the evaluation scope

The evaluation team decided to include only the five most popular Field Trip Workshops in the evaluation. These workshop topics are taught frequently enough that sufficient data could be collected: 10–12 workshop sessions in each topic during the six-week data collection period, which avoided the complication of carrying the project over into the next academic year. In total, the evaluation included 13 museum educators, approximately 60 classroom teachers, and about 1,500 students.

The evaluation team chose data collection methods to fit not only the evaluation questions, but also the program schedule, staff responsibilities, and evaluation time frame. First, the team decided to have the evaluation volunteer sit in and observe workshops. Together, the evaluation team developed an observation protocol so that the volunteer could take notes about student learning and record student comments about personal connections to science and nature.

Then, the team wrote a short series of follow-up questions for museum educators to answer after each observed workshop. These included rating and free-response questions about their satisfaction, perception of student learning, and personal connections students shared with staff.

Practical considerations: Museum educators’ responsibilities affected the evaluation plan

The 45-minute workshops are scheduled back-to-back with a 15-minute “passing period,” during which museum educators reset classrooms for the next workshop. In the end, the team decided that the volunteer would ask educators the follow-up questions in a brief interview during this period; afterward, the volunteer helped educators prepare for the next workshop. This way, educators could participate in the evaluation without losing preparation time for their next group.

Finally, the evaluation team developed a follow-up survey for classroom teachers. To streamline the process, the team put together clipboards with a brief request-for-feedback statement and printed sign-up sheets for classroom teacher contact information for education staff to use when approaching teachers. The evaluation team made sure that clipboards were prepared each day, rather than asking museum educators to organize survey materials in addition to their regular workshop preparation.

Program evaluation findings

The Field Trip Workshop evaluation explored three questions (listed above) by collecting data from students, their classroom teachers, and museum educators.

Overall, classroom teachers reported they were “very satisfied” (75%) with their workshops, largely because they thought the workshops and the museum educators who led them did a good job of meeting students’ needs. Classroom teachers attributed their sense of satisfaction to their assessment of the staff’s teaching, the workshop curriculum, students’ learning and engagement, and their overall experience at the Nature Museum.

Museum educators were usually less satisfied than classroom teachers. Staff linked their satisfaction to their perceptions of the student experience, student learning outcomes, and reflections on lesson implementation. Staff often attributed their modest satisfaction to perceived imperfections in implementation and to a high standard for student experience and learning. Positive teacher feedback encouraged museum educators to view their own work in an optimistic light, and they found it valuable to learn more about the factors underlying teacher satisfaction.

The second evaluation question asked in what ways students were impacted by their workshop, both on the day of the visit and at school in the days and weeks following. Nearly three-quarters of classroom teachers reported their students’ engagement was “very high” during the workshop, and teachers, staff, and the evaluation volunteer all observed that engaging with the museum’s collections and habitats was a novel type of learning experience for many students.

After their Field Trip Workshop, teachers observed that in class and during free time, their students demonstrated improved awareness of living things and increased interest and curiosity. Classroom teachers also reported that student attitudes toward science and nature improved, a view shared by most museum educators. The education staff credited this change to reduced anxiety around living things and the outdoors, increased connection to living things, and growing confidence in new environments. The education team found it validating to see improvements in student attitudes toward science and nature attributed to object-based learning with the museum’s unique resources.

The evaluation also explored workshop impacts on student learning and practicing of science. Although classroom teachers said their students typically learned the science content and practices explored in their workshop “excellently” or “well,” museum staff held a slightly different view. Museum educators reported reduced student learning in many Bugs Alive! workshops and some outdoor workshops (Habitat Explorers, Ecology Rangers). A deeper exploration of these workshops revealed (1) the difficulties staff experienced in promoting effective chaperone behavior and (2) the challenges of teaching outdoors under variable conditions. These findings served as the basis for immediate reflection, group discussion, and program improvements.

The evaluation also asked classroom teachers and museum education staff what kinds of personal connections to nature and science students made during their workshops. Students made many personal connections, as evidenced by their comments to classroom teachers, classmates, museum educators, and the evaluation volunteer. Analysis showed that students made a variety of internal and external connections to science and nature through the workshops. Internal connections are those that link workshop experiences to students’ internal worlds, including prior emotional and sensory experiences or things the student knows and can do (such as: “I’m thinking of being a zoologist when I grow up” “It [wild bergamot] smells like tea!” “I had a goose. It got caught, and we rescued it”). External connections are those that students make between the workshop and people, places, and things (such as: “We had a lizard and two turtles in our class” “My group [project] at school is [on] woodlands” “I have one of those [flowering] trees, it’s similar, at my house”). Although museum educators occasionally hear personal connections from students, they were delighted by the variety and depth of the student statements captured by the volunteer and reported in the evaluation.

The last question focused on connections between in-school science and Field Trip Workshops. Classroom teachers reported contextualizing their field trip within their current science curriculum (unit kickoff, in-unit, unit-capping activity) most of the time. Only one in six workshops was a stand-alone experience (i.e., not explicitly connected to the science curriculum). Teachers also said they valued their workshop experience because it differed from classroom work by using resources they usually could not access: living things, preserved specimens, and outdoor habitats. Further, classroom teachers were glad their students were engaged in doing science: exploring, observing, and recording. Teachers were also pleased by the ways workshops resembled the classroom, including the familiar teaching methods used by the trained museum educators. The education staff was happy that their prior anecdotal observations regarding close connections between their workshops and what students experience in school were confirmed by evidence.

Figure 4

On the Nature Walk during a stormy day, Habitat Explorers students make and record observations of the woodland habitat with guidance from a museum educator. Students take these “nature journals” home to share or to school for further development.

 

Engaging with the evaluation results to make recommendations for change

After data collection was complete, the evaluation volunteer entered and organized data for the evaluation team and education staff to review. The student programs manager and department leadership decided that the best strategy for engaging with the evaluation data was to discuss results and recommendations with museum educators over a series of meetings. An unintended benefit of this approach was that the evaluation stayed in the minds of leadership and staff, and it continues to guide program growth.

The first evaluation discussion began with an overview of findings presented by the evaluation team. After the “big-picture” presentation, the entire group discussed the results the educators found most interesting, raised questions about the evaluation outcomes, and began to articulate recommendations for program changes and professional development.

Participatory approach: Museum educators’ views shaped the evaluation recommendations and changes in program implementation

About a month later, at the fall education department meeting, the student programs manager chose two topics for deeper consideration. In the earlier meeting, museum educators focused on challenges with chaperone behavior in workshops. With the evaluation volunteer’s help, the student programs manager assembled observation and free-response data about effective and ineffective chaperone behavior. During the meeting, educators reviewed this evidence, discussed classroom management, and brainstormed ways to address chaperone challenges. Afterward, the student programs manager selected recommendations for managing chaperones and included those in the Field Trip Workshops lesson plans.

The education team also proposed a deeper dive into the challenges of teaching outdoor workshops, such as the variable conditions introduced by inclement weather or seasonal changes to habitats. In fact, evaluation findings showed a greater variability of experience in outdoor workshops compared to indoor ones. Together with the evaluation volunteer, the student programs manager compiled rating, observation, and free-response data from Habitat Explorers and Ecology Rangers. The education staff discussed the data and their experiences teaching outdoors. As a result, the museum educators and the student programs manager updated the outdoor workshop curriculum to include options for teaching in inclement weather and specific guidance about habitat content to observe and teach in different seasons.

Figure 5

Field Trip Workshop students use magnifying glasses to examine a turtle and record their observations in Animals Up Close.

 

Museum educators reflect on the impact of evaluation on their practice

As the evaluation drew to a close, the team asked museum educators to comment on their experiences participating in the evaluation and the ways it impacted their thinking and professional practice. Two educators’ insights are presented below.

Example 1

One museum educator, a two-year member of the education team with an M.A. in curriculum and instruction and an endorsement in museum education, wrote:

Having the opportunity to reflect directly following my implementation of a Field Trip Workshop with the evaluator was enriching to my teaching practice and allowed me to feel a sense of measurable growth. Especially when there was the opportunity to compare and contrast my various implementations of the same program, I felt growth in that I could identify goals with the evaluator and immediately assess if I had reached those goals.

For example, after teaching the workshop Metamorphosing Monarchs, I shared with the [evaluation volunteer] that in the next workshop, I wanted to limit my narration of a life cycle video to about five minutes in order to increase student engagement and [allow] more time for exploration of specimens. Following the completion of the next workshop, we were able to directly [assess] my progress towards that goal and more holistically discuss larger goals for my teaching practice.

Example 2

Another educator, a team member of more than three years with a background in art history and an M.A. in museum education, explained:

I really enjoyed the overall experience with the evaluation and have found the results to be extremely helpful in informing my overall practice. [The evaluation volunteer] always made the process extremely easy for us—even during some of our busiest programming times. As an extremely self-reflective person, it was nice to spend a few minutes after each lesson that [the volunteer] observed reflecting on it … It allowed me to critically reflect on the lesson immediately after the program concluded while my thoughts were still fresh in my mind. I also feel like I benefited from rating each lesson on a scale since it pushed me to view each lesson [in a larger context], rather than getting lost in specific or smaller details or moments within [a workshop].

And now that the results, as it were, are in, it’s been especially helpful in informing my practice. I’ve found the direct feedback from [classroom] teachers, in particular, helpful as we often have very little, if any, direct contact with these teachers after the program is over. To be able to step back and see the big picture of our programs’ impact on our audiences has been very impactful and has made me begin to really think about how to help mindfully train our new hires with this all in mind.

Insights

This evaluation project was beneficial to the program for several reasons. First, the feedback received from the evaluation demonstrated how program goals were being met from a participant perspective. It also illuminated several patterns of experience where the team felt both the participant (student and teacher) and educator experience could be improved. These areas aligned with changes already on the minds of department leaders, helped provide additional evidence for specific improvements, and fostered support for those changes.

A valuable, if unintended, outcome of this evaluation’s participatory approach was the opportunity for staff to reflect on their teaching practice with the volunteer during data collection. Many team members found it helpful to reflect on their instruction immediately after a workshop and then make adjustments in the workshop that followed. This not only helped the staff holistically improve their teaching practice, but it also helped them become better supports for each other and more effective mentors to new team members.

Figure 6

As the rain clears, a group of Field Trip Workshop students make their way onto the fishing pier to observe the living and nonliving things in the wetland habitat.

 

Conclusions

Evaluation can be a powerful and important tool for building and maintaining effective out-of-school STEM programs. Evaluations need the support of both leadership and staff to be useful, but when done well they can yield insights and promote positive change for both programs and staff. Including staff in an evaluation—taking advantage of their knowledge of the program, instincts, questions, and contributions—yields a stronger project and a positive evaluation experience. Further, involving education staff in the design and direction of the evaluation means they place a higher value on the results.

Evaluation is often not a final destination, but rather a process of evolution for programs and their staff. Thus, creating positive evaluation experiences can help build a culture of evaluation. This is key to creating and sustaining excellent out-of-school STEM experiences.

 

Caroline Freitag (caroline.freitag@northwestern.edu) is an evaluation specialist at Northwestern University in Evanston, Illinois. Melissa Siska (msiska@naturemuseum.org) is the student programs manager at the Chicago Academy of Sciences/Peggy Notebaert Nature Museum in Chicago, Illinois.


Resources

Clavijo, K., M.L. Fleming, E.F. Hoermann, S.A. Toal, and K. Johnson. 2005. Evaluation use in nonformal education settings. New Directions for Evaluation (108): 47–55.

Hennigar Shue, J. 1982. Teaching yourself to teach with objects. Journal of Education 7 (4): 8–15.

Paris, S.G., and S.E. Hapgood. 2002. Children learning with objects in informal learning environments. In Perspectives on object-centered learning in museums, ed. S.G. Paris, 37–54. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.

Learn how you can include museum educators in their own program evaluations to increase staff buy-in and engagement.

Science and Engineering Practices: Professional Book Study for K-12 Teachers

Are you a K-12 school teacher working to enhance your knowledge and understanding of the Science and Engineering Practices from A Framework for K-12 Science Education and the Next Generation Science Standards (NGSS)? Register to participate in the Science and Engineering Practices: Professional Book Study, taking place in January-February, 2020!


 

Why aren’t you talking?!

By Gabe Kraljevic

Posted on 2019-09-06

I’ve been having trouble getting students willing to talk, answer questions, or share their ideas in class. What strategies/activities do you use to help kids feel more comfortable talking and sharing in your class?
—C., Arizona

There are few things worse than facing a group of quiet students who you really want to participate! I addressed how to get discussions started in a previous blog (http://bit.ly/322RAfK), but it’s probably good to revisit this common bane of teachers.

Start the lesson with an impressive demonstration or engaging video to stir up interest. Many students who are shy may just need more time to digest and formulate ideas and questions before they’re comfortable talking with others. Graphic organizers let students think and write on their own so they have fuel for the discussions to follow. Some graphic organizers can be found in this collection in the Learning Center: http://bit.ly/30BgZwO

With their notes in hand, you can set up learning circles for larger groups to discuss topics. Set ground rules for how the circles work: go to each and every person in turn, always moving in the same direction, never skipping or reversing.

Hand-held whiteboards are excellent tools to get students involved. Every student or pair of students writes down short answers, sketches graphs, or indicates their understanding for quick feedback without having to talk in front of classmates. As students raise their boards, simply point, give a quick nod, or offer brief feedback to help them figure out the correct answer. Don’t move on to the next question until everyone has successfully participated. This activity can be exciting and fast-paced, so keep it moving with a series of prepared questions and avoid down time. You can even invite students to pose their own questions to the class and give feedback.

Hope this helps!

Image by OpenClipart-Vectors from Pixabay


 

Educating Students About Aerospace Careers

By Debra Shapiro

Posted on 2019-09-03

Students in William Ervin’s aerospace class at Dubiski Career High School in Grand Prairie, Texas, gather around the flight simulator used in the class.

Cindy Hasselbring reads from the Boeing Pilot and Technician Outlook report: “804,000 new civil aviation pilots, 769,000 new maintenance technicians, and 914,000 new cabin crew will be needed to fly and maintain the world fleet over the next 20 years.” She adds, “212,000 pilots are needed for North America alone. 193,000 maintenance technicians are needed for North America alone…There’s a good opportunity for students to pursue aviation jobs. A student can start at a regional airline at $60,000 a year.”

Hasselbring, senior director of the Aircraft Owners and Pilots Association (AOPA) High School Aviation Initiative, says AOPA offers a high school aviation STEM (science, technology, engineering, math) curriculum that “is free to high schools…, and provides two career pathways: pilot and drones [Unmanned Aircraft Systems or UAS].”

At age 16, “students can take the [Federal Aviation Administration (FAA)] Private Pilot Knowledge Test or Unmanned Aircraft Systems Part 107 Remote Pilot Knowledge Test. Those who pass the UAS test can start a business piloting [drones]. They can work for many employers because they can legally fly a drone,” says Hasselbring.

Some students take the courses “just to learn something new and different. Then they realize they want to be pilots. That’s why the curriculum is used as in-school courses only, to hook in students who may not have considered those careers before. They’re not as likely to choose the courses as an after-school club,” she asserts.

“The curriculum supports the Next Generation Science Standards (NGSS) and Common Core, and a lot of engineering practices are embedded. [It challenges] students with projects like testing foam board airfoils in a cardboard wind tunnel and modifying their designs,” as the Wright Brothers did in the wind tunnel they built, she observes.

“Students also learn about the NTSB [National Transportation Safety Board], and how they investigate accidents. [In one activity,] students are members of a Go Team investigating what caused an accident and what the recommendations of the NTSB should be,” Hasselbring notes.

“The ninth- and 10th-grade curriculum will be available this fall. The 11th-grade curriculum will be available next year. By 2021, all four years [of the curriculum] will be available,” she continues. “Schools must go through the application process in the fall” to receive the curriculum, she adds.

To use the curriculum, teachers must attend a three-day professional development workshop. “In-person attendance costs $200 and includes the opportunity to participate in hands-on activities and take a free flight in a small aircraft,” Hasselbring explains. Teachers can attend online at no charge.

“Last year, AOPA offered, for the first time, a Teacher Scholarship program that pays for flight training so teachers can become pilots. We gave 20 teachers $10,000 each. We hope to offer that again,” she reports.

William Ervin, aerospace teacher at Dubiski Career High School in Grand Prairie, Texas, has used a variety of aerospace and aviation resources, including AOPA’s. “We use the AOPA curriculum [as a] pilot school. We teach and evaluate the curriculum,” he explains. He also has adapted the AOPA curriculum for his 11th and 12th graders and will be evaluating the 11th-grade AOPA curriculum this school year.

Last year, Ervin wrote Introduction to Aerospace and Aviation, a Career and Technical Education (CTE) innovative course for grades 9–11 that was approved by the Texas Education Agency (TEA). “Innovative courses allow districts to offer state-approved innovative courses to enable students to master knowledge, skills, and competencies not included in the essential knowledge and skills of the required curriculum,” according to the TEA website. Ervin’s course provides “the foundation for advanced exploration in the areas of professional pilot, aerospace engineering, and [UAS],” he explains.
Ervin notes TEA has “a bank of innovative courses” teachers can access. (See http://bit.ly/33nkicL.) He is currently developing a 10th-grade innovative course based on AOPA’s curriculum.

Manufacturing Aircraft

In the Utah Aerospace Pathways (UAP; http://uapathways.com) program, high school students take aerospace manufacturing training courses at their schools and at local technical or community colleges. Students then have an externship with one of UAP’s participating companies during senior year and graduate with a certificate in aerospace manufacturing. “Industry partner companies (Boeing, Janicki, Hexcel, Albany Engineered Composites, Orbital ATK, Kihomac) joined with Hill Air Force Base [located near Ogden, Utah] because they felt the need to build their workforce and training,” says Sandra Hemmert, CTE Specialist for Granite School District in Salt Lake City, Utah. UAP was created “to help students gain skills for industry and college,” Hemmert maintains.

UAP was the start of Talent Ready Utah, an initiative of the Governor’s Office of Economic Development and the Utah Department of Workforce Services. “It is an amazing partnership of government agencies, industry, and education,” Hemmert observes. Usually it takes two years to develop new high school courses, but “within six months, [UAP] had four new courses. Everything moved faster to address the needs of the industry partners,” she contends.

“The industry partners developed a skills list that was used as the basis of the state standards,” says Hemmert. “To create the curriculum, we had 12 teachers who did a two-week internship in all of the companies. We brought them in with company partners to develop the curriculum. Our teachers didn’t know anything about [composite materials], but the industry partners had an idea about how to get kids excited [about learning]: doing hands-on activities with them.”

UAP was piloted in two school districts—Granite and Davis, in Farmington— and has since expanded to four more Utah school districts. Salt Lake Community College and Davis Technical College in Kaysville are the original UAP partner colleges.

“Students are guaranteed an interview with any of the participating companies after earning the certificate,” says Hemmert. “Students can apply to any of the companies…If one company can’t hire a student, it will support [the student in obtaining a job] with the other companies,” she contends. Students can earn as much as $19 per hour right after high school.

“We have kids who do something else for two years, but they can still have the interview if they have earned the certificate. Some kids go to college and say, ‘It isn’t for me,’ but the certificate gives them a job opportunity. If they decide to attend college, the companies [reimburse] tuition…for employees…A lot of kids are going to college to become engineers and also working for a company to get their tuition reimbursed,” Hemmert relates.

“In addition, the program [educates students about] jobs they didn’t know about. There are so many jobs in aerospace manufacturing,” she asserts.

This article originally appeared in the September 2019 issue of NSTA Reports, the member newspaper of the National Science Teachers Association. Each month, NSTA members receive NSTA Reports, featuring news on science education, the association, and more. Not a member? Learn how NSTA can help you become the best science teacher you can be.

