It's Elementary: Investigating Student Work

By Cindy Workosky

Posted on 2017-08-24

Teachers wear many hats in the classroom. We are doctors, therapists, IT technicians, politicians, and entertainers, but the one hat we wear that is essential for student learning is the detective’s hat. As detectives, we gather and analyze evidence to help us understand what our students know and don’t know and what misconceptions they may have. Instead of this “detective work,” though, teachers often consider student work as an end product to assess learning rather than a tool to investigate student learning.

My young students did not come to me as blank slates. They brought a bouquet of preconceptions about science, and those preconceptions either worked for or against their ability to understand the core ideas in science. I quickly learned that I could “teach my heart out” and do fun activities, and my students could regurgitate what I taught them. However, they did not comprehend the new concepts as well as I thought they had, and they certainly could not apply that knowledge to other concepts in science. What was I doing wrong?

What did I learn from student work?

After careful reflection and collaboration with colleagues, I decided I needed to stop focusing on the end product and start focusing on the entire process of student learning. I needed to use my students’ work to my advantage. So I put on my detective’s hat. My job was to dig deep and investigate my students’ work more effectively so I could uncover what my students knew; what they didn’t know; and what they thought they knew, but had some misunderstandings about.

Based on students’ performance tasks and formative assessments, I could learn more about their preconceptions about core ideas and their ability to solve problems. Then I could capitalize on this information to help guide my instruction. I could also use that information to give my students quality academic feedback as they learned the core ideas and crosscutting concepts, rather than waiting until after they completed a performance task or took the summative assessment.

As I changed how I analyzed students’ work and used that information to enhance instruction, I also had to change what I asked my students to do so I could gather the information I needed to help them learn as they solved real-world problems. I began to ask better, more intentional questions, and I made my performance tasks three-dimensional. Talk Moves was a strategy I began using to improve my students’ academic discourse. Having students argue about ideas in science in a productive way gave me powerful insight into their thinking and helped me gather the evidence I needed to inform my classroom instruction.

The most powerful change I made was to use formative assessment more effectively. Previously, my formative assessment toolbox consisted of quick checks that did not actually provide much information about how my students were thinking about science. A colleague told me about Page Keeley and her work with formative assessments. Using probes from Keeley’s Uncovering Student Ideas in Science book series and her Science Formative Assessment Classroom Techniques (FACTS), I was better able to gather the evidence I needed about how my students were thinking.

I used the information I gleaned from science probes, FACTS strategies, and other formative assessments to provide quality academic feedback to my students. These formative assessments allowed me to work smarter and use my instructional time more wisely. I enhanced classroom activities so I could better address my students’ needs, help them explore science phenomena, deepen their understanding of the core ideas as they engaged in the practices, and apply their knowledge to other disciplines in science.

“With formative assessments, there usually is no need to assign scores to individual students. Instead of scoring rubrics, you might use more informal criteria that will help you quickly see what you need to know in order to make instructional decisions and better support your students.” —Seeing Students Learn Science (NAS 2017)

What can students learn from student work?

Learning is a journey, a journey that students can and should be part of. Unfortunately, students are often excluded from that journey altogether. Teaching and learning are things that are done to them rather than things they actively participate in. Teachers should encourage students to take part in their own learning journey by empowering them to analyze their own thinking. The best way to accomplish that is by allowing them to interpret their work.

When students are involved in understanding the purpose of their work and how to analyze it, both teachers and students experience a common vision of what students are expected to know and do. This is critical in improving student learning.

I used to think that showing students the scoring rubric before they completed a task helped them understand my expectations. Unfortunately, just as focusing on the end product didn’t help me understand their thinking or learning needs, having students examine a rubric didn’t help them comprehend what I expected them to do and the knowledge I needed them to apply. They, too, needed to become detectives.

To help students investigate what I was expecting of them, I had them critique examples of completed student work. I wanted them to better understand what effective and ineffective work looked like. These examples included both exemplars and non-exemplars, examples of work that demonstrated gaps in understanding. (Note: I always wrote the non-exemplar samples using common misconceptions and inaccurate understandings my students traditionally held. I never used the work of students currently in my classroom.)

Giving students an opportunity to see what they should do and the connections they should make helped them better understand what I expected them to know and do. Giving them a chance to identify the areas of refinement in the work samples that demonstrated weaker understanding helped them understand what mistakes they needed to avoid. Students also could see what different levels of performance looked like.

Furthermore, students often saw their own misunderstandings in the non-exemplars. Analyzing these samples gave them a safe space to identify their own misconceptions, then engage in the practices of scientists and engineers to better understand the core ideas and crosscutting concepts. They refined their knowledge through investigations, explorations, and research.

Analyzing samples of student work made my students think critically and deepen their own knowledge, and it gave them the tools they needed to truly analyze their own thinking. It created a safer, more engaging learning culture in my classroom. Indeed, the changes I implemented yielded exciting results.

I observed that my students began to ask better questions and made fewer careless mistakes while completing performance tasks. They were more confident as they designed solutions to problems and more open to the academic feedback I gave them on their own work and formative assessments. Their ability to reason and engage in academic discourse improved, and they were much more willing to reconsider their previously held misconceptions. Their detective work paid off. The quality of my fourth graders’ work soared, and my students became more responsible, critical thinkers.

As teachers encourage students to explore their natural curiosity and understand the natural world surrounding them, it is essential that teachers analyze students’ work on formative assessments and performance tasks to identify students’ preconceptions and make sound decisions in the classroom. When teachers and students work together as detectives to gather evidence about how students are thinking about their thinking, meaningful learning can take place.


K. Renae Pullen

 

K. Renae Pullen is the K–6 science curriculum-instructional specialist for Caddo Parish Public Schools in Louisiana. Pullen currently serves on the Teacher Advisory Council of the National Academies of Sciences, Engineering, and Medicine. She received a Presidential Award for Excellence in Science Teaching in 2008. Read her blog, and follow her on Twitter: @KrenaeP.

 

This article was featured in the August issue of Next Gen Navigator, a monthly e-newsletter from NSTA delivering information, insights, resources, and professional learning opportunities for science educators by science educators on the Next Generation Science Standards and three-dimensional instruction. Click here to access other articles from the August issue on assessing three-dimensional learning. Click here to sign up to receive the Navigator every month.

 

The mission of NSTA is to promote excellence and innovation in science teaching and learning for all.

Seeing Students Learn Science

By Cindy Workosky

Posted on 2017-08-24

It is truly an exciting time in science education. Science educators across the country are adapting to a new vision of how students learn science guided by the Framework for K–12 Science Education (Framework). As a result, science instruction is changing to better tap into students’ natural curiosity and deepen their understanding of the world around them.

As instruction changes, assessments need to change as well. Many science educators recognize that traditional assessments are not appropriate for capturing three-dimensional science learning. But they may not know what assessments of three-dimensional learning should look like, nor how they can be used effectively in science classrooms.

The Board on Science Education (BOSE) at the National Academy of Sciences has a new resource that can help. BOSE is the group responsible for developing the Framework, and we have been working hard to continue to offer guidance to educators as they strive to make the new vision a reality in classrooms. In March 2017, BOSE released a new book on formative assessment for science, Seeing Students Learn Science. The book draws on research-based recommendations for assessment to explore how classroom teachers can use assessments as part of instruction to advance students’ three-dimensional learning.

Traditional science assessments do not allow teachers to fully understand students’ mastery of science and engineering practices, nor do they provide insight into students’ learning trajectories. In contrast, effective classroom assessments of 3-D science learning can help teachers collect information about students’ understanding of core ideas and crosscutting concepts, as well as students’ ability to engage in the scientific and engineering practices. Good assessments of 3-D science learning can help teachers make decisions about next steps for learning and identify the supports that individual students or groups of students may need. They can also help students take control of their own learning by helping them understand what they have mastered and where they may need more practice. A major goal is for assessment to become an integral part of science instruction, rather than an interruption.

The new book is designed to help teachers create and implement classroom assessments that capture three-dimensional learning. While transitioning to a new assessment system will be a gradual process, change begins at the classroom level, and individual educators can begin to implement new approaches immediately. Seeing Students Learn Science is filled with examples of innovative assessment formats, strategies to embed assessments in classroom activities, and ideas for interpreting and using information from these assessments. It also provides ideas and questions educators can use to reflect on what they can adapt right away—and what they can work toward over time—to ensure that instruction drives assessment, not the other way around.

The book is organized around key questions educators may have about the new types of assessments.

What’s really different? Gives a quick overview of how ideas about science learning and instruction have changed and why different kinds of assessments are needed. 

What does this kind of assessment look like? Highlights a few examples to see how these ideas and principles work in practice. 

What can I learn from my students’ work? Examines more deeply the information educators can obtain from a variety of assessments—and how they provide evidence of students’ thinking. 

How can I build new kinds of assessments into the flow of my instruction? Describes ways to adapt assessments already in use and to design new assessments that support the changes science teachers are making in instruction. 

How can I work with others in my school, district, and state? Focuses on how teachers can connect with developments outside of the classroom, and explores assessment systems, ways of reporting assessment results, and assessment for monitoring purposes.

A key message of the book is that educators can lead the way in transforming science assessment. We all know that new state and district assessments will be needed. The transition to new large-scale assessments may be a complicated one for states and districts, and will likely pose challenges that will take time to solve. Individual educators, though, can lead the way in adapting assessment practices to new approaches to science instruction. With adequate professional development support, and the resources provided in Seeing Students Learn Science, educators can begin to redesign assessments in their own classrooms and champion new approaches in their schools and districts.


Heidi Schweingruber is director of the Board on Science Education at the National Research Council (NRC). She co-directed the study that resulted in the report A Framework for K–12 Science Education (2011). She served as study director for a review of NASA’s pre-college education programs completed in 2008 and co-directed the study that produced the 2007 report Taking Science to School: Learning and Teaching Science in Grades K–8. Before joining the NRC, Schweingruber worked as a senior research associate at the U.S. Department of Education’s Institute of Education Sciences. Schweingruber holds a Ph.D. in psychology and anthropology and a certificate in culture and cognition from the University of Michigan.

 
