It's Elementary: Investigating Student Work

By Cindy Workosky

Posted on 2017-08-24

Teachers wear many hats in the classroom. We are doctors, therapists, IT technicians, politicians, and entertainers, but the one hat we wear that is essential for student learning is the detective’s hat. As detectives, we gather and analyze evidence to help us understand what our students know and don’t know and what misconceptions they may have. Instead of this “detective work,” though, teachers often consider student work as an end product to assess learning rather than a tool to investigate student learning.

My young students did not come to me as blank slates. They brought a bouquet of preconceptions about science, and those preconceptions either worked for or against their ability to understand the core ideas in science. I quickly learned that I could “teach my heart out” and do fun activities, and my students could regurgitate what I taught them. However, they did not comprehend the new concepts as well as I thought they had, and they certainly could not apply that knowledge to other concepts in science. What was I doing wrong?

What did I learn from student work?

After careful reflection and collaboration with colleagues, I decided I needed to stop focusing on the end product and start focusing on the entire process of student learning. I needed to use my students’ work to my advantage. So I put on my detective’s hat. My job was to dig deep and investigate my students’ work more effectively so I could uncover what my students knew; what they didn’t know; and what they thought they knew, but had some misunderstandings about.

Based on students’ performance tasks and formative assessments, I could learn more about their preconceptions about core ideas and their ability to solve problems. Then I could capitalize on this information to help guide my instruction. I could also use that information to give my students quality academic feedback as they learned the core ideas and crosscutting concepts, rather than waiting until after they completed a performance task or took the summative assessment.

As I changed how I analyzed students’ work and used that information to enhance instruction, I also had to change what I was asking my students to do so I could gather the information I needed to help them learn as they solved real-world problems. I began to ask better, more intentional questions, and I made my performance tasks three-dimensional. I also began using the Talk Moves strategy to improve my students’ academic discourse. Having students argue about ideas in science in a productive way gave me powerful insight into their thinking and helped me gather the evidence I needed to inform my classroom instruction.

The most powerful change I made was to use formative assessment more effectively. Previously, my formative assessment toolbox consisted of quick checks that did not actually provide much information about how my students were thinking about science. A colleague told me about Page Keeley and her work with formative assessments. Using probes from Keeley’s Uncovering Student Ideas in Science book series and her Science Formative Assessment Classroom Techniques (FACTS), I was better able to gather the evidence I needed about how my students were thinking.

I used the information I gleaned from science probes, FACTS strategies, and other formative assessments to provide quality academic feedback to my students. These formative assessments allowed me to work smarter and use my instructional time more wisely. I enhanced classroom activities so I could better address my students’ needs, help them explore science phenomena, deepen their understanding of the core ideas as they engaged in the practices, and apply their knowledge to other disciplines in science.

“With formative assessments, there usually is no need to assign scores to individual students. Instead of scoring rubrics, you might use more informal criteria that will help you quickly see what you need to know in order to make instructional decisions and better support your students.” —Seeing Students Learn Science (NAS 2017)

What can students learn from student work?

Learning is a journey, a journey that students can and should be part of. Unfortunately, students are often excluded from that journey altogether. Teaching and learning are things that are done to them rather than things they actively participate in. Teachers should encourage students to take part in their own learning journey by empowering them to analyze their own thinking. The best way to accomplish that is by allowing them to interpret their work.

When students are involved in understanding the purpose of their work and how to analyze it, both teachers and students experience a common vision of what students are expected to know and do. This is critical in improving student learning.

I used to think that showing students the scoring rubric before they completed a task helped them understand my expectations. Unfortunately, just as focusing on the end product didn’t help me understand their thinking or learning needs, having students examine a rubric didn’t help them comprehend what I expected them to do and the knowledge I needed them to apply. They, too, needed to become detectives.

To help students investigate what I was expecting of them, I had them critique examples of completed student work. I wanted them to better understand what effective and ineffective work looked like. These examples included both exemplars and non-exemplars, examples of work that demonstrated gaps in understanding. (Note: I always wrote the non-exemplar samples using common misconceptions and inaccurate understandings my students traditionally held. I never used the work of students currently in my classroom.)

Giving students an opportunity to see what they should do and the connections they should make helped them better understand what I expected them to know and do. Giving them a chance to identify areas needing refinement in the work samples that demonstrated weaker understanding helped them see what mistakes to avoid. Students also could see what different levels of performance looked like.

Furthermore, students often saw their own misunderstandings in the non-exemplars. Analyzing these samples gave them a safe space to identify their own misconceptions, then engage in the practices of scientists and engineers to better understand the core ideas and crosscutting concepts. They refined their knowledge through investigations, explorations, and research.

Analyzing samples of student work helped my students think critically, deepened their own knowledge, and gave them the tools they needed to truly analyze their own thinking. It created a safer, more engaging learning culture in my classroom. Indeed, the changes I implemented yielded exciting results.

I observed that my students began to ask better questions and made fewer careless mistakes while completing performance tasks. They were more confident as they designed solutions to problems and more open to the academic feedback I gave them on their own work and formative assessments. Their ability to reason and engage in academic discourse improved, and they were much more willing to reconsider their previously held misconceptions. Their detective work paid off. The quality of my fourth graders’ work soared, and my students became more responsible, critical thinkers.

As teachers encourage students to explore their natural curiosity and understand the natural world surrounding them, it is essential that teachers analyze students’ work on formative assessments and performance tasks to identify students’ preconceptions and make sound decisions in the classroom. When teachers and students work together as detectives to gather evidence about how students are thinking about their thinking, meaningful learning can take place.


K. Renae Pullen

 

K. Renae Pullen is the K–6 science curriculum-instructional specialist for Caddo Parish Public Schools in Louisiana. Pullen currently serves on the Teacher Advisory Council of the National Academies of Sciences, Engineering, and Medicine. She received a Presidential Award for Excellence in Science Teaching in 2008. Read her blog, and follow her on Twitter: @KrenaeP.

 

This article was featured in the August issue of Next Gen Navigator, a monthly e-newsletter from NSTA delivering information, insights, resources, and professional learning opportunities for science educators by science educators on the Next Generation Science Standards and three-dimensional instruction. Click here to access other articles from the August issue on assessing three-dimensional learning. Click here to sign up to receive the Navigator every month.

 
