Capturing Information from the Day to Day

I was having a difficult time finding a topic to write about that seemed applicable across all classes and grades. However, working in a kindergarten classroom last week provided me with inspiration on data collection.
The alphabet chart – I bet it has been a while since you have thought about it, unless you are a kinder teacher currently facilitating a class of word scientists exploring the alphabet chart. If you want to be amazed by all that 5- and 6-year-olds can learn about letters, sounds, patterns, and more from one tool, I suggest you stop into a kindergarten classroom for word study.

Picture this: 18 young children sitting and lying on the rug with alphabet charts that are missing pictures and a stack of pictures to add. Working in partners, they go about the task of identifying each picture, stretching out its name, catching the first sound, and matching that sound to a letter (whew)! As we walked around coaching into the students’ work, we realized we needed to capture this learning because it is so rich in information, so we snapped pictures of each pair’s board. Quick, easy data – using what you are already doing to gather information that guides future instruction and reflection on past instruction.

Because we took the photos at the same time, we were able to compare partners’ speed and accuracy on the task. Several partnerships had a complete and accurate chart, indicating they had the vocabulary for the pictures, could hear an initial sound, and could connect that sound to a letter. Other partnerships were accurate but only halfway done, showing they understand and can complete the skills but are not yet automatic. Still other partnerships had only a few pictures on their board, or pictures in incorrect spots; for those students we decided more investigation is needed to determine whether it was the vocabulary, the phonological awareness, or the letter-sound connection that made the task challenging. On top of that, we gathered observational data on who could take turns and work with a partner. We didn’t have to add anything – just a way to collect the data and then turn around and use it. We will reflect on this data in our weekly planning meeting to determine next steps for minilessons and groups and to make notes on the previous week of lessons.

I write this for several reasons. One is my love of data, which many of you have seen me nerd out about over the years. Also, this one task reminded me that data is a topic that speaks to all teachers and faculty at ISB, no matter the grade or position; we all collect and use data. It reminds us that data collection does not need to be additional work if we set up systems to collect data from what we already do.

We continue to reflect on our work at ISB and our alignment with our data beliefs; below are those beliefs, updated as of November 2019. As we know, data is collected and used in many different ways. In reflecting on our practices as described above, I found the following most relevant to this example (marked with an *).

Data Beliefs at ISB

Updated Nov. 2019

Beliefs about the role of data:

  • The most valued data is teacher-collected formative data that is used to differentiate student learning. *
  • Quality, valid data from multiple sources creates shared ownership of student learning.
  • Data analysis supports both student and teacher growth.*
  • Data should be shared in a safe and open environment that depersonalizes ownership in order to support our understanding of student learning.
  • The act of reflecting on data is a part of our role and professional responsibilities.

How we use data:

  • Data is used to differentiate instruction to support all students.*
  • Data is used to inform instruction across a range of levels and for a variety of purposes.*
  • Data is used to document student progress for the purpose of reporting and school program improvement.*
  • Collaborative teams explore data for patterns.
  • Processes and protocols assist in establishing supportive environments to look at student learning.


Student Learning Data At a Glance

Last week I gathered my in-house focus group (yes, my daughters) to ask them what they hope that teachers will learn about them in the first few weeks of school. I was pleased when one answered, “I hope they understand my strengths and weaknesses, so they can help me and not get frustrated when I don’t understand.” 
“Wow!” I thought. “How great! I’ve just spent two years working on a tool to give ISB teachers in order to support them in doing just that.”
The tool we’ve been working so hard on is a new data portal—a system that connects right into PowerSchool, which we can use to pull together all the disparate sources of academic data we have about each student and present that data to teachers at a glance.
The data entered into the portal is customized for the Elementary, Middle, and High Schools, but the basics are the same. We have two types of data: external assessments (like the MAP test and others) and internal common assessments (such as Writing, Math, and PE, plus Science and Social Studies in middle and high school).
Want to know your students’ reading comprehension so you can check if they’ll be able to make sense of the text you’re handing them? 
We’ve got you covered.  
Want to see which students have similar strengths and growth areas, so you know which small group might benefit from a strategy lesson on a skill they learned last year? 
We can show you that!

Data like this will help us most if we keep in mind the strengths and limitations of the two types of data: 
External test data (like MAP data)

Strengths:

  • Reliable measure of students’ basic skills in reading, math, and language usage.
  • Often you can look back at several years’ data to see growth trends.

Limitation:

  • Cannot measure very deep cognitive complexity, such as analysis, synthesis, and evaluation skills.

ISB Common Vertical Assessment data:
Strengths: Measures more cognitively complex tasks, such as

  • Analyzing the strengths and limitations of a primary source in social studies 
  • Modeling a math problem or scientific phenomenon 
  • Writing a short narrative about their own life 
  • Assessing their own fitness and making a plan to improve it in Health and PE. 
  • Identifying relevant evidence to support an argument  

The data on whether a student can achieve these more cognitively complex goals provides a useful counterpoint to the sometimes-limiting external assessment data. It’s not hard to imagine an emergent bilingual student whose MAP scores look low because she doesn’t have the language skills to comprehend the questions, but who is an excellent critical thinker who can analyze and model a scientific phenomenon.

Limitation:

  • Sometimes the common assessment data is confounded by factors like the wording of a question (so the student got confused and didn’t apply the skill being measured).

Together, these two types of data aim to provide a balanced view of what each student’s learning journey has been.  
Our job as teachers is to learn deeply about our students so we can meet them where they are and give them just the right amount of challenge to help them grow.
How can we help our students work in the zone of proximal development, that sweet spot for optimal growth, if we don’t know where they’re starting? 

Image from Verbal to Visual

So, a practical suggestion: 
After you do your first pre-assessment, put it side by side with the student view of the portal. (If you need help getting in, here is a step-by-step instruction manual!)

  • Does anything surprise you?   
  • What are this student’s areas of strength, and where might they need extra support?
  • Has this student been growing? 

Next, consider your actions based on the data.

  • Do you need to reteach an earlier skill? 
  • Create a small group?
  • Collaborate with a coteacher to target specific skills?
  • How might this data influence what you assign as homework (independent practice)?

How you follow up on the data is up to your professional judgement, requiring all the art and skill you have as a teacher.
I want to leave you with a caveat: a data portal will never tell you which student was the lead in the school play last year, which one worked for ages perfecting his design for a robotic animal, or which one loves reading graphic novels. Listening to your students and making that connection, combined with providing just the right amount of challenge—there’s nothing that can improve your students’ learning more effectively than that.

Data Doesn’t Have to Be Scary 

As we all know, data plays a significant role in improving teaching and student learning. However, data can look scary at times. There is always a lot of it, and it takes courage to dive in, figure out what the evidence can tell us, and let it guide our future work, so that data plays the role we believe it should.
How can we as a school use data wisely? We are now following the collaborative process described in Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning, pursuing effective actions for improvement.
To make sure collaborative work proceeds positively and effectively, we should set our norms at the very beginning and monitor our thinking process to maintain a relentless focus on evidence. In Data Wise, developed at Harvard University, assuming positive intentions, taking an inquiry stance, and grounding statements in evidence are recommended norms for team members working together. Something we’ve learned in our work is that human beings tend to see patterns, make inferences and judgments, and draw conclusions quickly from just a small amount of data. This is an important skill, but it can also be dangerous, leading us to misinformed judgments when we draw inferences too quickly from too little data. A mental model called “the Ladder of Inference” is especially useful in helping educators resist this tendency and stay focused on the evidence. The pattern: whenever I perceive something, I first select some data, then add interpretation, draw conclusions, and finally take action. The actions I take then influence what data I collect next. Given this loop, it can be very dangerous to go up the ladder too quickly, so it is better for all of us to use this model to help each other maintain a relentless focus on evidence.
Holding these norms and this mental model as we work and learn together, we have organized for collaborative work at ISB, and we are building assessment literacy in several ways: relevant books provided at OOL, staff training opportunities, visits from outside specialists, and more. For a quicker, clearer understanding of data, we have worked to visualize our data as well as we can, preparing graphs and charts instead of displaying large amounts of raw data in our meetings. Teachers are encouraged to answer questions like “What do I see? What do I notice? What do I wonder?” to better understand and interpret data based on the evidence.
With the foundation laid, collaborative work organized, and assessment literacy built, we are able to move on in our data-driven process. Instructional team leaders choose a focus area for the coming semester or school year; sometimes it has been set from the top. The focus area tells team leaders which data sources on the data inventory deserve attention first. They create a high-level data overview that shows how students are doing in the focus area; data from an annual performance assessment, for example, can be used here. In their meetings, they work together to see patterns, analyze the data, and find the story behind it. In the next meeting with the broader group of faculty, they present the data in charts or graphs so that everyone can make sense of it quickly. More importantly, teachers are given the opportunity to find their own meaning in the data. This overview data will never tell us how to improve learning and teaching on its own, but it sparks curiosity among teachers about why the data looks the way it does, and teachers can then identify a priority question as the focus.
Although the overview data teachers looked at in the previous step may have given some hints about where students are struggling, it is unlikely to have offered enough specific information to explain the reason for that struggle. So another step, digging into the student data, is important: it ensures that before jumping to conclusions about how to solve a problem, the teacher and the whole team have a clear sense of exactly what that problem entails. Teachers examine a wide range of student data, then come to a shared understanding of what the student data show. Finally, they identify a learner-centered problem and are ready to examine instruction.
During this process, teachers examine a wide range of instructional data. They are all clear about the purpose of their classroom observations, which is not to judge anyone but to find out what happened! Then they work together to reach a shared understanding. Throughout, teachers need to separate the person from the practice, remind themselves of the collaborative working norms, focus on the practice itself rather than who is doing it, and always stick to the facts, so that they can identify a problem of practice. A problem of practice is directly related to the learner-centered problem, is based on evidence found when examining instruction, and is within teachers’ control. It is a statement about practice, not a question. It is specific and small, so that teachers can develop an action plan to address it. There are several places to look for instructional strategies, such as rubrics for effective teaching, curriculum materials, external websites, and the expertise of instructional coaches who work in the school system.
Before jumping into action in the classroom, one more step is needed: making a plan to assess progress. This is where we specify the evidence of student learning we hope to see once the instructional strategy is in place. The plan to assess progress will provide the information we need when we move on to the last step – Act and Assess – and will help us measure the extent to which our instructional strategy is working, or not. We can choose assessments to measure progress and set student learning goals.
At last, we act and assess. We implement the action plan and the plan to assess progress, adjust the action plan as needed, and, importantly, celebrate the successes! Yet once we reach the last step, there is still more to do. Notice the shape of the Data Wise arrow: it points right back to step three, “create data overview.” Each time we begin a new cycle of inquiry, we as a team bring the experience gained from previous efforts and use it to take on new and more challenging problems of practice with greater skill and insight.
Isn’t that cool? When we work as a team and carry out the process step by step, data stops looking scary and instead guides us in improving teaching and student learning.
