Success Factor 5: Data

Written by Sharon Thomas

November 15, 2018

Some topics in education are more divisive than others. Testing, grading practices, and suspension policies are always potential faculty battlegrounds, but even seemingly innocuous topics like the school calendar or the assignment of Spirit Days can sometimes be volatile. The topic of data (and especially how we use data) is one of those lightning-rod subjects.

Data Defined

“Data” is a huge subject, but public perception of that term as it applies to schools is often reduced to mean merely “test scores.” In reality, the types of data available in schools are many and varied, and each one has a role to play in improving learning for students:

  • Formative assessment items
  • Summative assessment items
  • Attendance information
  • Discipline referral information
  • Grades
  • Student conferences
  • Parent conferences
  • Medical information
  • IEPs
  • 504 plans
  • And on and on and on

Educators, depending on their role in a school system and on their personal mission and beliefs, value some data pieces more than others. The key is using as many relevant and necessary pieces of data as possible in a given situation to ensure that we meet student needs.

So why the controversy? Ultimately, not everyone agrees about which data are “relevant,” which are “necessary,” and, oftentimes, even which pieces of data are valid, reliable, and ethical. Additionally, important decisions that affect students are often made on the basis of one piece of data, and the likelihood that all of the educators involved believe that one piece of data is relevant, necessary, valid, reliable, and ethical is slim, to say the least. I know few teachers who think good decisions are made on the basis of a single piece of data.

Using multiple sources of data to make decisions is thus more effective in engaging teachers in change, but deciding which data to use is a challenge, whether goals are being set at the system, school, or classroom level. For instructional coaches, we organize the data they use with teachers to set goals for students into four categories.

Video

Classroom video is by far the most compelling data for helping a teacher set a classroom goal. Video is powerful because it has no agenda of its own: it just shows what the classroom looked like during the lesson from a different point of view.

Teachers see so much on video that differs from how they perceived the lesson while teaching it that they usually come up with multiple goals after watching just one classroom segment. Video is so rich for teachers that the coach often must help the teacher focus on a particular goal because there are so many things the teacher wants to change.

Student interviews

Asking students where they are struggling academically, whether they are engaged in class, or why particular behavior issues exist in the room can be a treasure trove for teachers and coaches. Student voices are often left out of the data mix when setting goals at any level, but the insight students provide about how they perceive lessons or each other can give the teacher perspectives that adults may never have considered.

When setting my own goal concerning classroom management several years ago, student interviews gave me better information for setting my goal than any of the observational data my coach had gathered, not because of anything she did but because the students understood the dynamics of the room better than either of us did. Their ideas showed me the way to improving the environment for everyone.

Student work

This one comes as no surprise. For years now, educators have used student grades, performance on classroom assignments and tests, and performance on large-scale standardized tests to analyze academic achievement. When working with teachers on academic achievement goals for students, coaches can assist teachers in analyzing student work to identify areas of misunderstanding, areas of strength, and areas to target for reteaching.

Student work is typically readily available in any classroom, but deciding which assignments are the most helpful in setting the goal and analyzing progress on the goal is key in making deep instructional change. Not relying on only one type of assignment to gauge progress is likewise critical.

Observation data

This one is last on our list of data for coaches because, even when a coach and teacher have a good rapport and a trusting relationship, observation data may always feel somewhat evaluative to the teacher: the experience closely mirrors being observed as part of an annual evaluation. That said, if the teacher would like the coach to observe for specific behaviors in the classroom as part of an engagement or behavior goal (or even for specific behaviors tied to an academic goal), then observation data can be helpful. (The Resources website for The Impact Cycle offers examples of data that coaches can collect in classrooms for different types of goals.)

Coaches can make an observation feel less evaluative. Meeting with the teacher beforehand to let the teacher explain exactly what he or she wants the coach to do during the observation keeps the teacher in the driver’s seat of the coaching process. Also helpful is something that my coach did after observing me: Instead of walking out of the room and taking the observation data with her (as evaluators would do), she left the data forms on my desk on her way out to show that the data were mine, not hers. That gesture was so powerful that, after she left the data on my desk for the first time, I trusted her completely.

ICG Certification: What Scoring Taught Us This Year about Data

The certification candidates who submitted the data portfolio entry this summer are extremely adept at collecting and managing data. Their forms and processes for collecting data were, overall, very strong. The area that sometimes presented problems was the goals they set: because some of the goals were unclear, it was hard to evaluate how well the data they included were aligned with the goal. We’ve seen goal writing as an area of challenge when working with workshop participants as well, so our workshops now include more time spent on structuring and phrasing goals.

Our goal-setting process is a variation on the common SMART goal framework, modified to include an important element of Jim’s coaching model: the research on how adults respond best to change and to receiving support. Our goal-setting acronym (because this is education, and you need a good acronym) is PEERS. The goal must be

  • Powerful for students (improving on the goal will aid them considerably in school and/or life)
  • Emotionally compelling for the teacher (the teacher cares about the goal and wants to work hard on it)
  • Easy to understand and to communicate (not “easy to achieve” necessarily)
  • Reachable (measurable: it has a clear target and a way to track progress toward that target)
  • Student-focused (starts with “Students will…,” not “Teacher will…”)

In the research on adults and change, the only real motivation for adults to make a deep and lasting change in their lives is a goal that is in line with their purpose in life, with the kind of person (or, in our case, the kind of teacher) they want to be. If adults care about the goal in that way, then they will “dig in” on the goal and work to achieve it even in the face of obstacles. So much professional learning fails because system or school goals are not emotionally compelling for teachers: teachers do not see the change as in line with their purpose as teachers, so implementation of instructional initiatives stays at the surface level and the amount of real change is negligible.

In addition, the phrasing “Students will…” not only keeps the focus on what teachers care about most (students); it also avoids implying that the teacher is in some way defective (as “Teacher will…” does), which helps the teacher feel powerful as a professional and part of a problem-solving team focused on outcomes for kids. Here are some examples of PEERS goals.

  • An average of 80% of students will self-report that they are authentically engaged in the lesson from the start of class on 5 data collections.
  • An average of 95% of students will use the TOWER mnemonic language to describe their suggestions each time they give meaningful feedback during peer review on 3 data collections.
  • Students will use no more than 5% of classroom time for transitioning to new instructional activities on 4 data collections.

Some submitted goals on the data portfolio entry were missing one or more parts of this framework, so tying their data directly to progress on the goal was difficult. Specificity in the goal helps the teacher know exactly what “success” looks like, and the goal can still be revised as necessary depending on what the teacher and coach learn during implementation. Ensuring that the goal is clear and that the data measures tied to it are clearly aligned prevents frustration during implementation because everyone knows the target, and the progress toward that target, at all times.

Wordsmithing goals can sometimes feel nitpicky or like a waste of time, especially when everyone wants to get started working on the goal. (I still feel some residual restlessness when I think back on how much time I spent in school improvement team meetings working on the wording of a mission statement instead of working on the actual mission.) But clarity matters, especially in the heat of implementing a new approach. Time spent aligning goals and data is time well spent, not only to improve instruction but also to lessen frustration for everyone.