By Dr. Bill Daggett, Founder, International Center for Leadership in Education

This blog post originally appeared on HMH’s Shaped blog on October 04, 2017

Educator efficacy and learning have long been studied, and new ideas or new takes on old ideas are routinely reviewed. While debate remains about what works and why, new, in-depth research from the Institute of Education Sciences (IES) confirms what we at the International Center for Leadership in Education have known from our own research: frequent, collaborative, rubric- and research-based classroom observations and feedback tied to student outcomes can advance teacher efficacy and student achievement (Wayne et al., 2016). We have also learned that students who engage more with their teachers perform better in class and on tests, care more about learning, and ultimately earn better grades and reach more of their long-term educational goals.

To that end, in the past two years, we have conducted 10,363 classroom observations using our Rigor, Relevance, and Learner Engagement Rubrics as part of our Collaborative Instructional Review (CIR) coaching process in more than 300 schools across the nation. The careful tracking and analysis of data has yielded a clear picture of where many public schools struggle and where there is progress.

Pitfalls of Common Teacher Improvement Approaches

We've uncovered areas of concern and gaps in efficacy in the most widely practiced methods of teacher improvement. Part II of this series will outline strategies for improving on these traditional methods:

Professional development

It’s not unusual for districts or schools to pour resources and time into professional development. But the mere acquisition of new skills does not guarantee their successful use in the classroom. Nor does it necessarily address the end goal of student learning. Information acquisition and application are two distinct skills. For teacher learning to elevate student learning, the research shows that follow-up coaching is key. Without it, the resources invested in professional development often yield little return.

Principal-led classroom observations

When applied under a specific set of circumstances, including a shared understanding of rubrics, observation can generate a solid return on professional learning. However, a review of teacher evaluation from the Brookings Institution found that when an observer knows the teacher or enters the classroom with preconceived notions, this bias can impair the objectivity needed to deliver worthwhile feedback.

Classroom observation has enormous potential to identify where a teacher’s practices are or are not driving rigorous and relevant learning in an engaging environment. But the biases that observers bring into the classroom can lead to unfounded, overly positive or negative appraisals of a teacher.

The report concludes that classroom observers from outside the building provide more objective and valid observation data.

Value-added analyses of test scores

Finally, to track and measure a teacher’s role in student learning, many schools rely on value-added analyses of state standardized test scores. It’s reasonable to assume that a set of test scores could speak to a teacher’s effectiveness, but that would be a mistake (National Education Association, 2010). Test scores alone lack the context needed to make sound inferences. It is a fact that some teachers have students in their classrooms who are difficult to teach, facing a challenging home life, or lacking at-level English skills. It is also a fact that some teachers have a classroom of gifted students. In either case, value-added scores will reflect distinct class circumstances and not accurately capture the teacher’s role in those scores.

Read more on the potential for impactful classroom observations, including details on ICLE’s Collaborative Instructional Review Process:

Classroom Observations Designed to Work: Better Instructional Support. Better Teachers. Better Student Outcomes. by Bill Daggett, Ed.D. and Linda A. Lucey, Ph.D.

Also, look out for Part II of our deep dive into this report right here on HMH’s Shaped blog.