Karen Shakman brings significant expertise in collaborative research, policy analysis, and evaluation. Most recently, her work has focused on advancing the field's knowledge of educator effectiveness systems and illuminating barriers and facilitators to sustaining K-12 education reforms. As the lead researcher for REL Northeast & Islands' Northeast Educator Effectiveness Research Alliance (NEERA), she supports educational leaders at the state and district levels who are managing the design and implementation of new educator evaluation systems, and works to deepen their understanding of program evaluation, research design, and data analysis. In this post, originally published on the REL Northeast & Islands' blog, she describes the experience of studying the alignment between teacher evaluation and professional learning within a large New England district and reflects on the effect of the study's findings at the state level.
When the Northeast Educator Effectiveness Research Alliance created its research agenda in 2012, its core planning group wanted to study educator evaluation not as an end in itself, but as a vehicle for ensuring that schools provide the best instruction for all students. Their view of educator evaluation is that it should provide targeted and specific support that helps educators do their best work in the classroom.
To advance this agenda, over the past few years fellow REL researchers and I have worked closely with a large urban district here in New England to study the alignment between teacher evaluation and professional learning—in a true collaboration between the district, the alliance, and the REL.
Our district colleagues wanted to learn to what extent the feedback that teachers received from the district’s new evaluation system aligned with the types of professional learning activities teachers subsequently engaged in to improve their skills and practice.
In 2012, we began by creating a data catalog of the district’s relevant data sources. We documented the data available to answer potential research questions, identified data gaps, and developed a set of potential research studies to focus on the alliance goal described above. The data catalog led to several discussions with the district about the type of data they collected and what research questions were and were not viable as a result. For example, while the district was very interested in making the link between teacher evaluation and participation in professional learning activities, the data catalog revealed that the district lacked reliable and consistent data about the range of professional learning activities in which teachers engaged.
For our district colleagues, this was a bit of a revelation. While they collected a wide range of data as part of the evaluation system, the lack of reliable data about professional learning really drove home for them the disconnect between the larger goals of educator evaluation as a vehicle for providing support and development and the types of data available to investigate these goals.
As one aspect of the resulting study, REL Northeast & Islands staff worked with the district to develop a survey they could administer that would gather more information about the range of activities in which teachers engage to address areas of need. The data from the survey served as a critical element of “Teacher Evaluation and Professional Learning: Lessons from Early Implementation in a Large Urban District,” in which we investigated the feedback teachers received and whether the subsequent actions they took to improve their practice aligned with it. By working collaboratively with our colleagues from the beginning, we were able to identify data to address their questions, uncover gaps in the data, and build the district’s capacity to explore priority issues.
One of the most exciting things about this study, and the relationship we’ve built with our colleagues in the district, is how much it is influencing the district’s evaluation and professional support systems moving forward. In response to the findings, the district has revised its training of evaluators to provide more guidance about the nature of good written feedback and how to link it to existing professional supports. The district has also increased the alignment of its professional development offerings with the standards and elements of the evaluation rubric.
By sharing the study as it unfolded, NEERA members have had the opportunity to reflect on data challenges and consider preliminary findings and the implications of these findings. After one session, a colleague at the state level requested a meeting between the district, REL researchers, and his own state educator effectiveness team to learn more about the data the district collects, the needs the district has related to ongoing support, and the approach the district is taking to further align evaluation and professional learning. The state team's intention was to incorporate lessons from the district’s work into the guidance they provide to districts across the state.
- Read about the study and its key findings, and browse all of the REL Northeast & Islands' publications.
- Check out a news story about the REL Northeast & Islands: "NH Commissioner: REL Northeast & Islands at EDC 'Completely Elevated' Manchester's Thinking about Education."
- Read publications co-authored by Karen: Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit; Redesigning Teacher Evaluation: Lessons from a Pilot Implementation; and An Examination of Performance-based Teacher Evaluation Systems in Five States.
- Follow the REL Northeast & Islands’ Twitter feed, subscribe to its newsletter, and explore its YouTube channel.
- Watch some of Karen’s past webinars: How to Facilitate the Logic Models Workshop for Program Design and Evaluation and Leveraging Educator Evaluation to Transform Teaching and Learning.
- View our researchers' sessions and symposia at the 2016 Annual Meeting of the American Educational Research Association (AERA).
- Learn more about all of our research and evaluation.