Observation Tool for Instructional Supports and Systems: Empowering Teachers as Instructional Leaders

October 29, 2021

When teachers are viewed as leaders and co-creators of the support systems needed to use non-evaluative classroom observation data to strengthen instructional practice, the result is commitment to and ownership of a scalable, sustainable process (Hattie & Zierer, 2017; Ryan Jackson, Bailey, Dilts-Pollack, Williams, & Waldroup, 2021; Ryan Jackson et al., 2018, 2021).

The research is clear: the quality of teachers' instructional practice in the classroom is the largest factor in closing long-standing deficits and disparities in student outcomes (Harper, 2019; Hattie, 2003). Yet few evidence-based practices and programs include a measure of instructional practice that is non-evaluative and provides prompt, usable feedback that teacher teams can use to set their school-wide goals for instructional improvement (Allor & Stokes, 2017). To address this need, our team, in collaboration with several research and education partners, created the Observation Tool for Instructional Supports and Systems (OTISS).

A Brief History and Description of the OTISS

Why was the OTISS developed, and what is its purpose?

"You really meant it is the system and not me." -Teacher

The OTISS was developed in response to the need for a research-based observation measure of instruction that could be used with any practice, program, curriculum, or grade level. It was designed specifically for use when the selected evidence-based practice or program does not have its own measure of fidelity or integrity. It was purposefully not designed for evaluation. Rather, we recommend using the OTISS to assess the quality of the support systems needed to enhance teacher practice. For example, if a school's aggregated OTISS data are low on the item "provides prompt and accurate feedback," then training and coaching supports are developed or strengthened to increase teachers' skill with, and application of, equitable feedback strategies in the classroom so that all students benefit.
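To make that decision rule concrete, here is a minimal sketch in Python. The item names, percentages, and 75% threshold are illustrative assumptions, not part of the OTISS or the SISEP.org platform.

```python
# Minimal sketch of the decision rule described above; all names and
# numbers are illustrative assumptions, not OTISS or SISEP.org artifacts.

# School-level aggregates: each item's mean score as a percent of the maximum.
school_aggregates = {
    "provides prompt and accurate feedback": 52.0,
    "states a clear learning goal": 81.0,
}

LOW_THRESHOLD = 75.0  # assumed cut-off a team might choose

def items_needing_support(aggregates, threshold=LOW_THRESHOLD):
    """Return items whose school-wide aggregate falls below the threshold,
    i.e., where training and coaching supports should be developed or
    strengthened."""
    return [item for item, pct in aggregates.items() if pct < threshold]

for item in items_needing_support(school_aggregates):
    print(f"Strengthen training and coaching supports for: {item}")
```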

What is the research basis of the OTISS?

The OTISS comprises seven items derived from Hattie's (2009) findings on factors that influence academic achievement. To be included, an item had to describe a teacher behavior that could happen at any time during instruction (not time-specific), would occur consistently enough to be seen during a 10-minute observation, would happen at any grade level and in any content area (not content- or grade-level-specific), and was correlated with improved student achievement. All seven OTISS items have demonstrated a medium to high effect size (ES > .40; Hattie, 2009).

How are OTISS data collected?

Observers complete 10-minute observations and score each OTISS item as 2 (fully observed), 1 (partially observed), 0 (not observed), or NA (no opportunity to observe). Each school co-creates the process for collecting and using the data, which are entered and stored on the SISEP.org data platform for monthly use by teams. The school and its teachers determine how the data will be used. Some use only aggregate data to set school-wide goals. In other schools, where instructional coaches have trusting, confidential working relationships with teachers, a teacher may ask the coach to review their individual data so the two can work on individual goals in confidence. Again, the OTISS was not designed for evaluation. All observers are trained and must reach inter-observer agreement to ensure unbiased observations and reliable data. We recommend that observers recalibrate their scores at least once a year so that observations stay consistent from classroom to classroom, school to school, and district to district.
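To illustrate how school-level aggregates might be computed from raw observations, here is a minimal sketch. It assumes a simple list-of-dicts data shape and invented item names; the actual SISEP.org platform's storage and reporting are not shown.

```python
from statistics import mean

# Each 10-minute observation maps an OTISS item to a score:
# 2 = fully observed, 1 = partially observed, 0 = not observed,
# None = NA (no opportunity to observe). Item names are invented.
observations = [
    {"provides prompt and accurate feedback": 2, "states a clear learning goal": 1},
    {"provides prompt and accurate feedback": 0, "states a clear learning goal": None},
    {"provides prompt and accurate feedback": 1, "states a clear learning goal": 2},
]

def school_aggregate(obs_list):
    """Mean score per item as a percent of the 2-point maximum. NA scores
    are excluded so 'no opportunity to observe' does not lower an item."""
    scores_by_item = {}
    for obs in obs_list:
        for item, score in obs.items():
            if score is not None:
                scores_by_item.setdefault(item, []).append(score)
    return {item: 100 * mean(s) / 2 for item, s in scores_by_item.items()}

print(school_aggregate(observations))
# {'provides prompt and accurate feedback': 50.0, 'states a clear learning goal': 75.0}
```

Excluding NA scores, rather than counting them as zeros, keeps the aggregate a measure of what was observable instead of a penalty for lessons that offered no opportunity to observe an item.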

How do teachers use the OTISS data?

Teacher teams, with the support of a coach, use aggregate data to identify their school-wide goals for improvement. A common process is for teachers to study effective instructional practice with videos of instruction (examples and non-examples) using lesson study methodology (Murata, 2011): teachers iteratively design a lesson together, analyze the effectiveness of its delivery, make needed improvements to the design, and cycle again. As teachers improve instruction, the teacher-to-teacher variance in instructional practice should shrink, giving students more equitable access to effective instruction from teacher to teacher and school to school.
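One simple way to check that variance claim, sketched below with invented classroom-level scores: as instruction improves school-wide, the mean aggregate OTISS score should rise while the teacher-to-teacher spread (standard deviation) shrinks.

```python
from statistics import mean, pstdev

# Invented classroom-level aggregate OTISS scores (percent of maximum)
# for the same school at two time points.
fall = [45.0, 90.0, 60.0]
spring = [75.0, 92.0, 85.0]

for label, scores in (("fall", fall), ("spring", spring)):
    print(f"{label}: mean = {mean(scores):.1f}, spread (SD) = {pstdev(scores):.1f}")
# fall: mean = 65.0, spread (SD) = 18.7
# spring: mean = 84.0, spread (SD) = 7.0
# A rising mean with a shrinking SD is the pattern predicted above:
# more consistent, more equitable access to effective instruction.
```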

How do implementation teams use the OTISS data?

Teams at the school and district level (inclusive of teachers, coaches, specialists, and leaders with decision-making authority who champion the work) analyze data monthly to identify a problem and a solution, set a goal and a measure of effectiveness, and develop a plan and timeline that names the people responsible for each action item. They use an effective team meeting protocol in Plan-Do-Study-Act cycles together with easy-to-access implementation and outcome data; see an example from Kentucky. Using a linked teaming structure, teacher teams identify barriers (e.g., the need for increased funding so coaches can model in the classroom), and school and district teams deliver the supports and resources school staff and teachers require (e.g., guaranteed, protected time for coaching).
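As a sketch of the monthly team record described above (an assumed structure, not an official SISEP or OTISS template), each cycle can be captured as a problem, solution, goal, measure, and a set of owned action items:

```python
from dataclasses import dataclass, field

@dataclass
class ActionItem:
    task: str
    owner: str  # person or team responsible
    due: str    # timeline, e.g. "2021-11-15"

@dataclass
class PDSACycle:
    problem: str   # identified from the monthly data review
    solution: str
    goal: str
    measure: str   # how effectiveness will be judged
    actions: list = field(default_factory=list)

cycle = PDSACycle(
    problem="Aggregate OTISS data low on 'provides prompt and accurate feedback'",
    solution="Coach-led modeling of feedback strategies in classrooms",
    goal="Raise the school-wide item aggregate above an agreed target by spring",
    measure="Monthly OTISS aggregate for the feedback item",
    actions=[ActionItem("Guarantee protected coaching time", "District team", "2021-11-15")],
)
print(cycle.goal)
```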

“The lack of predicted outcomes in large scale studies point to lack of attention to providing implementation supports for fidelity of the use of innovations as the problem… Fidelity matters when attempting to improve outcomes and the investment must be made in Implementation Teams who have the expertise to improve fidelity and, therefore, outcomes” (Fixsen, Van Dyke, & Blase, 2019, pp. 10-11).

Let’s look further at what we are hearing and learning from our users of the OTISS.

Examples from Practice

"Implementation with fidelity makes change more sustainable... We are beginning to get data linking outcomes to use of math instruction with fidelity." -Teachers and Staff

Leaders in the Commonwealth of Kentucky are mission driven. Their focus is to evaluate the support systems provided to teachers to facilitate implementation of practices with a high degree of fidelity, improve outcomes, and close persistent education gaps (Ryan Jackson & Waldroup, 2018). To accomplish this goal, they started small, in four diverse district contexts (e.g., large urban, suburban, and small rural), to develop buy-in for the collection and use of instructional observation data for improvement using the Observation Tool for Instructional Supports and Systems (OTISS; Fixsen et al., 2020). Their purpose for adopting the OTISS was twofold:

  1. to give teachers non-evaluative observation data they can use to identify their school-wide goals for improvement, and
  2. to give implementation teams data to use in Plan-Do-Study-Act cycles to critically evaluate the systems of training and coaching that teachers identify as critical to meeting their school-wide instructional goals.

Listen to the SISEP podcast episodes on the OTISS to learn how teachers are supported to use the OTISS for continuous improvement in the Commonwealth of Kentucky. Teachers, coaches, and executive leaders in the Commonwealth share specific examples of how they use the OTISS and Kentucky's mathematics observation measure. The podcast series provides exemplars of how to close the research-to-practice gap and how Kentucky integrates the sciences of implementation and improvement. The National Implementation Research Network and Kentucky were honored by the Carnegie Foundation for Quality in Continuous Improvement in 2019.


Related Resources

Fidelity Assessment: This brief defines fidelity assessment as indicators of doing what is intended. The definition requires (a) knowing what is intended and (b) having some way of knowing the extent to which a person did what was intended. When evidence-based approaches or other effective innovations are used in education, fidelity assessments measure the presence and strength of an innovation as it is used in daily practice.

Communication Protocol and Linking Teams: The AI Hub lesson Communication Protocol - Linking Teams explains that, to be effective and to include all appropriate levels, communication must be strategically planned and consciously monitored. The lesson introduces a tool for creating a communication plan between linked teams.

The PDSA Cycle: AI Hub Lesson 6 describes how to employ and document the key components of each phase of the Plan-Do-Study-Act (PDSA) cycle and explains the importance of using iterative PDSA cycles.
 


References

Allor, J., & Stokes, L. (2017). Measuring treatment fidelity with reliability and validity across a program of intervention research: Practical and theoretical considerations. In V. Roberts, S. Vaughn, S. N. Beretvas, & V. Wong (Eds.), Treatment fidelity in studies of educational intervention (pp. 138-155). Routledge.

Fixsen, D. L., Van Dyke, M., & Blase, K. A. (2019). Implementation science: Fidelity predictions and outcomes. Active Implementation Research Network. www.activeimplementation.org/resources.

Fixsen, D. L., Ward, C. S., Ryan Jackson, K., & Chaparro, E. (2020). Observation Tool for Instructional Supports and Systems (OTISS): Walk through and observation form. State Implementation and Scaling-up of Evidence-based Practices Center, National Implementation Research Network, University of North Carolina at Chapel Hill. 

Harper, F. K. (2019). A qualitative metasynthesis of teaching mathematics for social justice in action: Pitfalls and promises of practice. Journal for Research in Mathematics Education, 50(3), 268–310. https://doi.org/10.5951/jresematheduc.50.3.0268

Hattie, J. (2003, October). Teachers make a difference: What is the research evidence? Paper presented at the ACER Research Conference, Melbourne, Australia.

Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge.

Hattie, J., & Zierer, K. (2017). 10 mindframes for visible learning: Teaching for success. Routledge.

McColskey-Leary, C. (2021, September 29). Implementation champions as a strategy to build capacity. State Implementation and Scaling-Up of Evidence-Based Practices Center.

Murata, A. (2011). Introduction: Conceptual overview of lesson study. In L. C. Hart, A. S. Alston, & A. Murata (Eds.), Lesson study research and practice in mathematics education (pp. 1-12). Springer.

Ryan Jackson, K., Bailey, D., Dilts-Pollack, A., Williams, D., & Waldroup, A. (2021, October 19). Utilizing OTISS as a fidelity measurement (No. 16) [Audio podcast episode]. In Implementation Science for Educators. State Implementation and Scaling-Up of Evidence-Based Practices Center. https://anchor.fm/sisep-center/episodes/Tip-16-Utilizing-OTISS-as-a-Fide...

Ryan Jackson, K., Fixsen, D., Ward, C., Waldroup, A., & Sullivan, V. (2018). Accomplishing effective and durable change to support improved student outcomes [White paper]. National Implementation Research Network, University of North Carolina at Chapel Hill.

Ryan Jackson, K., Gau, J., Smolkowski, K., & Ward, C. (2021). Improved mathematics outcomes using active implementation: Kentucky's effective and durable change [Brief]. National Implementation Research Network, University of North Carolina at Chapel Hill.

Ryan Jackson, K. M., & Waldroup, A. (2018). Improving by bringing together improvement and implementation science. Spotlight on Quality in Continuous Improvement. Carnegie Foundation. https://www.carnegiefoundation.org/wp-content/uploads/2018/12/Carnegie_S...

SISEP Center. (2021, January 30). Equity in implementation: Leveraging initial implementation to challenge systemic inequities [eNote]. State Implementation and Scaling-Up of Evidence-Based Practices Center.