Plan-Do-Study-Act (PDSA) Toolkit

A Resource for Schools entering the testing phase of the

Continuous Improvement Process

Table of Contents

  • What is Plan-Do-Study-Act (PDSA)?
  • The Plan-Do-Study-Act (PDSA) Worksheet
  • Diagram of a Plan-Do-Study-Act Cycle
  • Working Theory of Improvement
  • PDSA Quick Tips and Reminders
  • Suggested Roles and Responsibilities
  • Before You Begin a PDSA Cycle
  • STEP 1- PLAN
  • STEP 2- DO
  • STEP 3- STUDY
  • STEP 4- ACT
  • References
  • Glossary
  • Appendix- PDSA Example 1- Apple Orchard Elementary PDSA Worksheet


Plan-Do-Study-Act Worksheet

School:    Test Date and Timeframe:

Prioritized SMART Goal:

Change idea to test:

PLAN:

Briefly describe the test:

How will you know that this change idea is an improvement?

What do you predict will happen?

PLAN- What, Who, When, Where

List the tasks necessary to complete this test (What) | Person responsible (Who) | When | Where

PLAN- Data Collection

Type of Data | What data will be collected and what tool will be used for the measurement?

Process Measures - Measures how well a change idea is implemented.

Outcome Measures - Measures if the change idea achieved its goal.


DO:

Test the changes. Collect the data for the following (upload data or link to data in this section):

  • Process Measures
  • Outcome Measures
  • Balancing Measures - What were the unintended consequences of implementing the change idea?

STUDY:

Was the cycle carried out as planned? What happened during the testing phase?

What did you observe that was surprising?

What were the results? Did the results match your prediction(s)?

What did you learn?

ACT:

Decide to Adopt, Adapt, or Abandon

□ Adopt: Select changes to implement on a larger scale, develop an implementation plan, and plan for sustainability.

□ Adapt: Improve the change and continue testing. What plans/changes are you going to make for your next test?

□ Abandon: Discard this change idea and try a different one.


Working Theory of Improvement

While conducting a comprehensive needs assessment during Phase 1 of the continuous improvement process, your team developed a working theory of improvement and a driver diagram. A Working Theory of Improvement describes the structures and processes that the team believes need to be changed to meet an improvement goal, as well as specific actions to create these changes (Provost & Bennett, 2015).

The Driver Diagram is a method for organizing your Theory of Improvement and can be completed using the information collected during the comprehensive needs assessment process. It becomes a record of learning and a roadmap for intervention. Theories can change based on testing each change idea and learning from the experiences.

A driver diagram shows the relationship between the overall SMART goal of your improvement project, the primary drivers that directly relate to achieving the goal, the secondary drivers that are components of the primary drivers, and specific change ideas to test for each secondary driver. (IHI QI Essential Toolkit: Driver Diagram, 2017)

The driver diagram is where you will determine which change ideas to test. As you complete PDSA cycles, you will update the driver diagram accordingly, based on the data collected and the decisions made about how to move forward (e.g., adopt, adapt, or abandon).
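For teams that keep their driver diagram electronically, the sketch below shows one possible way to represent the goal-to-change-idea structure described above so it can be updated after each PDSA cycle. This is a minimal illustration, not part of the toolkit; the goal, driver, and change-idea names are hypothetical placeholders.

```python
# A minimal sketch of a driver diagram as a nested data structure.
# All names and values below are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChangeIdea:
    description: str
    status: str = "untested"  # later updated to "adopt", "adapt", or "abandon"

@dataclass
class SecondaryDriver:
    name: str
    change_ideas: List[ChangeIdea] = field(default_factory=list)

@dataclass
class PrimaryDriver:
    name: str
    secondary_drivers: List[SecondaryDriver] = field(default_factory=list)

@dataclass
class DriverDiagram:
    smart_goal: str
    primary_drivers: List[PrimaryDriver] = field(default_factory=list)

# Hypothetical example: record a decision for one change idea after a PDSA cycle.
diagram = DriverDiagram(
    smart_goal="Increase grade 3 math proficiency by spring",
    primary_drivers=[
        PrimaryDriver(
            name="Instructional practice",
            secondary_drivers=[
                SecondaryDriver(
                    name="Daily lesson closure",
                    change_ideas=[ChangeIdea("Math closure discussion questions")],
                )
            ],
        )
    ],
)
diagram.primary_drivers[0].secondary_drivers[0].change_ideas[0].status = "adopt"
```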


PDSA Quick Tips and Reminders

  • PDSA cycles during the testing phase are rapid and iterative. Not all change ideas warrant this level of detailed experimentation.
  • Change ideas need to be focused and measurable.
  • Collect data that directly measures the impact of the change idea.
  • Data can be qualitative and/or quantitative.
  • Measure the progress and outcome of the PDSA using a variety of data that is directly aligned with the goal and change idea, such as:
    o checklists or rubrics
    o surveys
    o observations
    o evaluations
    o classwork, homework, quizzes, tests, projects
    o state interim assessments

Suggested Roles and Responsibilities

Those closest to implementing the change idea for each PDSA should be involved with each step of the PDSA process.

Teachers and Coaches:

  • Provide input on change ideas to test
  • Participate in the development and implementation of the test
  • Collect data and participate in the analysis and study of the data

Principals and Coaches:

  • Provide input on the change ideas to test
  • Lead/participate in the development of the test
  • Supervise the implementation of the test
  • Lead/participate in the analysis and study of the data
  • Collaborate on next step decision making
  • Coordinate between school and Central Office

Curriculum Directors/Superintendents:

  • Manage the development and implementation of the PDSA cycles


Typically, Change Ideas originate from:

  1. Research Knowledge: What does the literature say about solving this problem?
  2. Practice Knowledge: What have other colleagues done to solve this problem?
  3. Design/Creative Thinking: In what new ways might we address this problem?

What change ideas should you test for a PDSA?

PDSA cycles conducted during the testing phase of the continuous improvement process are for the purpose of testing small-scale changes to build confidence in their efficacy prior to full implementation and scale. PDSA cycles conducted during the implementation phase are for the purpose of fully implementing the agreed-upon changes across contexts (once confidence in their efficacy is built during the testing phase). There is NO expectation regarding the number of PDSAs to complete or due dates for completion; this is determined by local data and context.

Determining an appropriate “grain size” for the change idea being tested is important. A change idea at an appropriate, testable grain size specifies concrete actions and behaviors, is measurable, and can be replicated consistently by more than one person at a time. Change ideas that are too small can waste time and resources. Change ideas that are too big are difficult to test efficiently and effectively.

What change ideas should you PDSA?

NO: A one-time professional development workshop or course.
YES: Testing and implementing instructional strategies learned at a professional development workshop or course.

NO: An instructional coach.
YES: Coaching cycles to help teachers implement or improve a specific instructional strategy.

NO: Trauma-informed teaching.
YES: Testing the lesson planning protocols developed by the instructional leadership team and behavior specialist team (e.g., the specific instructional practice/intervention…).


STEP 1- PLAN

The first step of the PDSA cycle is to make a Plan by assigning tasks, roles, and due dates. In this step you and your team will also make a prediction(s) about what you think will happen by implementing the change idea and determine how you will measure the success of the change idea both while you are conducting the test and after the test is complete.

PLAN- Describe the test and make predictions

Briefly describe the test: Summarize what your change idea is and how you plan to test it.

How will you know that this change idea is an improvement? Describe the process measures, outcome measures, and tools you will use to determine whether the change idea tested was an improvement. Sentence starter: “We will know that this change idea is an improvement because teachers will… / students will…”

What do you predict will happen? Write down your prediction(s) about what teacher and student actions or behaviors you believe will happen, or hope to see, by implementing this change idea. The predictions need to be measurable and observable so you can determine whether they were met at the end of the test. Sentence starter: “We predict that teachers will… We predict that students will…”

PLAN- What, Who, When, Where

Using the table below, map out the tasks necessary to complete the test, identify who is responsible for completing each task, and determine when and where each task should happen. Be sure to communicate this plan to everyone involved in testing the change idea.

List the tasks necessary to complete this test (What) | Person responsible (Who) | When | Where

Collecting Data

To determine the effectiveness of a change idea, it is important to identify methods to assess progress and monitor for unintended consequences along the way. Two measurement types can be used to maximize the effectiveness and efficiency of your team’s continuous improvement process.

  • Process Measures are used to determine whether successful implementation of a change idea is occurring before outcomes are known. These strategies can be monitored formatively, and approaches to change can be revised quickly (IHI, 2017).
  • Outcome Measures measure the intended result of your change idea.
    o Leading Outcome Measures: short-term formative or summative assessments (e.g., local assessment data, checklists, rubrics)
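If a team wants to keep its process, outcome, and balancing data together in one place during a test, the sketch below shows one minimal way to do so. It is an illustration only, not part of the toolkit; the measure names and numbers are hypothetical placeholders.

```python
# A minimal sketch of keeping PDSA measurement data in one place during the DO phase.
# All measure names and values are hypothetical placeholders.
from statistics import mean

pdsa_data = {
    "process_measures": {
        # How well was the change idea implemented? e.g., days the planned
        # routine was actually used out of days planned.
        "protocol_days_used": 9,
        "protocol_days_planned": 10,
    },
    "outcome_measures": {
        # Did the change idea achieve its goal? e.g., student responses per
        # lesson, recorded daily.
        "daily_student_responses": [4, 6, 5, 7, 8, 8, 9, 7, 10],
    },
    "balancing_measures": [
        # Unintended consequences noticed while testing.
        "Recording answers took longer than anticipated",
    ],
}

implementation_rate = (
    pdsa_data["process_measures"]["protocol_days_used"]
    / pdsa_data["process_measures"]["protocol_days_planned"]
)
avg_responses = mean(pdsa_data["outcome_measures"]["daily_student_responses"])
print(f"Implemented on {implementation_rate:.0%} of planned days")
print(f"Average student responses per lesson: {avg_responses:.1f}")
```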


STEP 2- DO

In the Do phase of PDSA you will carry out the test as planned and collect the data identified in the plan phase. Be sure to pay attention and make note of any unexpected or unintended results that arise from testing the change idea.

DO

Test the changes. Collect the data for the following (upload data or link to data in this section):

  • Process Measures: Include the raw data (tables, charts) or a link to the raw data.
  • Outcome Measures: Include the raw data (tables, charts) or a link to the raw data.
  • Balancing Measures: What were the unintended consequences of implementing the change idea? Include any unexpected or surprising data that may have occurred as a result of testing the change idea.

STEP 3- STUDY

In the Study phase of PDSA you will analyze the data collected during the Do phase. We recommend using a protocol for analyzing and synthesizing data with your team. One such protocol is provided below; you may also use any of the linked protocols or develop your own.

Collaborative Study for Continuous Improvement Protocol

Before you begin:

  • Identify three roles: Facilitator, Time Keeper, and Note-Taker
  • Facilitation Tips: Let the data speak by observing and noticing what you see before making any assumptions or inferences about what it means. Be sure to ask the quiet person what they think.
  • Depending on where the group is in the process, using the protocol may be awkward and uncomfortable, especially the first several times it is used.
    1. REVIEW the continuous improvement plan (3-5 minutes)
    2. PREDICT what you believe the data will reveal (2-5 minutes)
    3. EXAMINE the data independently (10 minutes)
    4. ASK clarifying questions about the data (5 minutes)
    5. OBSERVE what you see in the data without judgement or interpretation (10 minutes)
    6. INTERPRET/INFER what the data reveals (10-15 minutes)
    7. IDENTIFY lessons learned (5-10 minutes)

Links to data analysis protocols:

  • SRI Protocol- ATLAS Looking at Data
  • SRI Protocol- Data Mining Protocol
  • SRI Protocol- Looking at Data Sets: A Collaborative Inquiry and Problem-Solving Protocol
  • SRI Protocol- Data Driven Dialogue


  • Oakland Unified School District- Data Protocols
  • Data Wise Process and Free Online Course

STUDY

Was the cycle carried out as planned? What happened during the testing phase? Explain what happened during the testing phase and whether the test was carried out as planned, or whether changes were made and why. Sometimes people are not on the same page about their roles and responsibilities, and it is discovered during or after the test that different people were conducting the test differently; include that information in this section.

What did you observe that was surprising? Based on your balancing measures (see the DO section), what unexpected results, if any, presented themselves while testing the change idea?

What were the results? Did the results match your prediction(s)? State the results of the data analysis and explain how they relate to your prediction(s).

What did you learn? Discuss any reflections the group had about the process: what worked well and why, what did not work and why, realizations and a-ha moments while conducting the test, or any other lessons learned from testing the change idea. This reflection will help determine which direction to take in the next step, Act.
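For teams that collect simple counts (such as student responses per lesson), the sketch below illustrates one way to check, during the Study phase, whether an outcome measure moved in the predicted direction. It is an illustration only; the baseline and test values are hypothetical placeholders.

```python
# A minimal sketch of comparing an outcome measure against the Plan-phase prediction.
# Baseline and test values are hypothetical placeholders.
from statistics import mean

baseline_responses = [3, 4, 2, 4, 3]   # responses per lesson before the test
test_responses = [5, 6, 7, 6, 8]       # responses per lesson during the test

predicted_increase = True  # prediction made during the Plan phase

baseline_avg = mean(baseline_responses)
test_avg = mean(test_responses)
change = test_avg - baseline_avg

matched = (change > 0) == predicted_increase
print(f"Baseline average: {baseline_avg:.1f}, test average: {test_avg:.1f}")
print(f"Change: {change:+.1f} -> prediction {'matched' if matched else 'not matched'}")
```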

STEP 4- ACT

In the Act phase of PDSA, based on the data analysis conducted in the Study phase, the team will decide whether to adopt the change idea, adapt it and continue testing, or abandon it.

ACT

Decide to Adopt, Adapt, or Abandon

□ Adopt: Select changes to test on a larger scale, develop an implementation plan, and plan for sustainability. Discuss the implementation plan that will be used for broadening the scale of the change idea to ensure that it is done with fidelity.

□ Adapt: Modify the change and continue testing. What plans/changes are you going to make for your next test? Using the data analysis from the Study phase, determine what changes and improvements the team can make to the initial idea and outline a plan for how the team can test this new, adapted change idea.

□ Abandon: Discard this change idea and try a different one. Explain the reasoning behind abandoning this change idea. Choose a new change idea to test from the driver diagram and explain the decision behind choosing the new change idea.
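Teams that track change ideas electronically could record the Act-phase decision alongside the idea itself, as in the hypothetical sketch below. The decision labels mirror the adopt/adapt/abandon options above; all other names and values are placeholders, not part of the toolkit.

```python
# A minimal sketch of recording the Act-phase decision for a tested change idea
# so the driver diagram can be updated. All names and values are hypothetical.

DECISIONS = ("adopt", "adapt", "abandon")

def record_act_decision(change_idea: dict, decision: str, rationale: str) -> dict:
    """Attach the team's Act-phase decision and reasoning to a change idea."""
    if decision not in DECISIONS:
        raise ValueError(f"decision must be one of {DECISIONS}")
    change_idea["status"] = decision
    change_idea["rationale"] = rationale
    return change_idea

# Hypothetical usage:
idea = {"description": "Math closure discussion questions", "status": "untested"}
record_act_decision(
    idea, "adopt",
    "Response frequency and quality improved; scale to all math classes",
)
print(idea)
```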


Glossary

Balancing Measure: Used to test for unintended consequences of improvements (IHI, 2017).

Baseline Data: The initial performance data taken on a student; often the median score of three baseline data points or perception/survey data. The baseline serves as the reference point for all future data collection.

Benchmark (Periodic/Interim) Assessments: Assessments used to gather data several times a year and monitor students’ progress with respect to expected (benchmark) performance, over time.

Benchmarks: Content or developmental standards (levels, cut scores, targets, etc.) that describe sequences of growth that can be monitored over time. Usually measured three times per year (fall, winter, spring).

Change Idea: Evidence based actions for improvement that are related directly to secondary drivers and are intended to have positive outcomes toward meeting the goal.

Comprehensive Needs Assessment (CNA): A formal process for determining gaps between current conditions and desired outcomes. Needs assessments are used to identify goals for continuous improvement.

Continuous Improvement: An ongoing process of improving school practice based on assessed needs and informed by data. Often this process includes rapid learning cycles / Plan-Do-Study-Act Cycles.

Data-Based Decision Making: The ongoing process of analyzing and evaluating student data to inform educational decisions, including, but not limited to, approaches in instruction, intervention, allocation of resources, development of policy, movement within a multi-level system, and disability identification.

Driver: The various components of the system believed to have the greatest influence on your problem/goal.

Driver Diagram: The Driver Diagram is a method for organizing your Theory of Improvement and can be completed using the information collected during the comprehensive needs assessment process, becoming a record of learning and a roadmap for intervention. A driver diagram shows the relationship between the overall SMART goal of your improvement project, the primary drivers that directly relate to achieving the goal, the secondary drivers that are components of the primary drivers, and specific change ideas to test for each secondary driver. (IHI QI Essential Toolkit: Driver Diagram, 2017)

Improvement Science: The science of determining which improvement strategies work best, based strongly on evidence. http://www.carnegiefoundation.org/our-ideas/


Local Educational Agency (LEA): Districts and Supervisory Unions

Outcomes/Summative Assessment: Assessments that help teachers to evaluate and verify learning over time and may aid teachers in planning future instruction, informing classroom decisions (i.e., potential use of groupings), evaluating curricular changes, and making schoolwide decisions regarding curriculum and instruction.

Outcome Measure: The measure of the intended result of your change idea.

Primary Driver: Broad areas and components of the system that have the greatest influence on the problem/goal.

Process Measure: The measure used to determine whether the successful implementation of a change idea is occurring before outcomes are known. These strategies can be monitored formatively and approaches to change can be revised quickly (IHI, 2017).

Progress Monitoring (see also Benchmark and Formative): Data used to frequently check student progress towards success. Progress monitoring is used to assess students’ academic or behavioral performance and evaluate the effectiveness of instruction. Progress monitoring procedures can be used with individual students or an entire class.

Secondary Driver: Specific practices or components within identified primary drivers that influence a problem/goal.

SMART Goal: Goals for improvement should be specific, measurable, attainable, realistic, and timebound, describing what will be improved, by how much, by when, and for what/whom.

Theory of Improvement: A plan outlining actions necessary to achieve desired changes to reach your goal. It is usually written as an “If-Then” statement and/or displayed in a driver diagram. A Theory of Improvement describes the structures and processes that the team believes need to be changed in order to meet an improvement goal, as well as specific actions to create these changes (Provost & Bennett, 2015).


Appendix- PDSA Example 1- Apple Orchard Elementary PDSA Worksheet

PLAN- What, Who, When, Where

Task (What): 1. Create Teacher Perceptions Survey
Person responsible (Who): Principal and Math Teachers
When: 9.6.18
Where: Conference Room

PLAN- Data Collection

Type of Data | What data will be collected and what tool will be used for the measurement?

Process Measures - Measures how well a change practice is implemented.
  1. Teachers will record student responses in a checklist on the discussion template.
  2. Teachers will also take a survey about their experiences using the Math Closure Discussion Questions.

Outcome Measures - Measures if the change practice achieved its aim.
  3. The frequency and quality of student responses: MathWorks progress monitoring and Spring Benchmark; CLASS assessment.

DO:

Test the changes. Collect the data.

Record data:

  • Process Measures - How well were the change practices implemented? Are the specific practices performing as planned?
    o Both teachers implemented the discussion protocol and recorded answers each day but one.
    o It took longer than anticipated to record answers.
  • Outcome Measures - How is the system performing? How are the students performing? What are the results?
    o Overall, students increased the frequency with which they commented/answered questions.
    o The overall quality of student answers improved.
  • Balancing Measures - What did you observe that was not part of the plan?
    o More students would answer when they observed the teachers recording answers.

STUDY: Was the cycle carried out as planned? What happened during the testing phase?

  • There was a definite learning curve as the students learned they were supposed to explain what they had learned from the day, and that it was OK to agree with or repeat what their neighbor said.
  • Some days students were very engaged and wanted to answer questions; other days they were not engaged at all.
  • One of the two teachers did not notice a pattern to student answers.

What did you observe that was surprising?

  • Even though there were days that students stated that they didn’t learn anything new, the teachers still stuck with the protocol and continued asking questions.
  • Students responded more when they noticed the teachers writing down their answers.


What were the results? Did the results match your prediction(s)?

  • The frequency of student responses increased, and teachers were better able to gauge their learning.
  • Quality of responses improved.
  • There was also increased justification behind responses.
  • Yes, results matched predictions.

What did you learn?

  • The conversation needs to be modeled more and practiced more.
  • As students become more comfortable with the process, teachers expect to see more students answering.
  • As students were more comfortable, they became more comfortable disagreeing.

ACT:

Decide to Adopt, Adapt, or Abandon

X Adopt: Select changes to implement on a larger scale, develop an implementation plan, and plan for sustainability.

□ Adapt: Improve the change and continue testing. What plans/changes are you going to make for your next test?

□ Abandon: Discard this change idea and try a different one.