By: Rachel Brown, Ph.D., NCSP
As students and teachers return to school after the holiday break, there is an opportunity to review each student’s progress toward year-end learning goals and determine if adjustments are needed. As noted in a prior blog, FAST™ assessments are designed to be used as part of a problem-solving process. For many students, the combination of core instruction plus intervention will result in the gains necessary to meet grade-level learning goals. Still, some students continue to struggle despite efforts to support their learning needs. This blog will focus on in-depth problem solving for students whose winter screening scores and progress data indicate they are very unlikely to meet the year-end goals. Many of the ideas included here are from the National Center on Intensive Intervention (NCII), an online technical assistance center funded by the U.S. Department of Education. In particular, the NCII recommends data-based individualization (DBI) to support students whose learning difficulties persist despite prior intervention. Three main considerations are reviewed here: (a) what is in place now? (b) what options have been tried? and (c) what other options could be tried?
First, the team should review what combination of intervention and progress monitoring is currently in place. Important questions are:
- Does the intervention match the skill need?
- Is the progress measure aligned with the intervention area?
- Was the intervention implemented correctly?
- Has the student attended regularly enough to receive the intervention as intended?
- Was the progress monitoring conducted correctly?
If the answer to any of the above questions is “no,” the next step is to adjust the current intervention, collect more data, and review the data at a specific future date. An important reminder about intervention accuracy relates to student attendance. If the student has frequent absences, the potential benefits of the intervention might not be observed. In order to conclude that an intervention does or does not work for a student, it must be provided at the frequency and duration intended (and, ideally, documented as effective in research). When students are absent, it is similar to a patient not taking the right “dose” of medication. The expected effects of instruction will not be the same if the student is present only part of the time. For students with frequent absences, it is best to maintain the intervention and continue to collect data until attendance is adequate to justify reviewing the progress data. When the answers to all of the above questions are “yes,” it is safe to conclude that the current intervention is probably not the right answer for this student and to engage in further problem solving.
Before moving on to any new and different interventions, it is important to find out what, if any, prior interventions have been tried. Sometimes this will be easy because the team will have the records readily available; in other cases, it will take more effort. Nonetheless, it is a good idea to learn what other efforts have been used in the past so that time is not wasted on approaches that were previously ruled out. Sources of information about prior interventions include the student’s cumulative school record, online databases with prior-year grades and progress reports, previous teachers, and parents. To make the task of gathering this information less cumbersome, team members can each agree to locate one source of information and then share it at the next meeting. Ideally, the next meeting will be within a week or two so that problem solving for this student can continue.
With information about the most recent and prior interventions in hand, the team can continue the problem-solving process. The five problem-solving steps are:
- Problem Identification
- Problem Analysis
- Plan Development
- Plan Implementation
- Plan Evaluation
In many ways, the team will need to work through each step of the process from the beginning, since the prior efforts did not work. Specifically, based on data from the last intervention, Plan Evaluation showed that the current plan is not working. That means returning to the Problem Identification stage. This step can be completed quickly because there is already evidence that the student continues to struggle. As noted by the NCII, an additional consideration when teams engage in renewed problem solving is whether more assessment data are needed. In the first iteration of problem solving, the team typically has screening scores and data from other indicators such as district or state tests and classroom assessments. In the next iteration, it might be helpful to conduct additional diagnostic assessments in order to be certain that the team is focusing on the correct “problem.” Considering diagnostic assessment is part of Problem Analysis but could be essential for Plan Development. Diagnostic assessments can be brief or extensive, depending on the student’s presenting needs. Brief academic assessments could be conducted by a special education teacher; more in-depth assessments might require the school psychologist or another specialist, such as a speech and language pathologist. Many districts have important rules about when and how diagnostic assessments can be used, and it is important to know these rules. In addition, such assessment might be required if the student is formally referred for special education services. The NCII recommends using selected diagnostic data as part of the process of adapting and improving interventions. The role of diagnostic data is to fill in gaps about the student’s learning performance so that the team has a complete understanding of what the student does and does not know. With or without diagnostic data, Problem Analysis leads to Plan Development.
The new plan should include an adapted or different intervention as well as a progress measure. In addition, the plan should specify the frequency and duration of intervention sessions and progress monitoring. Students for whom data-based individualization is required typically need intensive intervention and frequent monitoring. Both the NCII and FastBridge Learning® recommend weekly progress monitoring for students participating in intensive intervention. A final important detail for the revised plan is the date when the student’s progress data will be reviewed again. Most of the time, interventions require 9 to 12 data points to show effects. That said, it is important to look at the data more often to be sure that the intervention is having some positive effects. For this reason, reviewing data every 3 to 4 weeks is recommended.
Most, but not all, students who participate in effective core instruction coupled with supplemental (Tier 2) intervention will make gains toward year-end learning goals. Those who do not might require additional problem solving using methods such as DBI. The team will want to examine all of the details of the current intervention carefully and be sure it has been implemented correctly. If the intervention was implemented correctly and the student still struggled, the team is encouraged to review information about prior efforts to help the student in order to avoid repeating ineffective practices. Finally, the team should determine whether more data about specific aspects of the student’s skills are needed in order to understand the problem accurately. The next steps are to develop and implement a new plan that includes intensive intervention and frequent progress monitoring. For more information about DBI and intensive intervention, visit the NCII website at https://intensiveintervention.org/