By: Jessie Kember, Ph.D.
Collecting screening data on three to four occasions throughout the school year, in addition to more frequent progress monitoring, allows educators to evaluate the growth that students involved in intervention have made over the course of the school year (and since intervention implementation). While a fall screening score informs initial intervention group placement, winter and spring screening scores tell educators whether the intervention placement is both appropriate and effective, allowing for data-informed decisions.
Like progress monitoring data, which are evaluated and interpreted using pre-determined decision rules, screening data are interpreted using established benchmark and normative scores. Despite this similarity, it is important to keep in mind that although screening always occurs at grade level, progress monitoring occurs at the student’s instructional level, which may or may not be at grade level. Therefore, to fully understand a student’s current level of functioning and progress throughout the school year, it is important to view both sources of data side by side. FAST™ provides a convenient report with this purpose in mind: the Student-at-a-Glance Report. This blog will unpack examples of de-identified student data using the Student-at-a-Glance Report in order to emphasize the importance and utility of comparing screening data with a student’s progress monitoring data.
To access the Student-at-a-Glance Report, there are two options:
- From the Home Page, select “Class Lists.” On the Class List page, each student’s name is a hyperlink that can be clicked to display that student’s Student-at-a-Glance Report.
- Select “Reporting,” then “Student-at-a-Glance Report.” From there, you may search for a specific student by name.
The Student-at-a-Glance Report provides both screening assessment results and progress monitoring results (if applicable for the student) across assessments, screening periods, and school years. The following data are for a fifth-grade student, Julia, with CBMreading scores over the last three years.
Data from the 2017-2018 school year are summarized on the right-hand side of the screenshot above (shaded in light orange). From these data, we can conclude that as a fifth-grade student during the 2017-2018 school year, Julia received the following scores on the CBMreading screening measure in fall, winter, and spring, respectively: 209 (above the 85th percentile), 113 (21st-30th percentile), and 221 (above the 85th percentile). Overall, Julia’s screening scores were not consistently at-risk. Selecting “Graph” next to the screening measure name (CBMreading) provides further detail in an Individual Benchmark Report. This report includes school, district, and national percentiles; growth percentiles at the school, district, and national levels; benchmark cut-off scores; and risk levels for the student across school years. Notably, Julia’s winter score of 113 was within the high-risk range. Below Julia’s screening results on the Student-at-a-Glance Report, progress monitoring results for the 2016-2017 (4th grade) and 2017-2018 (5th grade) school years are summarized.
At the bottom right corner of the preview, selecting “view details” opens the complete progress monitoring graph as well as individual scores, dates on which scores were collected, errors, notes, and interventions.
Examining Julia’s graphed scores allows us to observe patterns in the data. From this graph, we can identify the start date and duration of Julia’s intervention (and progress monitoring): 10/6/2017-1/8/2018. From Julia’s nine data points, we can conclude that her trend line slopes slightly downward. Below the graph, additional details are provided, along with visual conventions to aid interpretation. This information includes Julia’s total trend (-0.39) as well as her goal line (1.59), the dashed blue line that runs from her starting point to her end-of-year goal.
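Trend lines like this one are conventionally computed as an ordinary least-squares fit through the progress monitoring scores, while the goal line’s slope is the rise from the starting score to the end-of-year goal divided by the weeks available. The following sketch illustrates that arithmetic using hypothetical dates, scores, and goal values (not Julia’s actual data, and not FAST™’s internal code):

```python
from datetime import date

# Hypothetical weekly CBMreading scores (words read correctly per minute).
# These are illustrative values only, not Julia's actual data.
dates = [date(2017, 10, 6), date(2017, 10, 13), date(2017, 10, 20),
         date(2017, 10, 27), date(2017, 11, 3)]
scores = [120, 118, 121, 117, 116]

def slope_per_week(dates, scores):
    """Ordinary least-squares slope through (weeks, score) points."""
    xs = [(d - dates[0]).days / 7 for d in dates]  # weeks since first probe
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

trend = slope_per_week(dates, scores)

# Goal-line slope: rise from starting score to end-of-year goal.
start_score, goal_score = 120, 150  # hypothetical
weeks_to_goal = 20                  # hypothetical
goal_slope = (goal_score - start_score) / weeks_to_goal

print(f"trend: {trend:.2f}/week, goal line: {goal_slope:.2f}/week")
```

Comparing the two slopes is what supports the interpretation in the text: a trend below the goal-line slope means the student is not on track to reach the end-of-year goal.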
Some of Julia’s progress monitoring data are shown below.
From the screenshot above, we can see the dates on which progress monitoring scores were collected, the intervention delivered (Read 180), the interventionist, the scores obtained, the individual responsible for collecting the data, whether Julia’s scores were above, at, or below her goal, items that Julia answered incorrectly, and intervention notes. Overall, Julia performed at (within +/- 10% of the goal) or above her goal. Later in the intervention, however, Julia performed below her goal (symbolized by the downward red arrow).
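The above/at/below labels can be expressed as a simple rule: a score within +/- 10% of the goal counts as “at” the goal, and scores outside that band count as “above” or “below.” A minimal sketch of that rule, assuming this interpretation of the band (the function name and example values are illustrative, not FAST™’s actual logic or Julia’s data):

```python
def classify_score(score, goal):
    """Classify a progress monitoring score against the goal.
    Scores within +/- 10% of the goal count as "at" the goal;
    the exact band boundaries here are an assumption."""
    band = 0.10 * goal
    if score > goal + band:
        return "above"
    if score < goal - band:
        return "below"  # would display as the downward red arrow
    return "at"

# Hypothetical goal and scores.
goal = 130
for score in (145, 125, 110):
    print(score, classify_score(score, goal))
```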
These progress monitoring data from December and January mirrored Julia’s winter screening CBMreading score of 113 (21st-30th percentile). Despite these data points, Julia returned to a low-risk profile by the spring screening period, scoring above the 85th percentile (221).
In many cases, comparing screening and progress monitoring data is only one piece of a complex puzzle, as other sources of information provide a more comprehensive portrait of Julia’s overall reading achievement during the school year. For example, other sources of data may include classroom and intervention observations, attendance data, and scores from other standardized tests administered throughout the school year (e.g., state-wide testing results). For Julia, it may also be important to review data from previously delivered interventions. For example, from the Student-at-a-Glance Report, we can see that Julia also completed a reading intervention during the 2016-2017 school year as a fourth-grade student.
As a fourth-grade student, Julia received the following CBMreading screening scores for fall, winter, and spring, respectively: 108 (31st-85th percentile), 118 (21st-30th percentile), and 122 (20th percentile and below). Similar to her fifth-grade screening scores, Julia’s screening scores did not consistently place her in the at-risk range. Julia’s screening data as a third-grade student in the spring of 2016 (94; 20th percentile and below) likely informed her placement in a reading fluency intervention group beginning in October of her fourth-grade year. Her 2016-2017 progress monitoring data are shown below.
Julia participated in intervention between October 17, 2016, and May 19, 2017. Throughout this period, her progress monitoring data consistently indicated performance below her goal. Although Julia showed growth (total trend: 1.19), this growth was not sufficient and fell below her goal line (1.65). Unfortunately, her spring screening score (122), below the 20th percentile, mirrored the slow growth evidenced by her progress monitoring data. Looking at Julia’s data over time, it appears that she struggled in third grade, received intervention in fourth grade and the first half of fifth grade, and ended her fifth-grade year above the benchmark goal.
Overall, it is important to view multiple sources of data rather than interpreting data in isolated silos. For example, rather than reviewing progress monitoring data in isolation, it is beneficial to review these data points alongside a student’s screening data, as well as other relevant data. Most importantly, any decisions about intervention placement, or about changing or continuing an intervention, need to be based on multiple sources of data, including information from a student’s performance on screening and/or progress monitoring measures. Together, screening and progress monitoring data allow for data-informed decisions.