The stakes are high when school districts assess students’ progress. Twice-yearly tests often don’t give teachers enough time to adjust their instruction for learners’ unique needs. A flawed test might give a child an inaccurate label that sticks for years.
A team of educational psychologists in CEHD led by associate professor Ted Christ has developed a suite of more accurate assessment tools that can be used more frequently. The Formative Assessment System for Teachers (FAST) includes Early Reading (earlyReading), Adaptive Reading (aReading), and Curriculum-Based Measurement of oral Reading (CBMReading), along with emerging assessments for mathematics. FAST builds on the work of curriculum-based measurement (CBM) pioneer Stan Deno and computer adaptive testing pioneer David Weiss.
The focus of this article is CBMReading, a simple set of procedures that teachers use to identify and set goals for underperforming students.
“A lot of our work relates to maximizing the quality of the data and maximizing the quality of the decisions about students that people make with the data,” says Christ, who has been working on the effort for 15 years. “It all goes back to whether you can create reading passages that yield consistent performances over time.”
To assess elementary students with FAST CBMReading, teachers have them read aloud from a passage for one minute and count how many words they read in that minute. Three times a year, teachers enter students’ scores into the system, which has a user-friendly, color-coded interface that shows which students have been tested, what they scored, and whether they are high-, low-, or no-risk. The program also lets teachers set goals for individual skill-improvement plans, and it offers charts that depict a student’s progress and how he or she compares to others.
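The one-minute procedure can be sketched roughly in code. This is an illustrative sketch, not the FAST implementation: CBM oral-reading scores are conventionally reported as words read correctly per minute, and the benchmark cut points below are hypothetical placeholders, not FAST’s actual thresholds.

```python
def words_correct_per_minute(words_read: int, errors: int, seconds: float = 60.0) -> float:
    """Words read correctly, scaled to a per-minute rate."""
    return (words_read - errors) * 60.0 / seconds

def risk_band(wcpm: float, low_cut: float = 40.0, high_cut: float = 70.0) -> str:
    """Classify a score against two (hypothetical) benchmark cut points."""
    if wcpm < low_cut:
        return "high-risk"   # well below benchmark
    if wcpm < high_cut:
        return "low-risk"    # approaching benchmark
    return "no-risk"         # at or above benchmark

score = words_correct_per_minute(words_read=62, errors=4)  # 58.0 WCPM
print(score, risk_band(score))
```

Entering a score into a system like this three times a year is what lets the interface color-code each student by risk band.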
The bulk of Christ and his team’s work has involved using the psychometric methods of field testing, linking, and equating to develop sets of reading passages that are equivalent in difficulty. That way, variability is removed from the test, which helps teachers get consistent scores to better evaluate whether a student is making progress toward their goal.
“What wasn’t working was that passages produced widely variable performances, and someone had to make decisions based on those data,” Christ says. “What we’re trying to do is wrangle out that variability. We’re applying more rigorous methods to develop passage sets comprised of truly equivalent difficulties.”
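The core idea behind equating passage difficulty can be sketched very roughly: field-test every passage with the same group of readers, then flag passages whose average score drifts away from the set’s overall average. The tolerance and data here are invented for illustration; the team’s actual work used formal linking and equating methods, which are far more rigorous than this.

```python
from statistics import mean

def flag_unequal_passages(scores_by_passage: dict[str, list[float]],
                          tolerance: float = 5.0) -> list[str]:
    """Return passage IDs whose mean WCPM deviates from the grand mean."""
    passage_means = {pid: mean(s) for pid, s in scores_by_passage.items()}
    grand_mean = mean(passage_means.values())
    return [pid for pid, m in passage_means.items()
            if abs(m - grand_mean) > tolerance]

# Hypothetical field-test scores from the same readers on three passages
field_test = {
    "passage_A": [54, 58, 56],   # mean 56
    "passage_B": [53, 59, 56],   # mean 56
    "passage_C": [40, 45, 44],   # mean 43 -- noticeably harder
}
print(flag_unequal_passages(field_test))  # ['passage_C']
```

Removing the flagged passages is what leaves a set in which a score change reflects the student, not the passage.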
Doug Marston (Ph.D., ’82), the Minneapolis Public Schools administrator for evaluation assessment in special education, says Christ’s research and updated assessment tools will help educators more effectively use data-based decision-making to identify students who need extra academic assistance. Using the response-to-intervention model, educators typically develop an evidence-based plan to help the student reach certain benchmarks. Schools then monitor how well the individual is responding to the intervention. Christ’s tools are especially effective because they give educators up-to-the-minute snapshots of an individual’s progress, Marston says.
“A lot of this work has been done at the University of Minnesota, beginning with Stan Deno and many others in the educational psychology department,” says Marston. “Many of these procedures were first researched and implemented at the U of M going back to the 1970s and early ’80s. Ted’s work is really helping to continue the good research on these approaches used for improved data-based decision-making.”
With a four-year grant from the Institute of Education Sciences, the research arm of the U.S. Department of Education, Christ’s team developed the FAST interface and wrote 360 reading passages—easy, middle, and hard sets— for first through sixth grades. They tested the passages on large groups of students in three different regions of the country. Eventually the team whittled down the number of passages to 80 by removing those that weren’t equal in difficulty. Now researchers from the U are monitoring students for 10- or 30-week periods and comparing the FAST results with results from other curriculum-based measurement systems.
So far, FAST has focused on reading. Eventually it will be expanded to writing, math, and spelling assessments. Ultimately, Christ wants to offer the tools to school districts nationally. FAST could save money and decrease the time teachers spend grading and logging scores from bubble-sheet tests.
“Other for-profit entities are distributing curriculum-based measurement tools and passages, and they charge $4 to $10 a child,” says Christ. “For a district with 10,000 students, that’s $40,000 to $100,000 a district would spend.
“We hope to distribute our tools—which will be better tools—at low or no cost,” he says. “We hope to develop a sustainability model to charge $2 a student to support existing research and development and save tremendous resources for school districts.”
Benjamin Silberglitt (Ph.D., ’03), director of software applications for TIES, a cooperative of school districts that assists them with educational technology, strongly backs Christ’s research.
“I think of Ted as the conscience of curriculum-based measurement,” says Silberglitt. “Ted is trying to say, ‘We have this great thing—let’s make sure it’s valid and that we’re very exacting about the assessment process.’ So when scores go up, we make sure we’re measuring students’ reading improvement and not just a difference in difficulty that introduced error into the measurement.”
The FAST tool will be especially useful for educators as they make decisions about students’ needs and education plans, giving them current data to evaluate.
FAST also allows schools to track students’ progress more frequently, even weekly when necessary, instead of relying on biannual tests.
“It works great as a screening measure and a progress-monitoring tool,” Silberglitt says. “It can tell after a few weeks that a strategy is working. And if not, the teacher can change those strategies and not have to wait until next year’s test to see that it didn’t work and we lost a whole year.”
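The “change strategies early” decision Silberglitt describes can be sketched as a trend-line check: fit a slope to the weekly scores and compare it to the growth rate needed to reach the goal on time. The function names and numbers below are illustrative, not part of FAST.

```python
def least_squares_slope(scores: list[float]) -> float:
    """Slope in WCPM per week, fit to equally spaced weekly scores."""
    n = len(scores)
    x_bar = (n - 1) / 2
    y_bar = sum(scores) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(scores))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return num / den

def on_track(scores: list[float], goal: float, weeks_remaining: int) -> bool:
    """Is the observed weekly growth enough to reach the goal in time?"""
    needed = (goal - scores[-1]) / weeks_remaining
    return least_squares_slope(scores) >= needed

weekly = [42, 44, 43, 46, 48, 49]        # six weeks of scores
print(least_squares_slope(weekly))       # ~1.43 words per week
print(on_track(weekly, goal=70, weeks_remaining=12))  # False: growth too slow
```

A teacher seeing `False` after a few weeks can change the intervention now rather than discover the lost year on the next annual test.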
With student outcomes on the line, teachers need access to accurate, time-sensitive, and easy-to-use tools to help all students achieve their full potential.
“There is substantial evidence that assessment is a really good predictor of performance on statewide tests,” says Christ. “We can identify students who might be identified with a learning disability in the next few years and try to do something different for that student’s educational program so that prediction is broken or their rate of learning is accelerated.”
Learn more about Formative Assessment System for Teachers (FAST).
Read more about Ted Christ, Department of Educational Psychology.
Story by Suzy Frisch | Fall 2012