MCAS: It's a great test, but...
A great diagnostic test shouldn't be used as a barrier to graduation.
If you’re driving down the Massachusetts Turnpike, you may see a tractor in the median with a mowing attachment cutting the grass. This lawn tractor is well-designed for its task. While you could hop on and travel west until you reach Seattle, it’s not a very good choice. There are other four-wheeled vehicles better designed for a cross-country trip.
The same is true for the MCAS. Using the test as a graduation requirement is the statistical equivalent of driving a lawn tractor from Boston to Seattle.
The MCAS was built as a diagnostic, and it serves that purpose well. When the full data set is in the hands of skilled educators, it can effectively point to places where curriculum or instruction is not aligned to the standards. You can see where classrooms or schools have lower writing scores compared to multiple-choice scores. You can see where classrooms aren't focusing on informational (non-fiction) text when there is a significant gap from literature (fiction) scores. You can see if your eighth grade is placing too much emphasis on life science and not enough on physical science.
The problem with the MCAS comes when you try to use it for another purpose. The strengths that make the MCAS an excellent diagnostic make it a poor choice for a graduation requirement. The problem is that scores generated for individual students are filled with statistical error. The concept of statistical error assumes an underlying true score; because the measurement tool is imprecise, each administration produces an observed score that varies around that true value. Just as a political poll becomes a better indicator as more people are added to the sample, the value of the MCAS comes from its administration to almost all students at a given grade level.
There are multiple sources of statistical error when using the MCAS to evaluate individual students. Consider two students earning the same high grade in the same math class in the same school. Student 1 had three hours of sleep because the beloved family dog died at midnight, and the student had to walk a half mile to school in the rain. Student 2 went to bed happy, got eight hours of sleep, a good breakfast, and a ride to school. Students, and the conditions under which they take the test, introduce statistical error that can produce widely divergent scores. When you look at the 70,155 students who took the tenth-grade math test, individual differences tend to cancel each other out, and the average score converges on an accurate measure of statewide achievement.
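You can watch that cancellation happen in a few lines of simulation. This is a minimal sketch with made-up numbers (a hypothetical true mean and noise level, not actual MCAS data): each observed score is a true score plus random test-day error, and while any one student's score can land far from the truth, the mean across roughly 70,000 test-takers barely moves.

```python
import random

random.seed(42)

N_STUDENTS = 70_155   # tenth-grade math test-takers cited above
TRUE_MEAN = 500       # hypothetical true average score (illustrative only)
NOISE_SD = 25         # hypothetical test-day noise: sleep, stress, luck

# Each observed score is the true score plus random test-day error.
scores = [random.gauss(TRUE_MEAN, NOISE_SD) for _ in range(N_STUDENTS)]

# A single student's observed score can miss the true value by a wide margin...
print(f"one student's score: {scores[0]:.1f} (true value: {TRUE_MEAN})")

# ...but averaging over all test-takers shrinks the error by a factor of sqrt(N).
mean = sum(scores) / N_STUDENTS
print(f"statewide mean: {mean:.2f}")
print(f"standard error of the mean: {NOISE_SD / N_STUDENTS ** 0.5:.3f}")
```

The same noise that makes one student's score swing by twenty-five points moves the statewide average by a tenth of a point.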
In addition, the selection of questions brings error into the testing process. There are hundreds if not thousands of good questions that can be asked about geometry. In 2023, the tenth-grade MCAS asked only 15 questions about geometry. Geometry questions represented 21 of 60 possible raw score points on the entire test, and ten of those questions were multiple-choice. Again, a small sample of questions across a broad population of students paints a good picture of the population but isn't an accurate picture of an individual student's competency. Lucky students can gain points by guessing on multiple-choice items. Second-language learners can get tangled up in the wording of a complicated question. Whole classrooms can perform extremely well if a concept is taught the week before the test, while the classroom next door can perform poorly if that lesson is taught the day after the test.
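The guessing effect is easy to quantify. Here is a rough sketch assuming four answer choices per multiple-choice item (an assumption for illustration; the figures above give only the item counts): a student who guesses blindly on the ten multiple-choice geometry items still picks up raw points by chance alone.

```python
import random

random.seed(1)

N_MC_ITEMS = 10     # multiple-choice geometry items cited above
N_CHOICES = 4       # assumed answer choices per item (an assumption)
TRIALS = 100_000

# Simulate a student who guesses blindly on every geometry item:
# each guess is correct with probability 1 / N_CHOICES.
def points_from_guessing() -> int:
    return sum(random.randrange(N_CHOICES) == 0 for _ in range(N_MC_ITEMS))

results = [points_from_guessing() for _ in range(TRIALS)]

average = sum(results) / TRIALS
lucky_streaks = sum(r >= 5 for r in results) / TRIALS
print(f"average raw points from pure guessing: {average:.2f} of {N_MC_ITEMS}")
print(f"chance of scoring 5 or more by luck: {lucky_streaks:.1%}")
```

Under these assumptions, pure guessing yields about 2.5 raw points on average, and roughly one student in thirteen gets five or more by luck. Across the state this noise washes out; for the one student sitting at the cut score, it can decide whether they graduate.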
The test is very good at discovering patterns and trends among students across the state. It’s unfair and unreasonable to set a cut score and deny a high school diploma based on a test that was not designed to evaluate individual outcomes.
Supporters of the MCAS graduation requirement often argue that eliminating the testing requirement will lower standards. The problem with this assertion is that the MCAS, or any other measurement tool, is just an indicator. The actual standards contain extensive descriptions of student achievement over four years of high school.
Massachusetts graduation requirements exceed the tested standards. Arlington High School is not unusual in requiring three years of mathematics, three years of science, three years of history and social science, and four years of English for graduation.
The Department of Elementary and Secondary Education (DESE) collects extensive data about public school students. DESE connects to the student information systems of every public school district and charter school and collects attendance, grades, and course descriptions, all tied to the teacher of record for each course. The state has this data for every public school student, tied to a state-assigned student number, and it tracks students as they move between schools and districts. The state can tie this data to every MCAS question answered by a student over an entire academic career.
DESE has an excellent Office of Planning and Research, and they employ some very talented statisticians. DESE has the capability to look at data patterns that can place the burden of accountability for meeting standards on schools instead of individual students.
The state has built a sophisticated, high-performance system of testing and data collection capable of finding patterns where MCAS scores don't align with graduation standards and reported performance measures. With such a high-performance vehicle, we don't need to ride the lawn tractor to reach our destination.