Should the MCAS be making a comeback?

news
Author

Mia Mielke

Published

Feb 19, 2026

Importance of the MCAS

Are the last few supporters of the MCAS right? Should the test make a comeback?

The MCAS, or Massachusetts Comprehensive Assessment System, grew out of the 1993 Education Reform Law, which created a formula to determine funding for schools across Massachusetts with the goal of equalizing students’ access to education, as stated in a 2003 report by the UMass Amherst School of Education. The focus on these assessments has left students feeling that their schools treated standardized testing as their north star.

“I wasn’t allowed to miss MCAS prep to participate in other school activities that I wanted to… there was always this expectation to do well on the test, I just wanted to get it over with,” said Sophia Rubin, a student in eastern Massachusetts, about her experience with the MCAS in elementary and middle school.

Spending in many schools across Massachusetts became tied to test scores, yet students benefit from spending that is not directly attached to academics and test scores, as Jie Chen and Thomas Ferguson report in School District Performance Under the MCAS.

“The MCAS are not a really accurate indicator of who’s using school resources especially because social emotional learning is a huge factor in every student’s day to day … cutting those resources affects everyone detrimentally,” said Kate Whiner, a student teacher in local Northampton schools. She added that “bad” test takers, who are often the students who need non-educational resources the most, are commonly discouraged from taking the tests for fear that the school’s and district’s average test scores would suffer.

Code
library(tidyverse)

df_pk6_2024 <- read_csv(here::here("posts","html","mm_first_story","School_Expenditures_by_Spending_Category_20260208.csv"))

names(df_pk6_2024)
 [1] "SY"             "DIST_CODE"      "DIST_NAME"      "ORG_CODE"      
 [5] "ORG_NAME"       "GRADES_SERVED"  "IND_CAT"        "IND_SUBCAT"    
 [9] "IND_VALUE"      "IND_VALUE_TYPE"
Code
nrow(df_pk6_2024)
[1] 16112

Citations: tidyr 1.3.2 (https://tidyr.tidyverse.org/articles/pivot.html) was used for information on pivoting a data set

Filtering the data to only the MCAS performance, student headcount, and district-reported expense indicators

Code
df_analysis_long <- df_pk6_2024 %>%
  filter(
    (IND_CAT == "Sub-Total A" & IND_SUBCAT == "District Non-Instructional Expenditures") |
    (IND_CAT == "Sub-Total B" & IND_SUBCAT == "District-Level Instructional Expenditures") |
    (IND_CAT == "Sub-Total C" & IND_SUBCAT == "School-Reported Instructional Expenditures") |
    (IND_CAT == "MCAS Performance" & IND_SUBCAT == "Math Grades 3-8 % Meets or Exceeds") |
    (IND_CAT == "Student Demographics" & IND_SUBCAT == "Student Headcount")
  )
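As a quick sanity check on the filter above, counting the surviving rows per indicator confirms that all five kept indicators actually appear in the data (a sketch, assuming the `df_analysis_long` object from the previous block):

```r
# How many rows does each kept indicator contribute?
df_analysis_long %>%
  count(IND_CAT, IND_SUBCAT)
```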

Pivoting wider so that each school is one row

Code
analysis_df <- df_analysis_long %>%
  select(DIST_CODE, DIST_NAME, ORG_NAME, IND_SUBCAT, IND_VALUE) %>%
  pivot_wider(
    names_from = IND_SUBCAT,
    values_from = IND_VALUE
  )
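To make the pivot step concrete, here is a minimal toy example (the school names and values are made up for illustration) showing how `pivot_wider` turns one row per (school, indicator) pair into one row per school:

```r
library(tidyverse)

# Toy long-format data: one row per (school, indicator) pair
toy_long <- tibble(
  ORG_NAME   = c("A School", "A School", "B School", "B School"),
  IND_SUBCAT = c("Spending", "Headcount", "Spending", "Headcount"),
  IND_VALUE  = c(1000, 50, 2000, 80)
)

# Each indicator becomes its own column, one row per school
toy_wide <- toy_long %>%
  pivot_wider(names_from = IND_SUBCAT, values_from = IND_VALUE)

toy_wide
# A School ends up with Spending = 1000 and Headcount = 50 on a single row
```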

Mutating the data so that the spending and score columns are read as numbers

Code
analysis_df <- analysis_df %>%
  mutate(
    `District Non-Instructional Expenditures` = readr::parse_number(as.character(`District Non-Instructional Expenditures`)),
    `District-Level Instructional Expenditures` = readr::parse_number(as.character(`District-Level Instructional Expenditures`)),
    `School-Reported Instructional Expenditures` = readr::parse_number(as.character(`School-Reported Instructional Expenditures`)),
    `Math Grades 3-8 % Meets or Exceeds` = readr::parse_number(as.character(`Math Grades 3-8 % Meets or Exceeds`))
  ) %>%
  drop_na(
    `District Non-Instructional Expenditures`,
    `District-Level Instructional Expenditures`,
    `School-Reported Instructional Expenditures`,
    `Math Grades 3-8 % Meets or Exceeds`
  )

analysis_df <- analysis_df %>%
  mutate(
    `School-Reported Instructional Expenditures PC` = `School-Reported Instructional Expenditures`/`Student Headcount`,
    `District-Level Instructional Expenditures PC`=`District-Level Instructional Expenditures`/`Student Headcount`,
    `District Non-Instructional Expenditures PC`=`District Non-Instructional Expenditures`/`Student Headcount`
  )

analysis_df <- analysis_df %>%
  mutate(
    `L School-Reported Instructional Expenditures PC` = log(`School-Reported Instructional Expenditures PC` + 0.01),
    `L District-Level Instructional Expenditures PC`=log(`District-Level Instructional Expenditures PC` + 0.01),
    `L District Non-Instructional Expenditures PC`=log(`District Non-Instructional Expenditures PC` + 0.01))

Cleaning the model inputs did not respond to simple mutations alone, likely because of missing data, so missing values were dealt with separately via drop_na() above.

Code
log_model <- lm(
  `Math Grades 3-8 % Meets or Exceeds` ~ 
    `L School-Reported Instructional Expenditures PC` +
    `L District-Level Instructional Expenditures PC` +
    `L District Non-Instructional Expenditures PC`,
  data = analysis_df
)

summary(log_model)

Call:
lm(formula = `Math Grades 3-8 % Meets or Exceeds` ~ `L School-Reported Instructional Expenditures PC` + 
    `L District-Level Instructional Expenditures PC` + `L District Non-Instructional Expenditures PC`, 
    data = analysis_df)

Residuals:
    Min      1Q  Median      3Q     Max 
-38.570 -13.895  -1.247  14.054  50.547 

Coefficients:
                                                  Estimate Std. Error t value
(Intercept)                                        51.3737     5.4513   9.424
`L School-Reported Instructional Expenditures PC`  -2.6216     1.8780  -1.396
`L District-Level Instructional Expenditures PC`   -1.9657     0.4864  -4.041
`L District Non-Instructional Expenditures PC`     -0.4103     1.7045  -0.241
                                                  Pr(>|t|)    
(Intercept)                                        < 2e-16 ***
`L School-Reported Instructional Expenditures PC`    0.164    
`L District-Level Instructional Expenditures PC`  6.68e-05 ***
`L District Non-Instructional Expenditures PC`       0.810    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 19.01 on 317 degrees of freedom
Multiple R-squared:  0.05924,   Adjusted R-squared:  0.05034 
F-statistic: 6.654 on 3 and 317 DF,  p-value: 0.000227
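Because the spending predictors enter the model in logs, the raw coefficients are easiest to read in proportional terms: in a level-log model, a 1% increase in a predictor shifts the outcome by roughly the coefficient times log(1.01), i.e. about one hundredth of the coefficient. A quick sketch using the district-level estimate from the summary above:

```r
# Level-log interpretation: outcome change for a 1% increase in spending
beta <- -1.9657  # `L District-Level Instructional Expenditures PC` estimate
round(beta * log(1.01), 4)
# about -0.0196 percentage points per 1% spending increase
```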
Code
model <- lm(
  `Math Grades 3-8 % Meets or Exceeds` ~ 
    `School-Reported Instructional Expenditures PC` +
    `District-Level Instructional Expenditures PC` +
    `District Non-Instructional Expenditures PC`,
  data = analysis_df
)

summary(model)

Call:
lm(formula = `Math Grades 3-8 % Meets or Exceeds` ~ `School-Reported Instructional Expenditures PC` + 
    `District-Level Instructional Expenditures PC` + `District Non-Instructional Expenditures PC`, 
    data = analysis_df)

Residuals:
    Min      1Q  Median      3Q     Max 
-39.271 -13.943  -1.296  14.982  47.793 

Coefficients:
                                                 Estimate Std. Error t value
(Intercept)                                     41.858254   1.465974  28.553
`School-Reported Instructional Expenditures PC`  0.002720   0.009389   0.290
`District-Level Instructional Expenditures PC`  -0.329142   0.099283  -3.315
`District Non-Instructional Expenditures PC`    -0.016760   0.031444  -0.533
                                                Pr(>|t|)    
(Intercept)                                      < 2e-16 ***
`School-Reported Instructional Expenditures PC`  0.77223    
`District-Level Instructional Expenditures PC`   0.00102 ** 
`District Non-Instructional Expenditures PC`     0.59441    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 19.03 on 317 degrees of freedom
Multiple R-squared:  0.05727,   Adjusted R-squared:  0.04835 
F-statistic: 6.419 on 3 and 317 DF,  p-value: 0.0003118
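The summary() printouts above are hard to compare side by side. One option, assuming the broom package is installed, is tidy(), which returns a fitted lm's coefficient table as a data frame (shown here on the built-in mtcars data so the snippet runs on its own):

```r
library(broom)

# tidy() converts a fitted lm into a data frame of coefficients,
# which can be filtered, rounded, and bound across models
fit <- lm(mpg ~ wt + hp, data = mtcars)
tidy(fit)
# three rows: (Intercept), wt, hp, with estimate / std.error / p.value columns
```

Applied to the fits above, bind_rows(tidy(log_model), tidy(model)) would stack both coefficient tables for a direct comparison.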

Interpretation:

Each additional dollar of school-reported instructional spending per capita is associated with a 0.0027 percentage point increase in the percentage of students in grades 3 to 8 who met or exceeded expectations on the MCAS Math examination, holding all other spending constant (this estimate is not statistically significant).

Each additional dollar of district-level instructional spending per capita is associated with a 0.3291 percentage point decrease in the percentage of students in grades 3 to 8 who met or exceeded expectations on the MCAS Math examinations, holding all other spending constant (statistically significant, p ≈ 0.001).

Each additional dollar of district non-instructional spending per capita is associated with a 0.0168 percentage point decrease in the percentage of students in grades 3 to 8 who met or exceeded expectations on the MCAS Math examinations, holding all other spending constant (not statistically significant).

Questions remain about whether the pass rate is affected by retakes (which could push it lower).


Plot

Code
library(ggplot2)

# Scatterplot: School-Level Instructional Spending
ggplot(analysis_df, aes(
  x = `School-Reported Instructional Expenditures PC`,
  y = `Math Grades 3-8 % Meets or Exceeds`
)) +
  geom_point(color = "#1f77b4", size = 2) +      # blue points
  geom_smooth(method = "lm", se = FALSE, color = "#ff7f0e") + # orange line
  labs(
    x = "School-Level Instructional Spending Per Capita ($)",
    y = "MCAS Math % Meets/Exceeds",
    title = "MCAS Math vs School-Level Spending"
  ) +
  theme_minimal() +
  theme(plot.title = element_text(color = "#2ca02c", size = 14, face = "bold"))

Code
# Scatterplot: District-Level Instructional Spending
ggplot(analysis_df, aes(
  x = `District-Level Instructional Expenditures PC`,
  y = `Math Grades 3-8 % Meets or Exceeds`
)) +
  geom_point(color = "#d62728", size = 2) +      # red points
  geom_smooth(method = "lm", se = FALSE, color = "#9467bd") + # purple line
  labs(
    x = "District-Level Instructional Spending Per Capita ($)",
    y = "MCAS Math % Meets/Exceeds",
    title = "MCAS Math vs District-Level Spending"
  ) +
  theme_minimal() +
  theme(plot.title = element_text(color = "#17becf", size = 14, face = "bold"))

Code
# Scatterplot: District Non-Instructional Spending
ggplot(analysis_df, aes(
  x = `District Non-Instructional Expenditures PC`,
  y = `Math Grades 3-8 % Meets or Exceeds`
)) +
  geom_point(color = "#ff9896", size = 2) +      # pink points
  geom_smooth(method = "lm", se = FALSE, color = "#2ca02c") + # green line
  labs(
    x = "District Non-Instructional Spending Per Capita ($)",
    y = "MCAS Math % Meets/Exceeds",
    title = "MCAS Math vs Non-Instructional Spending"
  ) +
  theme_minimal() +
  theme(plot.title = element_text(color = "#9467bd", size = 14, face = "bold"))
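The three scatterplots above can also be drawn as one faceted figure, which makes the slopes easier to compare at a glance. A sketch assuming the `analysis_df` object built earlier:

```r
# Pivot the three per-capita spending columns into long form, then facet
analysis_df %>%
  pivot_longer(
    cols = c(
      `School-Reported Instructional Expenditures PC`,
      `District-Level Instructional Expenditures PC`,
      `District Non-Instructional Expenditures PC`
    ),
    names_to = "spending_type",
    values_to = "spending_pc"
  ) %>%
  ggplot(aes(x = spending_pc, y = `Math Grades 3-8 % Meets or Exceeds`)) +
  geom_point(alpha = 0.5) +
  geom_smooth(method = "lm", se = FALSE) +
  facet_wrap(~ spending_type, scales = "free_x") +
  labs(x = "Spending Per Capita ($)", y = "MCAS Math % Meets/Exceeds") +
  theme_minimal()
```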

An examination of school spending, divided into district-level instructional, district non-instructional, and school-level instructional categories, alongside MCAS math proficiency for students in grades 3 to 8, paints a picture of non-instructional spending having a nearly nonexistent effect on students, when in reality it can fund some of the most important supports for a student’s development. While success on this test does not directly equate to additional funding, the state does use the average score as a metric for how well a school is spending its money. If “non-education spending” has the least apparent impact on MCAS performance, then in such a strictly numbers-focused world it is the most likely to be cut.

There is still value in a standardized testing system that prevents students from graduating without retaining valuable information from their schooling, but the voters of Massachusetts have spoken: the MCAS is no more. Instead, the state is moving toward specific course requirements and possibly capstone projects, though some of the details are still in the works.

“[Students] can focus on what they’re learning in the classroom and bring it to their community through their capstone project. They can apply what they’re learning in real life, you need that extra step in the real world,” said Rubin.

Citations (MLA):

Berger, Joseph, et al. An Overview of Massachusetts Education Reform.
Impact of the Education Reform Act on Equalizing Education Finance. Linda Driscoll.
Bridges and Barriers: Equity Issues in High Stakes Testing.
Setting Passing Scores on Tests. Ronald K. Hambleton.
Are MCAS Scores Comparable over Time?
VRROOM: The Virtual Reference Room.
Managing to Lead in the Decade of Education Reform.
Capacity to Implement Education Reform.
Conclusions: The Impact of Education Reform after Ten Years.

Chen, Jie and Ferguson, Thomas (2002) “School District Performance Under the MCAS,” New England Journal of Public Policy: Vol. 17: Iss. 2, Article 7.