This is the first of three articles on U.S. education. Here I’ll focus on the domestic situation using the National Assessment of Educational Progress (NAEP). Part 2 focuses on California. In Part 3, I’ll review how the U.S. stacks up against the rest of the world, relying on the Trends in International Mathematics and Science Study (TIMSS), a global standardized test administered every few years. This article is my report on the state of U.S. education in 2019, comparing the states.
There are two main parts to this article. The first looks at the most recent NAEP results for three subjects: mathematics (2019), reading (2019), and science (2015). All results are for eighth-grade students. Reporting on high school students tends to be slow, and some states don’t report at all.
The second part looks at the relationship between NAEP scores and spending per student. Using a little-known measure of the price level in each state, I’ll calculate real spending per student. The short version: there is no statistically significant relationship between either measure of spending and the math or science scores. Interestingly, reading scores seem to show a significant positive response to increased real spending (adjusted for cost-of-living differences among the states).
The State of Education
The data I’ll be using is from the National Assessment of Educational Progress (NAEP, often called the Nation’s Report Card, administered every two, three, or four years). The most recent survey was done in 2019. (I hate to think how much worse U.S. COVID policies have made this situation.) I used data from the 8th grade survey because the 2019 data for 12th grade is not available. Also, when I last used this data, many states did not report 12th grade data. My focus is on performance by state. California results are usually highlighted in red. The national average is highlighted in lime green. I plan to write separate articles on the California results and the international comparison. I will link them when (if?) I finish them.
The NAEP evaluations are based on standardized tests. I’ll look at the results for mathematics, reading, and science. Those are the only categories that have recent data by state. The easy preview is that things have not gotten better (to put it mildly). A second note is that two entities regularly top the rankings. They are the state of Massachusetts and the cryptically named “DoDEA.” That’s the Department of Defense Education Activity. Follow the link to learn more about this agency.
Massachusetts, of course, is a university state with many, many college professors. MIT, Harvard, Boston University, Boston College, Smith College, Mt. Holyoke College, and Amherst are all located within the state’s borders. There is also a host of lesser-known institutions, including Wheeling, Emerson, Northeastern, and many others.
My data and methods are transparent. Click here to download the Excel workbook including raw data, a working version of the raw data, the graphs, spending per student, the price index for each state, and six regressions to test the relationships statistically.
Some technical details: I have used data from all schools surveyed, including public and private. A quick look at math scores in public schools only did not reveal much difference. The math and reading tests have maximum scores of 500. For science, the maximum is 300. Finally, I have rounded average scores to integer values. The Excel workbook has the complete data to 12 decimal places. This means that when states are tied for a specific ranking, chances are excellent that the tie would vanish if I rounded to, say, three decimal places. That would make the graphs unwieldy. If anyone undertakes this experiment, please let me know. You can gain fame by writing an article for this blog.
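To see how integer rounding manufactures ties, here’s a tiny sketch. The state names and averages below are invented for illustration, not taken from the workbook:

```python
# Hypothetical averages that tie at integer precision but separate at 3 decimals
averages = {"State A": 276.412, "State B": 275.987, "State C": 276.049}

integer_scores = {s: round(v) for s, v in averages.items()}   # every state rounds to 276
fine_scores = {s: round(v, 3) for s, v in averages.items()}   # three distinct values

print(integer_scores)
print(fine_scores)
```

At integer precision all three “states” share one rank; at three decimals the tie vanishes, which is exactly why the graphs use rounded values.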
In mathematics, Massachusetts and DoDEA are one-two in the rankings with average scores of 294 and 292. Technically, DoDEA is tied with New Jersey for second place. California is in a four-way tie for 40th place with Oklahoma, South Carolina, and Rhode Island. Those four states had average scores of 276. The national average is 282. Here’s the complete graph:
In reading, DoDEA and Massachusetts are again one-two but have swapped places, with average scores of 280 and 273. California is in a four-way tie for 39th place with Arizona, South Carolina, and Arkansas. Those four states had average scores of 259. The national average is 263. Here’s the complete graph:
The folks at NAEP seem to be a little slow compiling the 2019 science results. To get state-level data, I had to settle for the 2015 survey. Unlike the math and reading tests (500-point maximum), the science test has a 300-point maximum.
Utah and DoDEA are tied for first place with average scores of 166. Massachusetts slips to a tie for fifth place at 162. California is tied with New Mexico for 45th place with an average score of 143. The national average is 154. Here’s the complete graph:
Performance and Spending
Teachers’ unions often complain about low spending on public schools. The implication is that spending more would improve student results. The U.S. Census Bureau compiles per-student spending data, so let’s take a look at spending by state.
As we all know, the cost of living varies wildly among the states. After considerable searching (and with the help of Maurine Haver of Haver Analytics), I discovered that the Bureau of Economic Analysis publishes consumption price deflators for each of the fifty states plus the District of Columbia. Using those deflators, I calculated real spending per student.
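The adjustment itself is just nominal spending divided by the state’s price level. A minimal sketch, with invented figures standing in for the Census spending data and the BEA deflators:

```python
# Illustrative numbers only; the actual values come from the Census Bureau
# (nominal spending per student) and the BEA state consumption deflators.
nominal_spending = {"California": 14_035, "Massachusetts": 17_058}  # dollars per student
price_level = {"California": 1.151, "Massachusetts": 1.079}         # U.S. average = 1.0

# Real spending per student: deflate nominal spending by the state price level
real_spending = {state: nominal_spending[state] / price_level[state]
                 for state in nominal_spending}
print({s: round(v) for s, v in real_spending.items()})
```

A high-cost state like California ends up with lower real spending than its nominal figure suggests, which is the whole point of the adjustment.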
A Little Statistics
Any economist worthy of the title will be irresistibly drawn to running a couple of statistical tests to see whether there’s a relationship between spending and performance. The null hypothesis is that the true value of the spending coefficient is zero. If the absolute value of the t-statistic is 2.0 or more, we can reject that null hypothesis. Here are the six regressions in question.
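For readers who want to replicate the mechanics, here is a minimal sketch using only NumPy. The data are randomly generated stand-ins, not the actual NAEP scores or spending figures; the point is how the t-statistic on the spending coefficient is computed.

```python
import numpy as np

# Invented data: 50 "states" where spending is unrelated to scores by construction
rng = np.random.default_rng(seed=42)
spending = rng.uniform(8_000, 20_000, size=50)   # real dollars per student
scores = 280 + rng.normal(0.0, 6.0, size=50)     # average scores, no spending effect

# Ordinary least squares: scores = b0 + b1 * spending + error
X = np.column_stack([np.ones_like(spending), spending])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)

# Standard error of the spending coefficient and its t-statistic
residuals = scores - X @ beta
dof = len(scores) - X.shape[1]                   # degrees of freedom
sigma2 = residuals @ residuals / dof             # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)            # covariance of the estimates
t_stat = beta[1] / np.sqrt(cov[1, 1])

print(f"spending coefficient: {beta[1]:.6g}, t-statistic: {t_stat:.2f}")
```

With real data you would substitute the workbook’s spending and score columns; the |t| ≥ 2.0 rule of thumb then tells you whether to reject the null at roughly the five percent level.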
Now, failing to reject a null hypothesis does not mean we accept it (that the coefficient, in fact, equals zero). But for the 2019 8th-grade math and 2015 8th-grade science scores, we cannot reject the null: there is no statistically significant relationship between spending per student and performance.
What about reading? The coefficient of real spending per student is +0.00049159. The positive sign means more real spending per student tends to raise reading scores. The t-statistic is 2.335, giving a p-value of 0.0237, significant at the five percent level. Each additional thousand dollars of real spending per student appears to raise reading scores by about 0.49 points. At this point, policy makers get to step in and decide whether that improvement is worth the cost. That’s above my pay grade. I would, however, suggest trying to replicate these results using data from the 2012 and 2015 NAEP results.
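The per-dollar coefficient is easier to read when scaled to thousands of dollars:

```python
coef_per_dollar = 0.00049159   # reading-score points per real dollar per student

# Implied effect of one more thousand real dollars per student
points_per_thousand = coef_per_dollar * 1_000
print(round(points_per_thousand, 2))   # prints 0.49
```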
There is a technical issue that needs mentioning. The 2019 data for math and reading covered all 50 states, the District of Columbia, and the Department of Defense Education Activity. I excluded DoDEA for obvious reasons. But the 2015 data for science did not include either DC or DoDEA. There were also four states omitted: Alaska, Colorado, Louisiana, and Pennsylvania. If you’re curious about the reason, please consult the NAEP website.
Wasn’t this fun? Consider it a tour of economic methodology. According to my lovely wife, the absence of a relationship between spending and performance is pretty well accepted today. That makes the reading result worth a closer look; I leave it to others to expand that research.