Frederica Freyberg:
Turning to education, a sophisticated algorithm called the Dropout Early Warning System, or DEWS, used by the Wisconsin Department of Public Instruction is one of only a few like it in the country. The system tries to predict whether a student exiting middle school is likely to graduate from high school. This is according to a report from “The Markup,” a publication that investigates the use of big tech. The predictive model uses a number of data points, including test scores, disciplinary records and family income. But a unique and controversial data point used by DEWS is race. DPI says, “DEWS uses what already has happened in the past to predict what will happen in the future, all in order to change course in the present and help ensure students identified as at-risk of not graduating on time do, in fact, graduate on time by directing resources, supports and interventions toward students that may need them.” For more, we turn to the report’s author, Todd Feathers. Thanks for being here.
Todd Feathers:
Thank you so much for having me.
Frederica Freyberg:
You seem to have uncovered a prediction system for high school graduation that most of us have never heard of. What is it and how does it work?
Todd Feathers:
As you described very well, it is a system built in 2012 that uses machine learning algorithms to predict whether students will graduate on time. The system generates a prediction for every student in 6th through 9th grade in the state and labels each of them as at high, moderate or low risk of dropping out. This is supposed to help schools direct resources and support to the students who need them most, but what we found is that the system is wrong nearly three quarters of the time it predicts a student won’t graduate, and it’s wrong more often about Black and Hispanic students not graduating than it is about white students. On top of that, while our reporting was going on, academic researchers based out of the University of California-Berkeley conducted a study, the largest of its kind on DEWS, which found that the system has not achieved its primary goal of improving graduation outcomes for the students it labels as high risk.
Frederica Freyberg:
So to this question: if it is wrong, according to your reporting, three quarters of the time, why is that?
Todd Feathers:
So by some standards, DEWS is very accurate. It is correct 97% of the time when it predicts that a student will graduate. When DPI designed the algorithm, they calibrated it to accept a greater rate of false alarms, which is a good way of thinking about it: predictions that a student who ultimately does graduate won’t graduate. So they are saying, we are willing to label a lot of students likely non-graduates in order to make sure we capture all of the ones who ultimately will drop out.
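For readers unfamiliar with this trade-off, here is a minimal sketch of how that kind of calibration plays out. The student counts and the two settings below are hypothetical and are not drawn from DPI’s actual model; they only illustrate why a system tuned to catch nearly every eventual dropout (high recall) can be wrong most of the time when it flags a student (low precision).

```python
# Hypothetical illustration of the precision/recall trade-off described above.
# All numbers are invented for this sketch; they are not DPI's figures.

def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: how often a 'won't graduate' flag turns out to be right.
    Recall: what share of actual non-graduates get flagged at all."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Imagine 10,000 students, of whom 700 actually do not graduate on time.
# A cautious threshold flags few students; a permissive one flags many.
cautious = precision_recall(true_positives=300, false_positives=200, false_negatives=400)
permissive = precision_recall(true_positives=650, false_positives=1950, false_negatives=50)

print(f"cautious threshold:   precision={cautious[0]:.0%}, recall={cautious[1]:.0%}")
print(f"permissive threshold: precision={permissive[0]:.0%}, recall={permissive[1]:.0%}")
# The permissive setting catches nearly every eventual dropout (93% recall here)
# but is wrong roughly three quarters of the time it flags a student (25% precision),
# which is the pattern the reporting describes.
```

In this toy example, moving from the cautious to the permissive setting is exactly the kind of calibration choice described above: more of the true dropouts are caught, at the cost of many more false alarms.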
Frederica Freyberg:
Could the error be a result of resources and interventions being given to the students predicted not to graduate?
Todd Feathers:
That is one of the things the University of California-Berkeley researchers tested for. If the system were working as intended, with resources going to students labeled high risk, you would expect ten years of data to show improved graduation rates for those students, but the data showed there was no impact on graduation rates from being identified by this system.
Frederica Freyberg:
Turning to another data point that’s loaded into this algorithm: race. Something you call in your article “a racially inequitable algorithm built by the state of Wisconsin.” What did you learn about why race is used as a predictor?
Todd Feathers:
I think it’s important to remember this algorithm was built and designed back in 2012. We have come a long way since then in how we understand algorithms, algorithmic bias and the way that data points like race factor into these predictions. One thing that’s happened here is that this algorithm hasn’t really changed much since it was initially designed. Back then, and even still today, some researchers in this field of early warning systems will say that using data points like race improves the accuracy of these algorithms and makes them better at predicting who will and won’t graduate high school. You can see why that might be the case in Wisconsin, where there are some pretty stark racial disparities in graduation rates: white students graduate at much higher rates than Black students. But a growing body of research and academic argument says the small boost in accuracy is not worth encoding what are oftentimes systemic and historical biases.
Frederica Freyberg:
I want to get to something DPI told us. You quoted a DPI spokesperson in your article who said the education system is systematically racist, but she told us, “When it comes to labelling students, I agree that we should be concerned about the impact these words have on our learners. And yet the intention of the label is to provide access to the interventions these students need. I wish we didn’t need a label to open those doors but it is an attempt to provide resources to the learners who need them most and to do that, schools need to be able to identify them.” In your reporting, how did the students you interviewed feel about being labeled as high risk and, furthermore, about potentially being one of those for whom race was put into this algorithm?
Todd Feathers:
You know, the first thing is none of the students we spoke to knew this algorithm even existed or that their schools were potentially looking at this information and making decisions about how to treat them, which I think shocked a lot of the students. As I said, it shocked me when I first heard about it. But the Black and Hispanic students we spoke to described often feeling like they were part of a secondary school system within their own schools, treated differently than white students. They feared that these behind-the-scenes labels would perpetuate the subconscious or conscious biases that educators have about them. You can imagine, especially for incoming 9th graders coming into a new building where none of the adults know them, if the first thing you see about a student is a label of high, moderate or low risk, it’s hard not to draw certain conclusions.
Frederica Freyberg:
All right. Todd Feathers, really complicated and interesting reporting. Thanks very much.
Todd Feathers:
Thank you so much again.