Texas A&M drops ‘race’ from student risk algorithm following Markup investigation


A large public university has suspended its use of risk scores following a Markup investigation that found several universities were using race as a predictor of student success. Our investigation also found that Navigate, software created by EAB and used by more than 500 schools across the country, disproportionately labeled Black and other minority students “high risk,” a practice that experts said ends up pushing Black students out of math and science and into “easier” majors.

Following our report, Texas A&M University announced that it would stop including these risk scores on advisors’ dashboards and asked EAB to create new models that do not include race as a variable.

“We are committed to ensuring the success of all Texas A&M students,” Tim Scott, Texas A&M’s associate provost for academic affairs and student success, wrote in an email to The Markup. “All decisions made regarding the success of our students will be made in a way that is fair and equitable for all students.”

The response from other schools has been mixed.

Maryclare Griffin, a statistics professor at the University of Massachusetts Amherst, another school featured in the article, said her institution appears to have removed the ability to display student risk scores for certain Navigate users. Another professor at the school told The Markup that he was still able to see students’ risk scores.

UMass Amherst spokeswoman Mary Dettloff did not confirm whether the school made any changes to its Navigate system and declined to answer further questions for this story.

The University of Houston, one of four schools from which The Markup obtained data showing racial disparities in risk scores, has made no changes to its use of EAB algorithms, said Shawn Lindsey, a spokesperson for the university.

The other schools named in the original story – the University of Wisconsin–Milwaukee, South Dakota State University, Texas Tech University, and Kansas State University – did not respond to questions for this story.

The Markup obtained data from public universities showing that the algorithms embedded in educational research firm EAB’s Navigate software assigned Black students high risk scores at anywhere from double to quadruple the rate of their White peers. The risk scores purport to predict the likelihood that a student will drop out of school if that student stays in their chosen major.
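As a rough illustration of the kind of disparity check involved, here is a minimal Python sketch. It assumes a hypothetical table of per-student risk labels with “race” and “risk_level” columns; the file name and schema are illustrative, not EAB’s or The Markup’s actual data.

```python
import pandas as pd

# Hypothetical export of Navigate-style risk labels. The file name and
# the "race" / "risk_level" columns are illustrative assumptions, not
# EAB's or The Markup's actual schema.
df = pd.read_csv("risk_scores.csv")

# Share of each racial group labeled "high" risk.
high_risk_rate = df["risk_level"].eq("high").groupby(df["race"]).mean()

# Each group's rate relative to White students; values between 2.0 and
# 4.0 would correspond to the "double to quadruple" disparity described
# above.
ratio_vs_white = high_risk_rate / high_risk_rate["White"]
print(ratio_vs_white.sort_values(ascending=False))
```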

In almost all of the schools The Markup examined, the EAB algorithms the schools used explicitly considered students’ race in their predictive models. And in several cases, schools used race as a “high-impact predictor” of success, meaning it was one of the variables with the most influence on students’ risk scores.

“EAB is deeply committed to equity and student success. Our partner schools have divergent views on the value of including demographics in their risk models. That is why we urge our partner institutions to proactively examine the use of demographics,” EAB spokesperson John Michaels wrote in an email to The Markup. “Our goal has always been to give schools a clear understanding of the data that powers their personalized models. We want to make sure that each institution can use predictive analytics and the larger platform as intended, in order to provide the best support to their students.”

EAB marketed its advising software as a tool for cash-strapped universities to better direct their resources to the students who need help the most and, in the process, boost retention and avoid the additional cost of recruiting students to replace those who drop out.

But in the schools The Markup reviewed, we found that the faculty and advisors who had access to EAB’s student risk scores were rarely, if ever, told how the scores were calculated or trained in how to interpret and use them. And in several cases, including at Texas A&M University, administrators were unaware that race was being used as a variable.

Instead, the software gave advisors an at-a-glance impression of whether a student was at high, moderate, or low risk of dropping out in his or her chosen major, and then, through a feature called Major Explorer, showed how that student’s risk could decrease if he or she switched to a different, “less risky” field of study.

Experts said that design feature, coupled with the racial disparities in the risk scores, was likely to perpetuate historical racism in higher education and to push students of color, especially Black students, out of science, mathematics, and engineering programs.

Iris Palmer, senior advisor for higher education and workforce policy at New America, has studied the predictive analytics systems universities use to boost retention and wrote a guide for schools to follow when considering implementing such systems.

“I don’t think explicitly removing race from the algorithm fixes the problem or necessarily improves the situation,” she said. “Algorithms can predict race based on all kinds of other things that go into the algorithm,” such as combinations of data like ZIP code, high school name, and family income.
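Palmer’s point about proxies can be demonstrated directly: if the remaining inputs encode race, a model can recover it even after the race column is dropped. Below is a minimal, hypothetical Python sketch; the column names are assumed stand-ins for the kinds of inputs she describes, not EAB’s actual model variables.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical student records. "zip_code", "high_school", and
# "family_income" are assumed stand-ins, not EAB's actual variables.
df = pd.read_csv("students.csv")

# One-hot encode the categorical proxies and keep income as numeric.
features = pd.concat(
    [
        pd.get_dummies(df[["zip_code", "high_school"]].astype(str)),
        df[["family_income"]],
    ],
    axis=1,
)

# If this classifier recovers race well above chance, the remaining
# inputs still carry the racial signal, so a risk model trained without
# the race column is not actually race-blind.
scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0),
    features,
    df["race"],
    cv=5,
)
print(f"Accuracy predicting race from proxies: {scores.mean():.2f}")
```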

There is potential value in using predictive analytics to identify the students who most need support, Palmer said, if schools actually train staff on how the algorithms work and if the software explains, in a concise and understandable way, which factors lead to each student being assigned a particular risk score. “And that’s a big if.”

Schools “need to do their due diligence around the disparate impact and why you are seeing the disparate impact on your campus,” she said. If the schools had done so before signing multi-year contracts with EAB, “they would not have been caught off guard.”

This article by Todd Feathers was originally published on The Markup and has been republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
