Major universities are using their students’ race, among other variables, to predict how likely they are to drop out of school.

Documents obtained by The Markup through public records requests show that some schools are using education research company EAB’s Navigate advising software to incorporate students’ race as what the company calls a “high-impact predictor” of student success, a practice experts worry could be pushing Black and other minority students into “easier” classes and majors.

The documents, called “predictive model reports,” describe how each university’s risk algorithm is tailored to fit the needs of its population. At least four of the seven schools from which The Markup obtained such documents incorporate race as a predictor, and two of those describe race as a “high impact predictor.” Two schools did not disclose the variables fed into their models.

More than 500 universities across the country use Navigate’s “risk” algorithms to evaluate their students.

In addition to documents on how the models work, The Markup obtained aggregate student risk data for the fall 2020 semester from four large public universities: the University of Massachusetts Amherst, the University of Wisconsin–Milwaukee, the University of Houston, and Texas A&M University.

We found large disparities in how the software treats students of different races, and the disparity is particularly stark for Black students, who were deemed high risk at as much as quadruple the rate of their White peers.

At the University of Massachusetts Amherst, for example, Black women are 2.8 times as likely to be labeled high risk as White women, and Black men are 3.9 times as likely to be labeled high risk as White men. At the University of Wisconsin–Milwaukee, the algorithms label Black women high risk at 2.2 times the rate of White women, and Black men at 2.9 times the rate of White men. And at Texas A&M University, they label Black women high risk at 2.4 times the rate of White women, and Black men at 2.3 times the rate of White men.

Latinx students were also assigned high risk scores at substantially higher rates than their White peers at the schools The Markup examined, although not to the same degree as Black students. The algorithms labeled Asian students high risk at similar or lower rates than White students.

(Sources: Texas A&M, University of Massachusetts Amherst, and University of Wisconsin–Milwaukee)

Put another way, Black students made up less than 5 percent of UMass Amherst’s undergraduate student body, but they accounted for more than 14 percent of students deemed high risk for the fall 2020 semester.

“This opens the door to even more educational steering,” said Ruha Benjamin, a professor of African American studies at Princeton and author of “Race After Technology,” after reviewing EAB’s documents. “College advisors tell Black, Latinx, and indigenous students not to aim for certain majors. But now these gatekeepers are armed with ‘complex’ math.”

The scores, which are one of the first things a professor or administrator may see when pulling up a list of students, can leave advisers with an immediate and potentially life-changing impression of students and their prospects within a given major.

“You can easily find situations where there are two students who have similar low GPAs and different risk scores, and there’s no obvious explanation for why that’s the case,” said Maryclare Griffin, a statistics professor at UMass Amherst.