Mitigating Racial Biases in Toxic Language Detection with an Equity-Based Ensemble Framework

Matan Halevy, Camille Harris, Amy Bruckman, Diyi Yang, Ayanna Howard. Mitigating Racial Biases in Toxic Language Detection with an Equity-Based Ensemble Framework. In EAAMO 2021: ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization, Virtual Event, USA, October 5-9, 2021. ACM, 2021.
