
Research

To remedy this problem, many argue that fair, accountable, and transparent machine learning will thwart biased, racist, or sexist automated systems. Or so the story goes.

 

Between 2017 and 2019 we completed foundational research on automated discrimination. The main finding was that many civil society representatives across Europe’s human rights sector prioritized the specific experiences of marginalized populations when examining or dealing with new, automated technologies. This approach tended to contrast with the process-oriented perspective of tech-savvy groups, who shied away from analyzing systematic forms of injustice. These findings inspired us to initiate the Justice, Equity and Technology Table.

Between Antidiscrimination and Data

Understanding human rights discourse on automated discrimination in Europe

Seeta Peña Gangadharan & Jędrzej Niklas

This report examines how the topic of automated discrimination is making its way through European civil society organizations (CSOs) working in the field of human rights. By cataloguing practices and discourses, we can chart paths for future human rights efforts with regard to automated discrimination. 

Altogether, we see three ways to support different CSOs’ engagement with the problem of automated discrimination:
• Resource digital rights or data privacy advocates to recognize antidiscrimination as a key concern for data protection and to take up automated discrimination as a priority for their work;
• Support anti-discrimination groups and other groups focused on equity and justice in recognizing connections between their core work and values and “high-tech” discrimination;
• Acknowledge, cultivate, and support a flexible approach to highlighting and problem-solving for automated discrimination.


Decentering technology in discourse on discrimination

Seeta Peña Gangadharan & Jędrzej Niklas

Algorithmic discrimination has become one of the critical points in the discussion about the consequences of an intensively datafied world. While many scholars address this problem from a purely techno-centric perspective, others try to raise broader social justice concerns. In this article, we join those voices and examine norms, values, and practices among European civil society organizations in relation to the topic of data and discrimination. Our goal is to decenter technology and bring nuance into the debate about its role and place in the production of social inequalities. To accomplish this, we rely on Nancy Fraser’s theory of abnormal justice, which highlights the interconnections between the maldistribution of economic benefits, the misrecognition of marginalized communities, and their misrepresentation in political processes.

Full article