Project

Data, Algorithms & Society


Project log

Paolo Costa added a research item
Our relationship with technology reflects two general conditions. On the one hand, technological artifacts are not neutral: their design embeds representations of users, in particular the models known in feminist studies as gender scripts. On the other hand, the effect of these models is neither mechanical nor deterministic, because users continuously negotiate the meanings of technological artifacts. Every technology thus turns out to be the product of a cultural and political negotiation between designers and social groups. Focusing on artificial intelligence, we can see that the most widespread representation of these technologies does not correspond to their actual state of development but tends to overestimate their capabilities. The result is a dissonance between AI performance and public expectations, which in turn fosters a dysfunctional relationship between these technologies and their users. This dysfunctional relationship is evident in machine learning systems, which incorporate gender biases of various kinds. To illustrate the problem with a few examples, I have chosen the domain of natural language processing, whose systems have a strong impact on all of our lives. In the last part of the paper I outline some possible solutions to the problem.
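The kind of gender bias mentioned here has been documented in word embeddings, where analogy queries such as "man is to doctor as woman is to ?" can return stereotyped answers. The sketch below illustrates the mechanism only: the word vectors are made-up toy values chosen to mimic the gendered geometry reported for real embeddings (e.g. word2vec or GloVe), not values from any actual model.

```python
import math

# Toy 3-dimensional word vectors (hypothetical values, for illustration only).
# Real embeddings are learned from large corpora, and studies have shown that
# they can encode similar gender associations along a "gender direction".
vectors = {
    "man":      [ 1.0, 0.2, 0.1],
    "woman":    [-1.0, 0.2, 0.1],
    "doctor":   [ 0.9, 0.8, 0.3],
    "nurse":    [-0.9, 0.8, 0.3],
    "engineer": [ 0.8, 0.7, 0.5],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the vector offset b - a + c,
    returning the nearest remaining word by cosine similarity."""
    target = [vb - va + vc
              for va, vb, vc in zip(vectors[a], vectors[b], vectors[c])]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "doctor", "woman"))  # in this toy space: "nurse"
```

In this contrived space the query lands on "nurse" rather than "doctor" purely because the first coordinate plays the role of a gender axis, which is the geometric form the bias takes in reported studies of real embeddings.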