Algorithms and discrimination



The social network Twitter announced Friday, July 30 that it would grant rewards to users and researchers who discover possible biases, for example sexist or racist ones, in the algorithms running on its platform. Through this competition, modeled on the bug-bounty programs some websites offer for detecting security flaws, the Californian company wants to make its operations more ethical.

Twitter thus recognizes that its programs can reproduce stereotypes and amplify prejudice and discrimination. Indeed, no algorithm is truly neutral. All of them carry biases insofar as they reflect, through their configuration, their operating criteria, and the data that feeds their learning process, a system of values and social choices. These are biases that the community logic of social networks helps to reinforce.

Ethical questioning

That Twitter takes up on its own account the ethical questions raised by its tools invites us to recognize the place that algorithmic technologies and learning systems now occupy in the functioning of our society. They are present in our daily lives, every time we use an application: accessing information or knowledge, planning our trips, purchasing goods and services, carrying out administrative procedures… This requires us to remain vigilant about how these digital black boxes operate, and to assert the right to demand greater transparency from them. Without any reward, other than the satisfaction of fighting against the automation of prejudice and discrimination.
