(UN)DOING: (post)digital // ABOUT

Intelligent natural language processing systems now affect millions of people. The everyday spell checker and the automatic translator are among the tools to which we daily delegate decisions about the correctness of our forms of expression. Yet within their lines of code, a partial view of the world may be encoded. And the speed at which we develop and adopt these tools often precludes reflection on the social prejudices we may be incorporating into Artificial Intelligence systems.

man=y :: woman=x seeks to expose and deconstruct gender bias in natural language processing systems and to highlight the cultural conventions that inform this phenomenon. The project approaches the topic from two perspectives, the social and the computational, presented in a printed publication and a web page. The publication frames the research on which the project is based, interconnecting texts on the social conventions that underpin gender bias in Western society. Drawing on a study of gender bias in natural language processing, the web page invites the user to explore a lexicon of analogies and gradually discover their level of bias.
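The analogies in such a lexicon typically come from word embeddings, where bias surfaces through simple vector arithmetic: completing "man is to b as woman is to ?" by computing b − man + woman and finding the nearest word. A minimal sketch of this mechanism, using invented toy vectors (the words and values below are illustrative assumptions, not the project's actual data; real systems learn embeddings from large text corpora):

```python
# Toy word embeddings (illustrative values only; real vectors are
# learned from corpora by models such as word2vec or GloVe).
embeddings = {
    "man":        [0.9, 0.1, 0.3],
    "woman":      [0.1, 0.9, 0.3],
    "programmer": [0.8, 0.2, 0.9],
    "homemaker":  [0.2, 0.8, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector arithmetic: b - a + c,
    then return the closest remaining word by cosine similarity."""
    target = [vb - va + vc for va, vb, vc in
              zip(embeddings[a], embeddings[b], embeddings[c])]
    candidates = [w for w in embeddings if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(embeddings[w], target))

# With these toy vectors: "man is to programmer as woman is to ...?"
print(analogy("man", "programmer", "woman"))  # → homemaker
```

The point of the sketch is that nothing in the arithmetic is malicious; the biased association is inherited from the statistics of the training text, which is precisely the phenomenon the project makes visible.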

In this manner, the project seeks to confront us with our own cultural conventions and prejudices, while promoting awareness of the ways in which Artificial Intelligence incorporates, and may even perpetuate, gender stereotypes.