Gender & AI: Queering Technology

Do artificially intelligent systems reflect a certain gender, race or class? What, and whose, politics are currently being consolidated in algorithmic culture?



As AI finds its way into the mundanity of everyday life, constantly scanning and categorising us to provide the highest level of comfort and efficiency, the systems that surround us increasingly mediate our bodies, actions and behaviours. Search engine queries, autocomplete functions, automatic image tagging, personal aides, smart speakers, wearables – all neatly displayed and packaged to ease our daily routines. At present, however, a great deal of concealed racial and gender bias is built into the design of these systems and objects – be it 'personal assistants' with female names and voices or soap dispensers that only work on white hands.

In this block seminar, students take an applied and interdisciplinary approach to exploring forms of bias in the design of AI. They locate real-world examples on the topic, engage personally with the systems and, drawing on approaches from queer and feminist theory and technology, prototype forms of 'hacking back' (no prior experience with design or technology is necessary).

 

Teachers:
Prof. Dr. des. Michelle Christensen and Prof. Dr. des. Florian Conradi

Technische Universität Berlin, Fakultät I Geistes- und Bildungswissenschaften
Institut für Philosophie, Literatur-, Wissenschafts- und Technikgeschichte

Format:
Block Seminar
Free elective (Freie Wahl)

Language:
English