#rp18 Speaker Safiya Umoja Noble: Why Autocomplete Is Only Funny for Those Who Can Afford It

#rp18 speaker Safiya Umoja Noble; credit: Safiya Noble

How do people connect with and explore their surroundings, and what problems arise when they do so – especially when they use technology as a tool for discovery? That, in highly condensed form, is what Safiya Umoja Noble’s research focuses on at the USC Annenberg School for Communication and Journalism in California.

Which leads us straight to the key problem: condensed representations are not inevitable – not even in the age of digitally shortened attention spans. Against an information-science backdrop, Safiya Noble researches false representations of people – especially black women – and of concepts such as “beauty” through the lens of search engine algorithms and the datasets they operate on. Commercial platforms have become the primary point of access to knowledge in the current information landscape. The problem is that the technology employed there is about as neutral as a butterfly knife. Algorithms determine what search engines find and surface. As Frank Pasquale, Kate Crawford and Caroline Sinders have made clear on the re:publica main stage in recent years, these platforms mainly read our prejudices out of our data traces – and pass them on without question.

For her newly published book “Algorithms of Oppression”, Safiya Noble researched the wide range of effects this produces. Among them: what happens when we search for the keywords "black girls"? "Big booty" and other sexually explicit terms regularly appear as top results – even when personalisation has adapted to our earlier, more open-minded searches. If, on the other hand, we type in "white woman", the results are far less sexualised and quite obviously operate on very different conceptions of what a “woman” is. Not all humans are equal in the eyes of the machine: technology can amplify what nevertheless remains, and has always been, a faulty assumption. By offering us suggestions via the autocomplete function, the search engine can have a lasting effect on our conceptual view of the world. Something that presents itself as an editor’s snappy format concept and comes across as funny (see the series of "Autosuggest Interviews") can easily work to the very real detriment of groups already strongly affected by discrimination.

Google "why black women are so sassy" or "why black women are so angry" and the search engine will suggest porn sites and unmoderated discussions. The results give insight into artificial intelligence’s disturbing understanding of (black) womanhood in modern society, as produced from our data repositories. Through an analysis of textual and media searches, as well as extensive research on paid online advertising, Safiya Noble exposes a culture of racism and sexism in the way discoverability is created online. She delivers an impressive overview of the effects that algorithms have on our understanding of racial and gender identities, and raises crucial questions about the performance, efficiency and control of algorithms.

In a time of heightened tensions and right-wing tendencies that exacerbate societal divisions, distorted depictions of ethnic stereotypes are a very real problem. You don’t have to be a linguist to understand that our conceptions of the world also shape our daily interactions with each other. Research shows, for example, that the "Russian" disinformation campaigns in the US deliberately built on ethnic stereotypes (see here).

Noble’s work aims to improve structural conditions so as to renegotiate the role of technology in relation to fundamental civil and human rights – and to highlight how digital media platforms intervene in human relationships and shift their focus. She argues that deep learning and artificial intelligence will be the decisive topics of the coming years. Amid the many competing theoretical origin stories for these technologies, her approach takes a decidedly feminist, historical and political-economic perspective on platforms and software.

Noble was previously assistant professor in the Department of Information Studies at UCLA, where she held appointments in the Departments of African American Studies and Gender Studies. She currently serves as an Associate Editor for the Journal of Critical Library and Information Studies, and is the co-editor of two books: The Intersectional Internet: Race, Sex, Culture and Class Online (Peter Lang, Digital Formations, 2016), and Emotions, Technology & Design (Elsevier, 2015). She is a partner in the Stratelligence network, a firm that specialises in research on information and data science challenges, and is a co-founder of the Information Ethics & Equity Institute (IEEI), which consults organisations committed to transforming their information management practices toward more just, ethical, and (gender) equitable outcomes.

We look forward to welcoming Safiya Umoja Noble at #rp18!

Website: safiyaunoble.com