re:publica 2017 speaker Caroline Sinders doesn’t limit herself to digital ethnography and design. Her current focus is on how online harassment can be curbed with machine learning systems, and on developing a more collaborative approach to designing platform features.
Caroline Sinders is originally from New Orleans, but by her own account she mainly lives her life online. Filters are a recurring topic in her life and her research: as a photographer she observed significant shifts in what mattered within communities following the floods caused by Hurricane Katrina, and she consequently refocused her own field of interest – onto digital ethnography.
She did her master’s degree in a program for interactive telecommunications, where she concentrated on “human-centered interfaces”, storytelling, and social media theory under the supervision of Clay Shirky, among others. Afterwards, as part of the design team for IBM’s artificial intelligence platform Watson, she researched training programs for human-machine communication. These not only determined how the artificial intelligence varied its interactions; they also resonated strongly with the test subjects, who reacted to the technology in predictable ways. This led Caroline to question whether decontextualized language in data sets might, in fact, constitute a civilizational step backwards. From then on, one of her key questions became how embedded systems influence our own behavior.
In part due to her proximity to the indie game sector, she began to spend her free time researching and shedding light on the forms of online communication and harassment surrounding the #Gamergate controversy. In doing so, she discovered that although every platform requires its own specific form of communication, trolls are a particularly adaptive species that succeeds in imposing its mindset on its environment.
Amorphous groups dump their negative activism into the online world, filling it up until it spills out into the real world. Caroline recognized that the future of politics could become hyper-personal. With this realization she put forward concepts and challenges that have recently become realities in elections around the world.
During #Gamergate, trolls turned Twitter into a social news aggregator. They used hashtags as a foundation for forum-like structures: when a person posts in a forum, they expect an answer. This results in a running conversation. Twitter, on the other hand, is more akin to sending a text message out into the ether in the hope that it might reach someone. The “Gamergaters” used Twitter more like 4chan, where anybody can post and answer on a kind of digital blackboard. They would pick out specific topics from the ocean of tweets that they felt provoked by, such as feminist concerns, and then add their own hashtags to them. Posts and messages with hashtags can potentially be seen by far more Twitter users than just one’s own followers – a public, so to speak. This way, Gamergaters could initiate Twitter debates pitting 30 like-minded people against the original poster, and then bombard them verbally. The result resembled a direct attack more than a mutual exchange.
Sinders ran field tests in an attempt to work out a technological solution to the problem from a designer’s perspective: filters. The more her approach was discussed, the more she became a target herself – until, finally, even her mother’s house was swatted.
Her mother’s first reaction was to ask what Caroline had tweeted to make the trolls so angry. But Sinders soon realized that she wasn’t the one who had made a mistake: the problem was a deficiency in the system, not a vulnerability of the users. She would therefore have to search for concepts that allow communication to be both safe and public.
Until recently, she had kept away from the Californian sun and from “tech solutionism” – the belief that technology has a fitting solution for every one of the world’s problems. She is currently a fellow at BuzzFeed and the Eyebeam OpenLab and splits her time between New York and San Francisco. Building on her fundamental research, she is working on a prototype machine learning system and attempting to integrate emotions into AI processes in order to actively reduce harassment. Something we are in dire need of! That is why we are especially looking forward to Caroline Sinders’ talk at re:publica 2017 on the possibilities of creating emotional data corpora for social media platforms, in order to build transparent, participatory systems.
Image credit: Caroline Sinders