
otso @ AoIR19: Trust in the System

Author: Dr. Andrew Quodling (Platform Success and Support Lead, otso)

otso and Max Kelsen recently had the opportunity to attend the Association of Internet Researchers 2019 (AoIR19) conference at the Queensland University of Technology (QUT). The 2019 conference marked the second time AoIR had been held below the equator, and the second time it had been held at QUT in Brisbane.

We were thrilled to be invited to sponsor the event alongside the conference’s other sponsors, including the Digital Media Research Centre (Queensland University of Technology), the Oxford Internet Institute (University of Oxford), Australian Community Managers, and Mozilla.

In their own words, the Association of Internet Researchers is a member-based, academic association dedicated to the promotion of critical and scholarly Internet research independent from traditional disciplines and existing across academic borders.

The annual AoIR conference is an interdisciplinary highlight of the research calendar for many researchers around the world. The theme of the 2019 conference was ‘Trust in the System’, which provided an excellent framing for many of the presentations, examining everything from artificial intelligence (AI) to social media and beyond.

Attending on behalf of Max Kelsen and otso, I had the chance to sit in on dozens of interesting talks during the event — too many to do justice in our blog — but I want to highlight a few here:


Luke Stark’s presentation on the ethical implications of systems for quantifying or describing emotion in AI and other technology systems provided a thoughtful, critical look at the use of emotion in our industry. His upcoming book project on the Quantification of Emotion and Human Feelings promises to be an interesting and much-needed examination of the risks and assumptions that are often made in the computer-aided evaluation of emotion.

As he explains, these issues are particularly vexing, given an ‘under-theorisation’ of emotion, the proxying required to attempt to quantify emotion, and the ethical dilemmas of managing emotional data as a source. (Consider, for example, whether emotional data should be treated like behavioural data, social data, or physiological data.)


As social platforms like Facebook, Twitter, Instagram, and dozens of smaller platforms become more meaningfully entwined with everyday life and our traditional systems of politics, it’s important to look at the impact of the moderation practices of these platforms and their influences on social values and democratic processes.

The two ‘Content Moderation and The Power of Platforms’ panels, featuring researchers Tarleton Gillespie, Ysabel Gerrard, Robert Gorwa, Ariadna Matamoros Fernández, Elinor Carmi, Patricia A Aufderheide, Sarah T Roberts, Aram Sinnreich, Nicolas Suzor, and Sarah Myers West, provided a pragmatic, exploratory discussion of content moderation policies and practices beyond Facebook. An important element of this discussion involved rethinking, reframing, and engaging in substantive discourses around moderation, asking: “What are the goals that content moderation (or research and policy intervention) should seek to address?” and “How do we agree upon the social goals that moderation should serve?”

These questions are important, as the goals they seek to frame are currently set by the owners and operators of internet platforms. As the panel noted, we have a rich discourse about what it means to strive for equity in public speech, practice, and institutions. Consequently, the growing significance of digital platforms presents an important opportunity for researchers, policymakers, and technologists to revisit the assumptions and the often US-centric legal and cultural norms that are packaged into globally popular internet platforms.


Susannah Kate Devitt, Monique Mann and Angela Daly’s presentation about their recently published book ‘Good Data’ provided an excellent summary of the text — a helpful work of scholarship that problematises assumptions about ‘data’, and sets out key arguments to weigh when considering the data that underpins any form of research or development.

As they argued in their presentation, the need for this book was presaged by the existence of too much ‘Bad Data’ — that is, unethical data gathering or usage, especially for unaccountable or discriminatory practices like facial recognition or ‘robo-debt’ (i.e., the controversial Centrelink Debt Recovery system). ‘Bad Data’ is a political issue that particularly encompasses the ways in which industries and governments can and do use data to marginalise, or to significantly affect those who are already marginalised.

In comparison, ‘Good Data’ is also political — it should be measured by the degree to which it is created and used to increase the wellbeing of society — particularly to empower the marginalised. It should be led by citizens and in turn, it should lead to further empowerment of citizens — and as such, it is integral to democracy. Given this, they argue that considerations like information security, encryption, and anonymity are instrumental in combating the use of data to disempower citizens. Where appropriate, data should be published, revisable, and help to form useful social capital.


There’s perhaps too much to say about my experience at AoIR19 in the space of a short blog post like this, but I’d like to take a moment to strongly recommend it as a conference to attend.

The researchers were welcoming and thoughtful, and the interdisciplinary focus of AoIR allowed for an incredible melting pot of ideas, which offered attendees a fantastic opportunity for critical thinking about technologies and the lived experiences of humans in an era of profound technological change.
