OVERVIEW

Coming to CMU, I knew I had to get involved in social computing research. I've always been interested in human communication, and my recent experience in Japan left me fascinated by the way online communities grow. When I had the opportunity to study moderation, I knew I could contribute to a body of work that gives us a better understanding of social groups online. My work is advised by Geoff Kaufman as part of the eHeart Lab at CMU.

Problem:

Is there more to moderating online communities than simply figuring out how to deal with misbehavior?

Solution:

I was part of a team of researchers who interviewed 56 moderators of online communities on Twitch.tv, Reddit, and Facebook to develop a model of moderation and community growth.

Project Details:

Academic research, 1-year duration, 3 team members

Methods

Semi-structured Interviews

Qualitative Coding

Responsibilities

Interview Question Writing

Participant Recruitment

Moderator Interviews

Data Analysis

Deliverables

1 Interview Protocol

15 Facebook Moderator Interviews

1800+ Data Chunks

Literature Review

Research Paper Published

Our paper was recently published in New Media & Society, a top communications journal.

PROCESS

Why a large-scale qualitative study of moderation?

Previous work on moderation has focused on punishing misbehaving users. As a consequence, tools exist to make it easier for moderators to time out, ban, or silence users who violate community norms. While work in automation is important, our research focused on situating moderation within the context of community growth. Our results suggest that moderation is complex and social, and that algorithmic methods of punishment alone are not enough to meet moderators' needs. We hope our findings inform the future design of social platforms.

For this study, I worked closely with a PhD student on designing interview questions, conducting interviews, analyzing data, and writing the paper. We chose qualitative methods as our bread and butter for gathering data. Sure, interviews take time to conduct, achieving a high inter-rater reliability score can be incredibly difficult, and the data can be messy to interpret. Yet despite these challenges, we wanted to take an in-depth look at moderators in a way that hadn't been done before.
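To make the reliability hurdle concrete, here is a minimal sketch of how two coders' labels on the same data chunks can be compared. Cohen's kappa is assumed as the agreement statistic, and the code labels are hypothetical; neither is taken from our study.

```python
# Minimal sketch: inter-rater reliability between two coders via
# Cohen's kappa, (p_o - p_e) / (1 - p_e). Labels below are hypothetical.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of chunks both coders labeled the same.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each coder's label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned to the same five data chunks by two coders.
coder_1 = ["norm-setting", "punishment", "mentoring", "punishment", "norm-setting"]
coder_2 = ["norm-setting", "punishment", "punishment", "punishment", "norm-setting"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # prints kappa = 0.67
```

Coding teams typically iterate on the codebook and re-code until kappa clears an agreed-upon threshold, which is a large part of why this step is slow.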

Using grounded theory, it took close to two months to code the 1800+ data chunks from our 56 interviews. While UX research in industry could rarely wait that long for data analysis alone, the focus on academic rigor and argumentation proved to be valuable research experience. In industry, I hope to strike a middle ground between rigor and speed.
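For readers unfamiliar with grounded theory, the sketch below shows the general shape of that analysis step: open codes on individual chunks are grouped under broader categories as themes emerge. The chunks, codes, and category names here are purely illustrative, not our study's codebook.

```python
# Illustrative grounded-theory-style pass: group open-coded interview
# chunks under broader axial categories. All names are hypothetical.
from collections import defaultdict

chunks = [
    {"id": 1, "code": "warning users"},
    {"id": 2, "code": "modeling good behavior"},
    {"id": 3, "code": "banning spammers"},
    {"id": 4, "code": "welcoming newcomers"},
]

# Axial coding step: map related open codes to higher-level categories.
axial = {
    "warning users": "reactive moderation",
    "banning spammers": "reactive moderation",
    "modeling good behavior": "proactive community-building",
    "welcoming newcomers": "proactive community-building",
}

themes = defaultdict(list)
for chunk in chunks:
    themes[axial[chunk["code"]]].append(chunk["id"])

for theme, ids in themes.items():
    print(f"{theme}: {len(ids)} chunks")
```

In practice this mapping is revised many times as new interviews come in, which is what stretches the analysis across weeks rather than days.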