Exploring Ethical Trade-Offs in Social Media Research
Organizers: Casey Fiesler, Stevie Chancellor, Katie Shilton, Jessica Vitak, Michael Zimmer
Over the past decade, social media and user-generated content platforms have increasingly become rich research sites for the study of both computation and human behavior. This new source of pervasive human data has also sparked discussions within the research community about ethical challenges, and high-profile examples have raised public awareness of these challenges as social media scholarship gains greater visibility. However, the research community lacks clear norms, and disagreement often comes down to how to identify and weigh potential benefits and harms. The goal of this workshop is to explore the most pressing ethical dilemmas within social media research and how the ICWSM research community can best consider the ethical implications of our research and methods without compromising important work. Workshop participants will have the opportunity to shape a set of working guidelines to help researchers think through the ethics of social media research methods. [Full Proposal]
This will be a half-day workshop taking place during ICWSM 2018 at Stanford, CA on June 25, 2018. We invite proposals from researchers in both academia and industry, and welcome a wide range of ethical and disciplinary perspectives as well as topically relevant, domain-specific issues. We also welcome position papers that argue that the benefits of certain types of research outweigh potential harms.
Submissions Deadline: April 5
Notifications By: April 19
Workshop: June 25
Targeted Areas of Interest: In addition to broad areas of interest, we also invite participants who want to engage with the intersection of ethics and the following areas:
- Transforming informed consent to the social sphere
- Anonymization of data and emergent privacy issues
- Legal implications and obligations
- Algorithmic accountability
- Fairness and transparency in machine learning and computational social science
- Societal implications of social media research
- Research on sensitive and vulnerable populations
- Ethical implications of data mining
- Methods selection and ethics
- Industry and academic research
We will accept submissions in the form of either formal position papers or lightweight statements of interest. Submissions should be no more than 4 pages (no minimum) and can be formatted in any style (though please submit a PDF). Submissions should be on the topic of ethics in social media research, including but not limited to:
(1) studies or works-in-progress;
(2) description of a particular approach to ethics, supported by your or others’ work;
(3) cost-benefit analyses of particular research;
(4) case studies of ethical challenges faced in your own work;
(5) reflection on any of the topics mentioned above, or others related to the intersection of social media and ethics; or
(6) any statement of interest on the subject matter and why you would like to participate in the workshop
AAAI has discontinued publishing workshop proceedings, so accepted submissions will be shared only on the workshop website (or only with workshop participants, at the authors’ request).
Our intention is to make this workshop as inclusive of different ideas and experiences as possible! If you are interested at all in having conversations about these issues but do not have the bandwidth or material for a full position paper, please consider sending in a statement of interest.
Submissions should be emailed to firstname.lastname@example.org by the end of April 5, 2018. Please also feel free to contact the organizers at this email address with any questions.
Casey Fiesler is an assistant professor in the Department of Information Science at University of Colorado Boulder. She holds both a law degree and a PhD in human-centered computing, and her research focuses largely on forms of governance online, including social norms, law, and ethics. She has organized a series of research ethics workshops at conferences including CSCW, GROUP, and ICWSM. She is a member of the SIGCHI research ethics committee and part of the NSF-funded PERVADE (pervasive data ethics) project.
Stevie Chancellor is a PhD student in Human-Centered Computing at Georgia Tech. Her research uses computational methods, such as applied machine learning, to study deviant mental wellness communities online; her current communities of interest are pro-eating disorder communities. She is also interested in research ethics for large-scale data analyses. She helped organize the 2016 ICWSM ethics workshop.
Katie Shilton is an associate professor in the College of Information Studies at the University of Maryland, College Park. Her research explores ethics and policy for the design of information collections, systems and technologies. Current projects include leading the multi-campus PERVADE: Pervasive Data Ethics research project; exploring privacy-sensitive search for email collections; analyzing ethical cultures in computer security research; and building tools to facilitate ethics discussions in mobile application development.
Jessica Vitak is an assistant professor in the College of Information Studies at the University of Maryland and Associate Director of the Human-Computer Interaction Lab (HCIL). She currently evaluates challenges to networked privacy, data ownership, and ethics in social computing research (NSF Awards #0916019 and #1704369). She has organized numerous workshops for CSCW and CHI on topics related to privacy and ethics.
Michael Zimmer is an Associate Professor in the School of Information Studies at the University of Wisconsin-Milwaukee (USA), where he also serves as Director of the Center for Information Policy Research. He is a privacy and internet ethics scholar whose work focuses on digital privacy, the ethical dimensions of social media and internet technologies, and internet research ethics. Dr. Zimmer is a co-chair of the Association of Internet Researchers (AoIR) Ethics Working Group and a principal investigator on the PERVADE: Pervasive Data Ethics project.