Caroline Sinders
Date and place of birth
New Orleans, LA
Current location
London, UK
Year(s) of residency and/or fellowship
2016–17, Project Resident

Caroline Sinders is a machine-learning-design researcher and artist. For the past few years, she has been examining the intersection of technology's impact on society, interface design, artificial intelligence, abuse, and politics in digital, conversational spaces.

Sinders is the founder of Convocation Design + Research, an agency focusing on the intersections of machine learning, user research, designing for public good, and solving difficult communication problems. As a designer and researcher, she has worked with Amnesty International, Intel, IBM Watson, the Wikimedia Foundation, and others.

Sinders has held fellowships with the Harvard Kennedy School, the Mozilla Foundation, Yerba Buena Center for the Arts, Eyebeam, STUDIO for Creative Inquiry, and the International Center of Photography. Her work has been supported by the Ford Foundation, Omidyar Network, the Open Technology Fund and the Knight Foundation.

Her work has been featured at Tate Exchange at Tate Modern, the Victoria and Albert Museum, MoMA PS1, LABoral, Ars Electronica, and the Houston Center for Contemporary Craft, and has been covered in Slate, Quartz, Wired, and other publications.

Sinders holds a master's degree from New York University's Interactive Telecommunications Program.

Artist Reflection: Emotional Data as a System

Caroline Sinders is a past Eyebeam resident through a joint residency with BuzzFeed's Open Lab. Her core project there, built with Angelina Fabbro, is a working prototype that uses commenting data from various journalistic outlets and social media data to mitigate harassment with machine learning.

During her residency at Eyebeam and BuzzFeed's Open Lab, Sinders explored ways to apply design to machine learning and personal data; here are three of the focus questions she delved into.

1. Understanding machine learning through better design

Within machine learning, the data that informs the algorithms is constantly changing while the structure of the product/piece/space/thing remains stable. The bare bones of the machine-learning structure are not altered significantly, since the structure of the algorithms and the data types stays constant; the actual manifestation of that data, however, often changes. Put more simply: machine learning is a lot like dealing with organisms or nature, where things are constantly shifting. I like to think of training machine learning as trying to teach a toddler to do something over a long period of time. Children can often hold onto arbitrary details, and machine learning can do the same thing with data sets. The organic nature of machine learning is incredibly interesting, and my open-source templates and prototype will be experiments into how machine learning can be better understood by all kinds of people. I'm calling this work "temperamental UI," an exploration of what happens to machine-learning UI and UX when the data sets start to change from user interactions.
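The idea above, that a model's structure stays fixed while its behavior drifts as user data changes, can be sketched in a few lines. This is an illustrative toy, not code from Sinders's project: the model here is a hypothetical one-dimensional nearest-centroid classifier, retrained as the incoming data shifts.

```python
def train(samples):
    """Fit the fixed model structure: one centroid (mean) per label."""
    return {label: sum(values) / len(values) for label, values in samples.items()}

def predict(centroids, x):
    """Classify x by whichever centroid it sits closest to."""
    return min(centroids, key=lambda label: abs(centroids[label] - x))

# Early user interactions: "short" messages cluster near 5, "long" near 50.
early = {"short": [3, 5, 7], "long": [45, 50, 55]}
model = train(early)
print(predict(model, 20))  # prints "short"

# Later interactions shift the data: "long" messages now cluster near 30.
later = {"short": [3, 5, 7], "long": [25, 30, 35]}
model = train(later)       # same algorithm, same structure, new data
print(predict(model, 20))  # the same input is now classified as "long"
```

Nothing about the algorithm changed between the two runs; only the data did, yet the same input gets a different answer. That drift is what a "temperamental" UI would need to surface to its users.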

Surveillance Cameras, Dan Phiffer and Caroline Sinders

2. Citizen surveillance of surveillance cameras

I'm photographing, identifying, and mapping all of the surveillance cameras in San Francisco neighborhoods, then running a neural net over the gathered visual and geographical data set. I'm interested in finding correlations in the demographics of surveilled areas. What will an algorithm pick up? Is it placement, location, colors, kinds of cameras? I started my career as a photographer, and now I want to take a closer look at the cameras that fill our cities. Instead of taking photos from Google's bird's-eye view, I'll try to stay grounded and photograph the cameras as people would encounter them. This project is part of a series of VR games called Dark Patterns that I am building with Mani Nilchiani.

Small Data, Dan Phiffer and Caroline Sinders

3. Ownership of the data that we create

Right now, our data is owned by the platforms we give it to, usually so they can resell it as big data to advertisers. What would an alternative model using "small data" look like, one where everybody owned their data in a platform cooperative? Dan Phiffer, an Eyebeam Impact Resident, and I are building a small series of prototypes on Raspberry Pis, and we're trying to work out the legal complexities of data collectivism. I'm excited to build something really small, something really, really small, something that isn't meant to scale up, and something that is meant to be egalitarian for users. It's like making a zine for a systems designer.

Surveillance Dark Patterns, Mani Nilchiani and Caroline Sinders

A patchwork virtual space created by Caroline Sinders and Mani Nilchiani.


Eyebeam Project Residency with BuzzFeed Open Lab (2016–2017)

Eyebeam models a new approach to artist-led creation for the public good; we are a non-profit that provides significant professional support and money to exceptional artists for the realization of important ideas that wouldn’t exist otherwise. Nobody else is doing this.
