It would be no exaggeration to say how much I enjoy doing interviews. And not for the sake of doing them, but for the value they can bring to what you’re about to read. My goal here is not to educate or force my opinion on anyone. My role as a journalist, interviewer or simply a film critic is to ensure that after reading, you come away with enough information to process and form your own opinion, especially when the subject is intimate, important, but also painful to digest.
It took me a while to process everything “The Cleaners” had to offer. It has no compassion and makes no attempt to find an easy way to present the ugly truth of what we are and who we are. Indeed, when a documentary takes on the question of what content can go up and what cannot, we, as responsible and law-abiding citizens, must think carefully about what we wish for. Because at the end of the day, every expectation may fail or succeed. But one thing will remain unchanged – human nature.
As part of the Hot Docs Film Festival, I had the great pleasure of discussing what privacy means, security, social networks, the content we share, and why the content moderator’s job is so important. My interlocutor was Sarah T. Roberts, a researcher and assistant professor in the Department of Information Studies at UCLA – and I am extremely happy it was her, because if there is one person who could offer the best answers, or at least the sharpest point of view, it is her and no one else.
MOVIEMOVESME: Being a content moderator is a deeply troubling job that can have a lasting impact on so many levels. And yet it’s a very important job – letting someone moderate us. How do you see it?
Sarah T. Roberts: So I mean, it’s definitely a complicated kind of role that these workers have, and you just gave a great example of how they’re situated, often between some really divergent and even opposing forces. On the one hand, they are often charged with not being so overzealous that they impinge on people’s ability to freely share and express ideas. On the other hand, they have a role to protect people, and usually their primary mandate is to protect the platforms from liability as well. So they’re often balancing these roles and responsibilities, some of which actually conflict with each other, and they therefore have to decide what is most important – what is the best choice they can make, in a situation that may not have a lot of good choices.
So, as you pointed out, their work tends to put them in a situation where they are exposed to the kind of material most people wouldn’t ever care to see. And so, in addition to the work being speed-driven and highly demanding, with a lot of expectations placed on them, they have this other element of having to bear the psychological stress of what they see. And the final factor is that they’re often pretty lowly paid, and they don’t have a lot of support – mental health support or other kinds of support – for the work that they do. So it’s very challenging.
MOVIEMOVESME: I was in New York for the Tribeca Film Festival, and as I was walking around the Times Square area at 5:30 a.m., I found so much trash. I could easily compare the street – its cleanliness, or its dirtiness – with what we’re doing with social networks. Do you think I’m right, or do you see it differently?
Sarah T. Roberts: Well no, I think there really is a comparison to be made, in a lot of ways. As you pointed out, it was 5:30 in the morning, which is why you saw those people – you might not have seen them at other times of day. In other words, the workers who do the cleanup, the cleanup crew – you know, if you work in an office building, the workers come to clean up at night when everyone’s gone home. It’s the kind of work that is hidden from most people’s view. And, frankly, you don’t necessarily notice when you come into an office that the trash has been emptied. But if it started overflowing, you’d notice that.
And there’s a similar thing going on with these workers online: a key element of the work that they do is that they do it in such a way that it’s not perceived. So it’s almost like we have an expectation that we can go online without being disturbed or upset by something really horrific. But that’s maybe not the natural state of things. It’s like that because these workers are constantly vigilant, and constantly searching for such material to remove it.
And yet we, the average users, don’t know that’s going on, and we can’t really value that work, because we don’t even see it. But by the same token, if they didn’t do that work, we’d miss it, probably very quickly.
MOVIEMOVESME: How did you become a part of this film?
Sarah T. Roberts: Sure. So a couple of years ago, I was invited to Berlin to give an academic talk on my research, and I’d been studying these commercial content moderators for eight years, the phenomenon and the workers themselves. And so, in the spring of 2016, I went to Berlin to share the results of that research, and I met one of the directors, Moritz Riesewieck, who was really, really interested in my research. And he came from a theater background, so he was not a university researcher, he was not an academic, he was someone who was an artist, quite frankly. And he had already started investigating this topic from that artistic perspective, so we ended up sharing a stage a couple of times while he was in Berlin, where I would present my academic perspective on this phenomenon, and he complemented it with his artistic take.
And at that point, he had visited the Philippines, and was making some really interesting connections between its historical and cultural position – vis-à-vis Catholicism, for example – and the work that these folks were doing. For me as an academic, it was very strange, and exciting, and inspiring to see a person from another kind of expressive domain taking on the same topic, because he was able to see things, and present things, in a way that was beyond my own skill set. I’m not poetic, I’m not an artist – I just have a different kind of relationship to this topic.
So to see him artistically interpret it was really amazing. And at that time – this was just a couple of years ago – he started having this idea that his approach could be turned into a documentary film. So I’ve been in touch with him since the beginning of his idea to transform his own vision and his own take on this into the format you’ve now seen, which is “The Cleaners.” And of course he has his co-director, Hans Block, who works with him. So I’ve known those guys for a couple of years now. And it’s been fascinating to watch this process and to see the film, which I think is profound, and which I find quite moving.
I was just telling them, you know, I’ve seen the film at least four times now. And I just saw it again the other night in Toronto. And I started crying, you know? I know the film, I know the stories – I myself have collected similar stories over the past eight years – and yet the immediacy of the documentary format and the cinematography is great. The protagonists in the film are amazing. It just moved me. So that’s the power of that medium, and that’s the power of having this work translated artistically. And it’s really been a privilege to be alongside that, to serve in an advisory capacity, provide background, be interviewed, and so on for the film itself.
MOVIEMOVESME: I saw the film once and it made me feel sick to my stomach. It’s just so unbearable to watch – but in a good way. And I had so many nightmares afterwards, I’m not kidding. It took me two or three days…
Sarah T. Roberts: Yeah, I know.
MOVIEMOVESME: With all this talk about social media, do you think that social networks themselves – Twitter, Facebook, Reddit, all the platforms – are the best gift to humanity, one we just don’t know how to use properly yet?
Sarah T. Roberts: I think that we are at a moment right now, as we speak, of great collective questioning about the role of those platforms. What role and impact should they have in our political life? What role and impact should they have in our social life, our interpersonal life? I just saw yesterday that Facebook announced a big dating initiative – it wants to be involved in people’s dating lives – and the reaction hasn’t been great today. So I think these networks have had a decade or so of really unfettered and unquestioned growth, and they have hit a point where the public is no longer just going to accept whatever comes along. They really want to think it through. In that sense – and this is work that I’ve been doing for eight years, as I said – I find myself having a much easier time starting the conversation from a place of: okay, not everything is great about social media, about our investments in social media, and about the way that we spend our time and energy and effort. And that’s a conversation that I can have differently than even just two years ago.
I think the movie is also really timely in that sense. We have a lot of alternatives. We have a lot of other models and other ways. Even when I was in Toronto on Tuesday night, talking to the crowd at the theater, I said, “Look, we’re sitting here in Toronto, which has one of the most robust public library systems that I know of – 100 branches in the GTA.” And that’s a great model for information dissemination and access that sits somewhere else. It does not sit in the hands of a private corporation. It is not controlled by shareholders. It’s flawed, it has issues, its history is not spotless. But it’s a different kind of orientation, and it offers up opportunities for everyone to feel like they have a stake, in a way that I think the social media industry has really erred. It has not been transparent. It has not made people feel like they understood how to act and be a part of it, or how their safety and privacy were being safeguarded. All of those things are principles that an institution like a public library understands and is able to articulate.
So I think we have other models, and there are models to come. People will invent new ways, rethink the status quo, and offer up alternatives. I’m not a programmer, I’m not a developer, I’m probably not the person who’s going to do that. But I’m hopeful, and I do think the moment is ripe for those kinds of things. I also think it’s a great moment for the industry leaders to rethink and reflect. I think they’re trying to do that; unfortunately it’s not always really successful, and they’re just getting started on what really needs to be a long-term process. We have to wait and see, and, as I said the other night, we can’t just relinquish our agency, our creativity, and our ability to come together collectively because we are in awe of technology. These are things that human beings are building, and they reflect certain values. If those aren’t the values that mean something to you, what can you do about it? How can we agitate to get something else?
MOVIEMOVESME: So many people want to be protected from any type of harm or violence, but no one wants to give up their privacy. For instance, law enforcement had tried for over 40 years to capture the Golden State Killer, who ended up being a former police officer. He was captured through DNA shared on an ancestry website. So where’s the right balance between privacy and security? Why can’t we just accept that sometimes we have to give up some privacy in order to be protected? What is your take on that?
Sarah T. Roberts: Well, I think this is an issue of calibration. It is one of balance, and one of tension. So you have, as you very well put it, these poles competing to see which side will win out in a given context. And of course, typically I’m talking about the American context, or the Canadian context, or certainly the Western context – and that’s a very limited scope. Even in that context, the political situation can change on a dime. So when people talk about what the alternative to a privatized Facebook is, some people say it ought to be nationalized, and some people say, “God, the last thing that I want is the state to have that kind of power over my information.” I think one of the biggest problems we have right now is that control over those tools, and over the information itself, is very, very concentrated – whether it’s in the hands of law enforcement and really no one else, or in the hands of private corporations and no one else.
So I think some of those tensions could be eased by what I was talking about before, which is a democratization of options – so that there isn’t only one kind of place to put your data, or only one kind of place to do something with it. Of course, the other piece that has to come with it is people’s own media literacy and data literacy, which is something that we have to build socially. I’ll tell you, in the case of the Golden State Killer, I have actually followed that case obsessively for many, many years –
In a weird twist, yeah. For years I posted on message boards and all kinds of stuff. So this issue of how he was captured just points to a need: as we’ve been rapidly developing these incredibly, almost unprecedentedly powerful data tools over the past couple of decades, there hasn’t been an equal increase in thinking through their effects. That has always been an afterthought. And so I think cases like that, cases like “The Cleaners,” and others point to the fact that it’s not enough to just technologically innovate anymore without consequence. Somebody was recently reporting on this horrible predictive policing project that’s like something out of a bad sci-fi movie. This guy was presenting on it, and people really had problems with it. And he was like, “I’m just an engineer.” And I don’t think people are buying that anymore. It’s not okay to say, “I just build this stuff, I don’t think about how it’s used, or how it could be used down the road.”
So that kind of thinking about technological effects is something that we’re going to need more of. It does exist, but it certainly hasn’t had the prominence or equal weight of the technological developments themselves.
MOVIEMOVESME: The movie you were talking about sounds like “Minority Report,” with Tom Cruise. Right now in Chicago, they are using a system where an algorithm tries to predict, from data, who will be killed or who will kill. I actually like that idea, to be honest. I don’t mind if someone has access to my data – I’m not doing anything wrong – and as long as that helps tomorrow with any type of investigation, please have it. But that’s how I think, anyway.
Sarah T. Roberts: Yeah, I mean, different people have different levels of risk associated with their own identities, right? So unfortunately, those things aren’t treated equally – and this is where data scientists have let us down, because they treat all data as being equal, in the sense that they abstract away the context. I’ll give you an example. In this kind of predictive policing apparatus, there’s a particular researcher who builds these tools based on a database of gang affiliation. The gang affiliation database itself has been proven again and again to be flawed and inaccurate, racist, biased, etc. What happens when that serves as the basis for data analysis is that, through the tool, the context is simply erased – the actual source of the sorting of the information disappears, because it’s rendered into this neutral space through the tool.
Which is really what the content moderators are doing, right? They’re supposed to be acting “neutral,” invisible – they’re not supposed to be perceived, so that the decision-making processes that have gone into the outcomes don’t even get questioned. And I think that, in and of itself, is such a huge social danger, and we don’t even know the extent to which we’re going to have to be accountable for that in the future. So, we’ll see.