After years of growing dissatisfaction, the moderation problems of social platforms are finally hitting the bottom line. Last week, Facebook struggled to contain a growing advertiser boycott, with large companies like Unilever, Diageo, and Coca-Cola pulling all advertising from the platform for the rest of the year.
On Monday, the panic spread to other platforms. Reddit banned a number of subreddits, including the notorious /r/The_Donald community, as part of a new policy against identity-based hate. YouTube banned longtime white supremacists Richard Spencer and David Duke, along with their affiliates. And President Trump was temporarily suspended from Twitch after rebroadcast speeches were flagged as inciting racial hatred.
Most of these bans rest on policies that have been in place for a long time, but there is a reason they all landed at once. For months, a coalition of civil rights groups, including Color of Change, Sleeping Giants, and the NAACP, has been pushing advertisers to end their support for platforms that let racial hatred spread unmoderated. Known as #StopHateForProfit, the campaign originally focused on Facebook, but it has increasingly taken aim at social media in general amid a global movement against racist police violence.
The Verge spoke to Color of Change's senior campaign manager Jade Magnus Ogunnaike about what the campaign expects technology companies to do and how it is preparing for the long haul.
This interview has been edited slightly for clarity.
So far, #StopHateForProfit has primarily pushed advertisers to drop Facebook specifically. But on Monday we saw a number of moderation moves on other platforms that seemed linked to this campaign, dropping prominent white supremacists like Richard Spencer and David Duke. Do you see this as a result of the campaign's success?
I would connect a lot of it to demands we were making long before #StopHateForProfit. We have been pushing Twitter to throw Trump off the platform since 2018 because he spreads violence and hate. And it's the same with the /r/The_Donald subreddit. There are a number of others that were also taken down yesterday. This stuff has real ramifications for Black people and marginalized communities. That's why we met with Reddit's CEO as part of our coalition to discuss their policies. We still need all the details on how Reddit will enforce them and what its actual moderation structure looks like. We haven't seen any internal documents either. But their new policy is a big step toward a major social media community standard that rejects online hate through a civil rights lens.
The big problem overall, and it's a problem with Facebook, Twitter, Reddit, and Twitch, is how to build the infrastructure to do this all the time. You can make these small changes, remove subreddits, or label Donald Trump's tweets. But until these social media platforms have a civil rights infrastructure and employees trained to evaluate products and content for bias and discrimination, they will keep seeing these problems over and over again.
We think Twitch is a really good start. Twitch is the first platform to actually boot the president. Reddit is taking some good steps. But what's really important is that we get this civil rights infrastructure, with people who are actually trained to look at products and content and find the best ways to make them safe for Black and marginalized communities.
I'm curious what you would say about a company like Facebook, which has undergone civil rights audits and has this civil rights task force ahead of the upcoming US elections, but doesn't seem to be doing any better on these issues.
You're right. Facebook has carried out civil rights audits, conducted at the urging of Color of Change and other civil rights organizations. I think the results should be published soon, but they haven't been yet.
But we don't need a task force from a company in this position. What we need are C-suite-level executives with civil rights expertise to assess how policies and products contribute to discrimination, bias, and hate against users. We need a person who is committed to ensuring that the platform's design decisions account for their impact on marginalized communities and their potential for radicalization and hate.
I should also note that the Facebook task force is not a permanent staff. The task force is dedicated to the election, and on November 4th it will be gone. And we know that hate on Facebook will hit Black people long after the 2020 election.
As Sleeping Giants said yesterday, civil rights groups have literally been calling for some of these bans for years. Why is it all suddenly happening now?
I think we are at a turning point for racial justice. You can chalk it up to a number of different events. Part of it is the way the government and companies failed us after the coronavirus hit. But we also saw Eric Garner. We saw Sandra Bland. We saw Michael Brown. Black organizers in the movement have pushed our culture to this point. Six years ago, when Michael Brown was killed, you would not have seen McDonald's saying that Black lives matter. It was a radical statement. And the credit goes entirely to the movement's organizers, who have pushed culture so far that we can no longer ignore the killing of Black people by police and vigilantes.
Of course, companies still have a long way to go. We like to say that saying "Black Lives Matter" is great, but it is purely symbolic if Black workers are fired for talking about racism, or companies like Facebook refuse to invest in civil rights infrastructure, or Twitter lets incitements of violence against protesters stand. They undermine their statements with their actual actions, and that is exactly what the #StopHateForProfit campaign is about.
One of the concerns I hear about deplatforming from the left is that it creates a playbook that can be used against activists. Color of Change is currently pushing Facebook to get rid of white nationalist groups. Are you concerned that right-wing groups could use the same approach a few years from now to ban Black Lives Matter groups?
When we talk about deplatforming, we are deliberately looking at groups and content that encourage violence and harm people. A few years ago, you saw something like Unite the Right in Charlottesville: there was this massive white supremacist march around those Confederate monuments, and the next day a woman was literally killed. So we don't want to deplatform people who simply have different opinions. We want to remove harmful information. That can be vaccine misinformation or violent conspiracy theories like Pizzagate, which led to someone walking into the pizza shop with a gun and firing shots. These are not just people with different opinions. These are people who are violent and want to harm other people.
Richard Spencer really embodies the easy rebranding of white supremacy. Most people can understand why you don't want someone in a KKK hood spreading violent misinformation on your platform. That makes perfect sense to most people because the hood is so symbolic of hate, terrorism, and violence. But because Richard Spencer wears a button-down shirt, dress shoes, and gelled hair, he is suddenly no longer seen as a threat, and we are encouraged to look at both sides of the issue. In reality, he is pushing the same hatred and misinformation as the KKK. It has just been rebranded in new packaging.
I would also point out that Black organizers on Twitter, Facebook, and other social media spaces are already being deplatformed constantly. Black organizers who say "please don't kill us" or "defund the police" are routinely moderated more heavily than actual white supremacists who say Black people are genetically inferior. So I don't think it's a fitting comparison at all.
Where do you think the movement is going from here? What should companies do to fix these problems in the long term?
There is no quick fix. And that's why companies rush to the symbolic: you know, we removed some subreddits, or we published a statement. It's a big step, but there are no quick fixes for the intersection of racism and capitalism. There are no quick fixes for companies like Reddit that have been shaped by racist culture from the start. We can't just cheer the fast stuff. Companies actually have to undergo civil rights audits. They need to examine how racism and discrimination show up at every level of the company.
There is no single thing you can do to fix racism at your company or to affirm that Black lives matter. What actually has to happen is that companies commit to a living wage for all of their employees, then invest in civil rights audits and gradually implement the changes they call for. As much as Black people wish we could snap our fingers and have companies affirm Black lives, not just in death but in life, that just won't happen. Companies have to be in it for the long haul.