If there was any doubt about how big Facebook has gotten, CEO Mark Zuckerberg has now compared his company’s struggles over what kind of content to allow on its platform to the challenges cities face in mitigating crime.
“No one expects crime to be eliminated completely, but you expect things will get better over time,” Zuckerberg said in a conference call with reporters today.
That’s not the only government analogy Zuckerberg made. Alongside the call, he published a 4,500-word note about the issues Facebook has historically faced in policing content on its platform, as well as where it hopes to go from here.
In the note, Zuckerberg announced that the company is finally taking steps to create an “independent oversight group” next year, something he’s previously compared to a supreme court, which users could petition to appeal decisions made by content moderators.
There’s no question that when you’re dealing with billions of users publishing tens of millions of photos, videos, and text posts each day, getting more people involved in deciding what should and shouldn’t be allowed will help. However, Zuckerberg stressed that “there are a lot of specifics about this that we still need to work out.”
Facebook hasn’t clearly articulated what it believes a quasi-independent body could do that the company couldn’t do on its own, and this announcement suggests it is looking for a way to give users someone else to blame when things go wrong.
Let’s first take a look at how Zuckerberg describes the team Facebook currently has working on developing the policies of what is and isn’t allowed. From his Facebook note:
The team responsible for setting these policies is global — based in more than 10 offices across six countries to reflect the different cultural norms of our community. Many of them have devoted their careers to issues like child safety, hate speech, and terrorism, including as human rights lawyers or criminal prosecutors.
Our policy process involves regularly getting input from outside experts and organizations to ensure we understand the different perspectives that exist on free expression and safety, as well as the impacts of our policies on different communities globally. Every few weeks, the team runs a meeting to discuss potential changes to our policies based on new research or data. For each change the team gets outside input — and we’ve also invited academics and journalists to join this meeting to understand this process. Starting today, we will also publish minutes of these meetings to increase transparency and accountability.
When asked during the call what types of people Facebook would like to put on this independent oversight board, Zuckerberg and Facebook’s global head of policy management, Monika Bickert, said they hope to staff this group with academics and experts who have experience dealing with issues of hate speech, freedom of speech, and other relevant topics.
But if the independent board will be made up of individuals with expertise comparable to that of the internal policy team, what improvements is it expected to bring to the table?
The idea is that this board could make decisions about what type of content to keep up or remove without regard to Facebook’s commercial interests. But within even the loosely established parameters Zuckerberg and Facebook have set for this independent body, it’s difficult to see how that would work.
For example, Zuckerberg stressed that Facebook would still handle the initial decision about whether to take down a piece of content or an account, and would also handle the first appeal. If a user appealed a decision and was dissatisfied with the outcome, then they would have the ability to lodge an appeal with this so-called supreme court. In other words, the board couldn’t make Facebook take down content that hasn’t already been flagged by internal review.
In this scenario, Facebook wouldn’t need to worry about appointing people who could harm its commercial interests, because the parameters would ensure they could do very little to oppose those interests: the panel would only be able to respond to individual user requests, not push for sweeping changes.
I’d be more receptive to the idea that an independent board might make for a fairer content moderation process if Facebook had been more upfront about its own shortcomings, but so far it hasn’t been.
On the question of how Russia-linked trolls gained access to the platform, Facebook’s repeated talking point has been that it was “too slow to act.” Zuckerberg infamously dismissed the idea that fake accounts and disinformation may have affected the 2016 U.S. presidential election as a “crazy idea.” And Facebook has struggled over the past year to explain the rationale behind some of its content moderation decisions, initially saying that notorious troll Alex Jones hadn’t violated its policies, only to remove his account after companies like Apple and Spotify banned his podcasts.
When asked today by a reporter why he thinks he’s still the best person to fix Facebook, Zuckerberg responded, “I think we’re doing the right things to fix the issues … I don’t think me or anyone else could come in and snap their fingers and have these issues resolved in a quarter or half a year.”
Facebook’s standard response to criticism has been that it’s operating as well as anyone could be expected to when dealing with problems of foreign propaganda and hate speech exacerbated by technology.
The company has refused to concede that issues with its business model or executive hires may have made these problems worse, and it has rejected the notion that being broken up or regulated might go some way toward mitigating them. But a willingness to accept brutal criticism is needed if an independent board is going to have more than a cosmetic effect.
It seems to me that the push to create an independent oversight board is designed to accomplish two things: first, to stave off government regulation for as long as possible, and second, to give trolls someone else to gripe about when they object to Facebook taking down Alex Jones’ accounts or “shadow-banning” conservative news sites.
Facebook needs help from an independent body to correct its course, but unless the so-called independent board is given real teeth and carte blanche, it is unlikely to be up to the task. If Facebook isn’t careful, further course correction could come from governments instead, whether in the form of antitrust action or privacy regulation akin to GDPR.