“Fundamentally, this is an experiment,” he says. “We don’t know what the best way to do it is. We don’t necessarily know how to make sure Facebook listens to us and that we can actually hold them to account. No one’s really tried this before at this sort of scale.”
Suzor talks a lot about holding the social media giant to account. He sees that as his primary role on the board: not only to hold Facebook accountable to its own policies on content and speech but to society’s broader expectations and human rights norms. Facebook’s policies have “a long way to come”, he says.
“Personally – not speaking for the board – I think there’s a mess all the way through the system,” Suzor says. “They often misidentify critical speech: speech by minorities and marginalised people trying to speak to power. These are often the people who get silenced because Facebook doesn’t really know how to tell the difference between important critical speech and other forms of speech that might on the face of it look like abuse and harassment.
“They’re also failing to protect people from the torrents of hatred, racism, misogyny and religious vilification that exist on Facebook. It’s not hard to find – they get promoted through our news feeds and search results. I think, ultimately, Facebook is going to have to rethink how it does all of its moderation to make sure it is able to do both of those things simultaneously.”
Suzor says there are varying views on the board about how Facebook should balance freedom of expression with protection from harm, and about the efficacy of censorship, and those big debates are mostly still to come. While others on the board are not First Amendment free speech radicals, “some people might view that more centrally than perhaps I do”, he says. “If anything, I pay a little bit more attention to those sorts of concerns [about racism and vilification] than some of the other board members.”
The Oversight Board pursues depth over quantity. Facebook receives thousands of complaints and appeals every day, but the board has made only 10 decisions so far, including last week’s ruling on Trump. The board has staff who find the most interesting cases that can set precedents and probe themes, a bit like the medical TV drama House. “I’m not sure about the analogy,” Suzor says. “I do like Hugh Laurie though.”
Typically, when the board takes a case it will assign it to one of three panels (a fourth panel works on case selection). There are five people on a panel. They start by deciding what information they will need from Facebook and what questions they are going to ask partner organisations and language experts; if necessary, they contract out for contextual advice about the region, culture and politics.
“In all of our decisions we’ve been stressing that context is really important to understand the meaning of what people say,” Suzor says. “Often these cases turn on a turn of phrase.”
So far the board’s cases have involved the removal of a post containing an anti-Azerbaijani slur (upheld), the removal of an Instagram post in Brazil involving nudity that was actually made to advance breast cancer awareness (overturned as soon as the board took the case) and the removal of a post involving blackface caricatures in the Netherlands (upheld).
Then there was Trump.
Facebook suspended the then president after removing two of his posts during the Capitol riot in January, including a video in which he said supporters should go home but repeated false claims about widespread voter fraud.
“I know your pain. I know you’re hurt. We had an election that was stolen from us,” he wrote.
On January 21, Facebook announced it was referring its decision to the Oversight Board. On Wednesday the board finally ruled, upholding Facebook’s decision but saying the suspension could not be indefinite. Facebook now has six months to decide a “proportionate response”.
“It was necessary for Facebook to take some action on January 6, but it had no clear policy to decide how long a suspension should be for,” Suzor says. “Facebook doesn’t have rules about how it suspends people. That’s crazy. We didn’t come to a conclusion about what Facebook should do, we just said they’ve got to figure it out.”
The decision has been criticised from all sides. Trump allies were outraged, while many Democrats felt it wasn’t strong enough. The man himself called it “a total disgrace” and said Facebook would pay a political price.
Suzor describes the ruling as strong, nuanced and rigorous. He is surprised that the board reached a consensus on such a controversial matter, even if there were pockets of disagreement. He reminds people it was only in May last year – at the same time Trump said of Black Lives Matter protesters that “when the looting starts, the shooting starts” – that Zuckerberg was telling reporters social media platforms should not be “the arbiter of truth” in politics.
Zuckerberg’s remark enraged Suzor and others. He felt it was a missed opportunity to do what Facebook eventually did – arguably “too late”: suspending Trump and taking a stand against incitement to violence.
Suzor says last week’s ruling should demonstrate that social media companies really can make a determination – in a considered and reasoned way – about what is acceptable on their platforms.
“When Zuckerberg says we don’t want to be the arbiter of truth, this decision shows it is possible to weigh up the evidence and come to a conclusion,” he says. “And I think that is such an important thing to do.”
with Reuters