What is the responsibility of a platform such as Facebook to its users?
This week, The Guardian published its investigation into the social giant’s content moderation practices, revealing through leaked documents the capricious nature of the task at hand.
The documents painted the company’s content moderation as something of a moving target, in which a split-second judgement is often required to decide whether something potentially offensive, graphic or even dangerous should stay on the site for purposes of awareness and education or be removed entirely. Another cause for concern was where the line fell on which posted threats could be considered credible.
It’s a complicated process, but when the engine of Facebook is the personal information that people share with it, it raises the question of what kind of transparency is owed to its users, especially around issues of safety and wellbeing.
Monika Bickert, Facebook’s head of global policy management, wrote an editorial for The Guardian in an effort to address those concerns and provide context for the changing standards behind those quick decisions.
With regard to that transparency, Bickert noted that while Community Standards are available to all, “we don’t always share the details of our policies, because we don’t want to encourage people to find workarounds.” But the boundaries around what is and isn’t appropriate remain blurry.
“These tensions — between raising awareness of violence and promoting it, between freedom of expression and freedom from fear, between bearing witness to something and gawking at it — are complicated, philosophical questions,” Bickert wrote. “Many organisations grapple with them, and there are rarely universal legal standards to provide clarity. Being as objective as we can is the only way we can be consistent across the world and in different contexts. But we still sometimes end up making the wrong call. … We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any.”
So what can any organization learn from Facebook’s stance?
While it’s understandable that you would want to keep proprietary practices under wraps, if a process is constantly evolving, take it upon yourself to keep your users or customers in the loop. Their insights could help refine a messy process or highlight issues that you may not see clearly from the inside. It’s also important to remember that admitting you’re wrong isn’t a death knell; it can put you in a better position going forward.