Social media giant Facebook has been marred by controversy in recent years over privacy and information security concerns. After testifying before the United States Senate in 2018, CEO and co-founder Mark Zuckerberg began developing plans for how content should be governed and moderated on Facebook, in accordance with national and international laws as well as public concerns. The company has now announced further details on the structure of a planned Oversight Board.
The board will initially consist of no fewer than 11 members, eventually growing to 40 members from around the world, and will begin addressing cases in 2020. Facebook has called the board an independent body intended to "provide oversight of Facebook's content decisions", "reverse Facebook's decisions when necessary", and "be an independent authority outside of Facebook." However, questions have been raised about whether the board can truly be independent, given that its members will be paid through a trust funded by Facebook. In response, Facebook has said that endowing a separate trust structures the board in such a way that Facebook cannot revoke its resources in response to board decisions. The company claims the trust will be opened for other networks to join and fund in the future.
According to information from Facebook, the board will have the discretion to choose which requests or cases it reviews and decides upon, and will seek to consider cases with the greatest potential to guide future decisions and policies. For certain cases of exceptional importance with real-world consequences, Facebook will be able to send cases to the board for expedited review. Facebook will control which cases are submitted to the board, though the board members will decide which of those cases to accept. Users will be allowed to request a case review with a written statement, though many such requests may not be considered if they do not meet the board's criteria for review. Those criteria remain under consideration, though Facebook's suggestions include cases "involving a potential for real-world harm, cases without precedent, high-profile cases with international implications, highly emblematic cases, and content that deals with freedom of speech, hate speech, or terrorist propaganda".