On invitation from Facebook, ECAJ co-CEO Peter Wertheim (centre) attended consultations at the company’s South Beach office in Singapore, together with other representatives from South Korea, the Philippines, Japan, Myanmar and Thailand.
ECAJ co-CEO Peter Wertheim travelled to Singapore as the guest of Facebook to attend a two-day consultation on February 20-21 on the establishment of an independent External Oversight Board, which, it is proposed, will be empowered to make final decisions about contested content appearing on Facebook and to put forward recommendations about Facebook’s community standards, policies and internal processes. There were 38 participants from across Asia, Australia and New Zealand, including academics, media workers and civil society representatives. This was the first of a small number of consultations which Facebook has scheduled with participants from various regions of the world.
Facebook currently has 23 categories of community standards which may require it to remove a range of content, including hate speech, credible threats of violence, terrorist activity, incitement or advocacy of physical harm, and bullying and harassment. Each day Facebook receives an average of one million notifications from its 2.3 billion users all over the world reporting content that is said to breach its community standards.
Facebook currently employs 30,000 people to deal with the formulation and enforcement of its community standards, half of whom are tasked with reviewing specific complaints about content. The company has also developed, and is continually improving, artificial intelligence to detect and remove prohibited content electronically, regardless of whether a complaint has been made. Due to variations in language and culture, and the difficulty of assessing precise nuances of meaning, the electronic detection of hate speech is especially challenging. Nevertheless, Facebook has had a growing success rate in dealing with this problem.
The meeting considered and discussed several real-life case studies. Participants then debated and put forward recommendations about the precise functions of the External Oversight Board (EOB); the number of EOB members, their term of office, and the process and criteria for selecting them; the EOB’s caseload and the criteria for selecting cases to go before it; the processes for assessing and determining cases; the transparency and consistency of the EOB’s decision-making; protections against infringements of the EOB’s independence; indemnity against loss; EOB recommendations about changes to Facebook’s policies and processes; and numerous other questions.
A final decision about all of these matters is expected to be announced by the end of 2019.
Commenting on the experience, Peter Wertheim said: “This was one of the most challenging meetings I have attended. The ECAJ has had its differences with Facebook in the past, so full credit to them for inviting a representative from the ECAJ to participate. The creation of an External Oversight Board is a hugely complex and ambitious undertaking and I commend Facebook for the initiative. A final judgment about the proposal will have to await the decisions which are ultimately made about the range of matters we discussed. However, I would like to think Facebook is doing this for the right reasons and that it has matured in its outlook, has a clear view of its moral and legal responsibilities about problematical content and takes those responsibilities very seriously. The ECAJ looks forward to our expectations being confirmed and exceeded”.