Facebook’s handling of content published on its platform has long been discussed in academia and among the public at large, especially given the spread of so-called fake news and hate speech. Following the German legislator’s action with the Network Enforcement Act (NEA) and similar regulations being discussed at the European level in the context of the Digital Services Act (DSA), Facebook has now also taken action. With the Facebook Oversight Board (FOB), the US company has created a body that, as an independent supervisory authority, is designed not only to review the current decision-making practice in dealing with published content, but also to advance proposals for improvement. Following its announcement in 2018, the FOB’s development has been monitored by experts from 88 countries. Around 1,200 public submissions with suggestions on the FOB’s design were considered before the body’s final statute was published in September 2019. Although to some extent still under construction, the FOB officially began work on 22 October 2020 and announced its first decisions on 28 January 2021.
In its first year of operation, the FOB has grown from 11 to 19 members, though it still remains well below the target of 40 members set out in the statute. According to the statute, members should be familiar with matters of online content and digital media, and should “have and demonstrate a broad range of knowledge, skills, diversity and expertise” while reflecting the “diversity of the Facebook community”. This goal has been met: the members come from 16 different countries, with the United States being the only country represented by more than one member, namely four. European Union member states are represented by a total of three members, from Denmark, Hungary and France. Of the nine women and ten men on the board, fourteen have a legal background, according to the FOB’s own information, but journalists, media scholars, politicians and activists are also represented.
The FOB’s core task is to act as a second instance in decisions on content, either following an appeal by users in the context of a complaint or when Facebook itself submits cases to the board. It checks whether the decision made by Facebook employees during the content moderation process is consistent with Facebook’s content guidelines and values. Furthermore, the board may issue statements that include recommendations regarding Facebook’s content policies. The FOB thus focuses on questions of freedom of expression. Other sensitive and much-discussed areas, such as Facebook’s news feed ranking or political advertising, do not fall within the FOB’s remit.
Looking at the statistics for the first working year, 21 cases were accepted and 18 decided; a case with German involvement, though, has not yet been heard by the FOB. In eleven of these cases, the FOB did not uphold Facebook’s original decision. Four cases were submitted to the FOB by Facebook itself, including the decision on whether the blocking (so-called de-platforming) of former US President Trump’s account was permissible under Facebook’s community standards. The remaining 17 cases trace back to users’ complaints. To a large extent, the decisions concerned hate speech, but the FOB also dealt with cases involving incitement to violence or the attempted sale of “regulated goods” such as drugs.
The FOB’s decision on the compatibility of Donald Trump’s de-platforming with Facebook’s rules has certainly received the widest media attention, even though it was an atypical case that Facebook did not necessarily have to submit to the FOB. Beyond this prominent case, however, the FOB has decided a number of other noteworthy cases. For example, on 28 January 2021 it announced a decision in a case that is also significant from a German point of view. In the run-up to the US presidential election, a user had shared a quote attributed to Nazi propaganda minister Joseph Goebbels. The quote put forward the thesis that it is more effective to appeal to voters’ emotions and instincts than to their intellect, and that truth must be subordinated to tactics and psychology. While the user intended the quote to draw a comparison with Donald Trump’s presidency and electoral campaigning, Facebook deleted the post with reference to its guidelines on dangerous persons and organisations, arguing that the user had not made it sufficiently clear that he did not support Joseph Goebbels. The FOB did not share this view and ruled that the post had to be reinstated, basing its decision primarily on the incompatibility of Facebook’s decision with international human rights standards. It criticised the vagueness and indeterminacy of Facebook’s rules and found the deletion disproportionate due to a lack of sensitivity to context.
Drawing on international human rights standards, whether to supplement or even replace Facebook’s own guidelines and values, is a tendency that characterises not only this but also other FOB decisions. Incorporating these standards could be an important step towards reinforcing the board’s separation from Facebook and securing its independence in the future. Moreover, it could help the FOB to develop globally uniform standards for what may and may not be communicated on Facebook, if such a global solution can be found at all.
Ever since Mark Zuckerberg announced his intention to set up the FOB, the project has faced criticism. Above all, it has been accused of being merely a diversionary manoeuvre intended to forestall more far-reaching regulation, such as that which could be imposed by the DSA. It is also argued that the FOB’s freedom and its influence over Facebook are too limited to bring about lasting change, especially because important issues, such as the use of algorithms, are not covered by its mandate. Furthermore, it is questionable whether Facebook will implement all of the FOB’s decisions; there is a risk that Facebook will ignore unwelcome FOB decisions or abandon the board altogether. In a way, Mark Zuckerberg represents the FOB’s constitutional authority, and future developments will therefore be fascinating to observe.
Despite these points of criticism, however, it should not be forgotten that with the FOB’s introduction users for the first time have an internal legal remedy through which they can take action against Facebook’s decisions without having to seek redress through the state courts. This is also reflected in the fact that the majority of cases decided by the FOB have been such user complaints rather than cases submitted by Facebook in its own interests. The fact that the FOB “overturns” Facebook’s original decision in many cases is also a good sign for users. In this respect, the FOB’s introduction is in any event a step forward compared to the opaque status quo and thus bodes well for the board’s future development. Following recent revelations, it is also a positive sign that the FOB has decided to meet with whistle-blower Frances Haugen; the board wants to use her experiences and insights to set a stronger tone regarding transparency and accountability. In doing so, though, it is important to keep in mind that while the FOB, as an internal compliance mechanism, can improve the protection of users’ rights, it can only complement and in no way replace legal protection by the state.
The blogs published by the bidt represent the views of the authors; they do not reflect the position of the Institute as a whole.