RE: Case Number 2020-005-FB-UA
We thank the Oversight Board (the “Board”) for the opportunity to contribute comments as it deliberates these important issues concerning the future of free speech and social media. Recognizing the Board’s unique challenges, our comment discusses aspects of the case on appeal as well as more general aspects of the process of public participation in these proceedings.
This comment focuses on four points. First, the information provided to solicit public participation is insufficient to allow for meaningful discussion of the nuances surrounding the subject posts. Second, the “Dangerous Individuals and Organizations Policy” (the “Policy”) is too broad when applied to historical figures. Third, the Policy is inconsistently applied, rendering it ineffective and counterproductive. Fourth, retroactively applying the Policy to a “Memory” undermines Facebook’s credibility in content moderation.
I. Case Descriptions Have Insufficient Detail
The public was provided a 113-word description of the subject post, told the violating content was a quote which the Board summarized, and given a paraphrase of the user’s argument on appeal. Without more context, we cannot determine the user’s intention or the impact of the post; the public must instead rely solely on inference and speculation. This weakens the likelihood that the Board will receive substantive responses on which it may rely to make meaningful policy recommendations. To elicit substantive responses through this comment process, the Board should provide as much detail as possible. In this case, the public should be able to view the exact quote, its visual depiction, any accompanying text, and any subsequent comments.
II. The Dangerous Individuals and Organizations Policy is Too Broad
The Policy was written with the stated rationale of “prevent[ing] and disrupt[ing] real-world harm.”[1: Dangerous Individuals and Organizations, Community Standards, Facebook, https://bit.ly/3qFuMzY; see also Barbara Ortutay, Facebook bans ‘dangerous individuals’ cited for hate speech, AP (May 3, 2019), https://bit.ly/3lTVh0O.] While decisions to remove individuals may be controversial, Facebook’s use of the Policy to remove active users whom Facebook has deemed harmful is discernible (e.g., Alex Jones and Louis Farrakhan).[2: Id.] These are living individuals advancing points of view in violation of Facebook’s policies.
Extending the Policy to historical figures, however, regardless of how abhorrent their points of view, is fraught with problems. There is a clear difference between the acts of acknowledging, referencing, and promoting. Even the darkest parts of our history must be acknowledged. If a reference to a Nazi leader is made to caution against current trends, this is far from promotion.
Moreover, it is not clear what standard Facebook uses to determine whether a historical figure is a “dangerous individual” under the Policy. The Policy does not appear to allow for differentiation among notable historical figures who were responsible for both achievements and atrocities in their own time. Genghis Khan, for example, is consistently ranked as one of the world’s most influential leaders.[3: Joel Stein, The All-Time TIME 100 of All Time, TIME (Apr. 18, 2012), https://bit.ly/3lPHF6R.] In his own time, however, he was responsible for the deaths of eleven percent of the world’s population.[4: Evan Andrews, 10 Things You May Not Know About Genghis Khan, History.com (Jul. 29, 2019), https://bit.ly/2Iod02G.] Still, a search for “Genghis Khan” returns public promotional posts from Facebook accounts such as 23andMe and The British Library.[5: See, e.g., 23andMe, Genghis Khan genetic legacy has competition, Facebook (Jan. 26, 2015), https://bit.ly/36vpMWs; see also The British Library, Onthisday, Facebook (Aug. 18, 2015), https://bit.ly/2Vr1TsG.]
Applying the Policy’s text across the board, using specific acts of egregious conduct as the standard for removal, implicates a large number of other historical figures referenced in user posts. Examples include Che Guevara (“multiple murderer”), Pontius Pilate (“mass murderer”), President Andrew Jackson (“human trafficker”), and Joshua of Jericho fame (“mass murderer”), each of whom is connected with events that fall within the broad definitions laid out by the Policy.
III. The Policy is Applied Inconsistently
Extending the Policy to historical figures is problematic enough. Facebook’s inconsistent application of its policies to living individuals, however, renders the Policy ineffective and counterproductive. While Alex Jones has been banned from Facebook under the Policy, his presence on the platform remains. Among a number of groups and pages dedicated to him, a “best of”[6: See, e.g., Best of Alex Jones, Facebook (visited Dec. 7, 2020), https://bit.ly/37I8ggZ.] page and a “meme”[7: See, e.g., Alex Jones Memes, Facebook (visited Dec. 7, 2020), https://bit.ly/2VOf2fs.] page both exist in violation of the Policy. The same is true for posts referencing Joseph Goebbels. A quick search of the platform returns a number of pictures of the Nazi figure as well as videos of Goebbels giving speeches.[8: See, e.g., Life.com, Behind the Picture: Joseph Goebbels Glares at the Camera, Geneva, 1939, Facebook (Oct. 29, 2017), https://bit.ly/33UFjgH; see also On this Day, Excerpt from Joseph Goebbels February 10th 1933 speech at the Berlin Sportpalast, Facebook (Feb. 10, 2020), https://bit.ly/3mVnl5j.] These posts also include direct references to Goebbels and members of the current US President’s staff.[9: See, e.g., Travelling Curmudgeon, Facebook (Feb. 12, 2017), https://bit.ly/2K4Jyiw; see also Lucid Nation, Facebook (Sept. 16, 2020), https://bit.ly/3qJZ7gP.] In fact, the President-Elect has himself used a misquote of Joseph Goebbels to compare the current President to the Nazi propagandist.[10: Evan Semones, ‘He’s sort of like Goebbels’: Biden compares Trump to Nazi propagandist, Politico, https://politi.co/3ouhRPe.]
Not only is the Policy an ineffective means of achieving its stated rationale; it also likely fuels speculation that the removal of content under the Policy is animated by other considerations, leaving users feeling singled out and marginalized.
IV. Retroactive Application is Unhelpful
As the Librarian of Congress noted a decade ago, when the Library of Congress announced it would archive public tweets, “The Twitter digital archive has extraordinary potential for research into our contemporary way of life.”[11: Twitter’s Gift: Library to Archive Groundbreaking Social Network, LOC (May 2010), https://bit.ly/3qADvTV.] Our social media feeds act as a collective memory, and erasing such content risks removing important pieces of our history.
Moreover, applying policies retroactively to harmful content that has already been allowed to remain for an extended period does little to prevent harmful effects. The post on appeal had been active for two years and received negative treatment only when it was reshared. While content that was acceptable when posted may become unacceptable as policies evolve, we recommend interstitials or hiding such posts rather than complete takedowns.
We thank the Board for the opportunity to provide comments.