Facebook’s Oversight Board’s first judgments overturn four moderation decisions

It overturned decisions relating to hate speech, nudity, dangerous organizations and COVID-19 misinformation.

Daniel Cooper
January 28th, 2021

Facebook’s Oversight Board has issued its first five judgments on the cases it selected for review on December 1st. The topics cover hate speech, misinformation around COVID-19 and the right of users to post non-sexual images of breasts without falling foul of moderation. In its initial findings, the board overturned four of Facebook’s moderation decisions, upheld one and issued nine policy recommendations.

The board overturned four decisions in which Facebook had taken down a post for contravening its policies. They included case 2020-002-FB-UA, where a Myanmar-based user’s post asking why there was so little outrage over the treatment of Uyghur Muslims was taken down. Moderators said the post contained language that was interpreted as contravening the site’s policies on hate speech. The board, however, said that taken in context the post, while “pejorative or offensive,” did not “advocate hatred” or otherwise directly incite violence.

Similarly, case 2020-004-IG-UA concerned a Brazilian user who posted eight images to their Instagram account as part of “Pink October,” a campaign in Brazil to raise awareness of breast cancer. The images depicted the symptoms of breast cancer and how to identify them, and several showed breasts and nipples, which Facebook has effectively outlawed as part of its nudity policy.

After the case was selected for review, Facebook reversed its decision and asked the board to dismiss it. The board, however, said it would continue to hear the appeal, and that a “lack of proper human oversight” raises “human rights concerns” about the site’s automated moderation. It added that the site should be clearer about its use of automated enforcement and revise its community guidelines to better explain the exception made for breast cancer images.

On a call to discuss the judgments, Oversight Board administration director Thomas Hughes and board co-chair Michael McConnell both said that the gap between Facebook’s public-facing community standards and its private moderation guidelines was too broad. Hughes said “users need more clarity and more precision from the community standards,” while McConnell described Facebook’s decision-making process as “a bit opaque.”

One thing that is likely to change in the future is a requirement for users to be clear about their intention when posting certain images. Case 2020-005-FB-UA featured a person publishing an image of Nazi propaganda minister Joseph Goebbels with an attributed quote describing how appeals to emotion, rather than logic, make for effective political communication. Facebook took the image down, citing the promotion of a dangerous organization, but the poster said the image was uploaded to highlight the similarities between the political discourse of that era and today’s.

The board said that Facebook’s rules needed to be clearer about its list of “dangerous individuals” and to better inform users “how to make their intent clear when discussing dangerous individuals and organizations.” 

On December 1st, the board said it would select cases based on their potential to “affect lots of users around the world.” Other factors it would take into account included whether a case was of “critical importance to public discourse” and whether it would “raise important questions about Facebook’s policies.” Each case would be ruled upon by a panel of five members, one of whom had to be from the “region implicated in the content.” Once a ruling is made, the board passes it to Facebook, which has seven days to act upon the binding resolution and 30 days to respond to the board’s policy recommendations.

The first case to be announced by the Oversight Board, coded 2020-001-FB-UA, was actually withdrawn from consideration on December 3rd, 2020. In a statement, the board said that the original post, beneath which a comment had been removed for violating Facebook’s hate speech rules, had itself been deleted. Because neither was still available on the platform, the board opted to choose a new case.

The case that replaced it, 2020-007-FB-FBR, relates to an image depicting a man holding a sheathed sword, accompanied by a reference to drawing the sword “in response to ‘infidels’ criticizing the prophet.” The text beneath referenced French president Emmanuel Macron, who is proposing laws to curb what he describes as “Islamist separatism.” The post was removed under Facebook’s policy on incitement to violence, on the grounds that it was an implied threat towards the president. A ruling on this case was not issued today, however, and is expected to be published in the coming days.

Before the rulings were published, a group calling itself The Real Facebook Oversight Board decried today’s judgments. The group counts academics, researchers and civil rights leaders, including the CEO of the Anti-Defamation League and the president of the NAACP, among its members. Other notable names associated with the group include early Facebook investor (and subsequent critic) Roger McNamee and Yael Eisenstat, who previously led Facebook’s election integrity efforts.

In a statement, the group said that the rulings were a “distraction from real, independent accountability,” and called the body a “PR effort” and “oversight theatre.” It added that Facebook repeatedly fails to tackle hate speech and disinformation across its platforms, enabling right-wing domestic terrorists to organize themselves in the wake of the January 6th attack on the Capitol building.

It added that the board’s make-up of “hand-picked experts, paid six figures each, ruling on a limited set of harms in a non-transparent manner” is hardly democratic. It said that the lack of true independence, the private hearings and the long lag between action and judgment are all signs that the system is unfit for purpose. Not to mention that the board can only examine specific cases, rather than the “deep systemic flaws that allow harmful content to pervade Facebook’s sites [sic].”

Shortly after the board’s judgments were published, Facebook VP of Content Policy Monika Bickert issued a response. She singled out one case in which a French user’s post criticizing France’s health policy was taken down under the site’s COVID-19 misinformation rules. Bickert said the ruling highlighted concerns that Facebook should be “more transparent about our COVID-19 misinformation policies.” She added that revised guidelines will be published soon, and that it’s important for clear guidance to be available.

Some of the policy recommendations will take longer than 30 days to implement, including the suggestion that appeals against automated moderation decisions be reviewed by a human. Facebook will also work to identify “identical content with parallel context” (similar statements made in similar circumstances), which can now be reinstated or removed based on the decisions published today.

The board also made it clear that we will not hear about the Trump campaign’s appeal against its account suspension for some time. On January 6th, CEO Mark Zuckerberg said that allowing Trump continued access to his account would likely “provoke further violence.” The board accepted the case on January 21st, and co-chair Michael McConnell said the Trump campaign had yet to submit a written statement in support of its appeal.

 
