
Facebook to reveal rules for removing controversial posts

26 April 2018

Facebook hopes to receive feedback that will help it improve the guidelines. While the guidelines themselves are not necessarily new, this is the first time they have been made public, giving users the opportunity to scrutinize how Facebook polices content. In retrospect, that long-standing reluctance to open its rules to scrutiny led Zuckerberg to neglect an integral part of growing an online community and led Facebook to stumble repeatedly on privacy standards, fueling the kind of suspicion that spurred Ted Cruz to spend his allotted questioning period demanding to know why Facebook had removed an appreciation-day page for Chick-fil-A.

In addition to publishing its full guidelines for what is and isn't allowed on the platform, Facebook has also unveiled an appeals process for anyone who has had content removed due to a violation. The company says it is publishing the guidelines, first, to help people understand where the social network draws the line on nuanced issues.

The guidelines encompass dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda, and disinformation.

In the guidelines, Facebook also lays out what it can do to help various groups of users.


Examples from the rules include a ban on videos of victims of cannibalism and a prohibition on attempts to buy or sell marijuana, prescription drugs and firearms on the site. Facebook said that, over the coming year, it will build out the ability for people to appeal its decisions.

Users across the world will no longer be left in the dark about why a certain post has been removed or blocked on Facebook. Moderators have removed the iconic Vietnam War photo of a naked girl fleeing a napalm attack (the photo was restored after protests from news organizations), deleted posts from activists and journalists in Burma and in disputed areas such as the Palestinian territories and Kashmir, and told pro-Trump activists Diamond and Silk they were "unsafe to the community". Bickert said the company plans to bring this effort to New Delhi in June or July this year. The guidelines have ballooned from a single page in 2008 to 27 pages today.

Facebook Inc. said it was able to remove a larger amount of content from the Islamic State and al-Qaeda in the first quarter of 2018 by actively looking for it. Appeals will be sent to a human moderator who will issue a decision within 24 hours.

Activists and users have been particularly frustrated by the absence of an appeals process when their posts are taken down.


Still reeling from the Cambridge Analytica data-harvesting revelations, Facebook is clearly acting fast to shore up dwindling user trust by spelling out its policies loud and clear. If your post is taken down, you'll be notified on Facebook with an option to "request review".

"We believe giving people a voice in the process is another essential component of building a fair system", Bickert wrote.

Another challenge, the company acknowledged, is accurately applying its policies to the content that has been flagged to it.

Asked whether making the policy public would reduce the number of posts reported or appeals filed, Bickert said that was a possibility, though the objective was to give people clarity.


While Facebook had previously provided a watered-down version of the guidelines its moderators use to police content, it has never before given users this level of transparency. Cambridge Analytica reportedly received information about users through a personality app developed by Alexander Kogan, a Cambridge University researcher, and went on to use it to help predict and influence U.S. voters. The new document goes into considerable detail about Facebook's efforts to keep the platform a safe environment.
