
Facebook's closely guarded, secret guidelines for monitoring violence, hate speech, and revenge porn have been revealed for the first time after an investigation by The Guardian. They show the extremely fuzzy and imperfect line between what's considered dangerous material and acceptable content on the world's leading social network.

Internal documents obtained by The Guardian explore the murky waters that moderators and executives must wade through on a daily basis as they judge user-generated content. A Facebook user who exclaims "Let's beat up fat kids" gets a pass, but a commenter who urges "Someone shoot Trump" is taken seriously.

The Guardian's investigation, which rolled out on Sunday, arrives at a pivotal moment for the social media giant.

SEE ALSO: How to protect yourself when social media is harming your self-esteem

Facebook faces mounting public pressure to rein in the ugliest impulses of its 2 billion users, and shield our eyes from disturbing and triggering videos, such as murders and sexual assaults that are broadcast on Facebook Live. At the same time, Facebook says it's trying to respect its users' freedom of expression and avoid overt censorship.

Image credit: Carl Court / Getty Images

Yet, like many tech companies, Facebook has kept extremely mum on the details of its content moderation strategy, even amid the deafening criticism.

The Guardian talked to overwhelmed moderators and said it saw more than 100 internal training manuals, spreadsheets, and flowcharts that form the blueprint for how Facebook moderates issues such as violence, hate speech, terrorism, racism, revenge porn, and self-harm.

Other stories in the British newspaper's "Facebook Files" series explore how Facebook lets users livestream self-harm videos, and how the social media site is attempting to address criticism that it is a forum for misogyny and racism.

Mashable contacted Facebook for comment and will update this story with any response.


Facebook's trove of violence-related documents details the ever-growing array of terrible situations that moderators must navigate. Many moderators have said they find the policies inconsistent and confusing; for example, not all rape threats are treated equally.

"Facebook cannot keep control of its content," an unnamed source told the Guardian. "It has grown too big, too quickly."

Documents supplied to Facebook moderators within the last year included lists of comments that are considered unacceptable, and others that are allowed to remain. They include:

Credible Violence (Calls for Action):

UNACCEPTABLE: "Someone shoot Trump."

ALLOWED: "Kick a person with red hair."

ALLOWED: "To snap a bitch's neck, make sure to apply all your pressure to the middle of her throat."

ALLOWED: "Let's beat up fat kids."

UNACCEPTABLE: "#stab and become the fear of the Zionist."

As a head of state, President Donald Trump is in a protected category, so threats against Trump -- even if they're hollow -- aren't allowed. Yet instructions on how to snap someone's neck aren't considered a credible threat, ostensibly because they're only hypothetical misogyny and abuse.

Credible Violence (Aspirational/Conditional Statements):

ALLOWED: "Little girl needs to keep to herself before daddy breaks her face"

ALLOWED: "You assholes better pray to God that I keep my mind intact because if I lose I will literally kill HUNDREDS of you."

ALLOWED: "Unless you stop bitching I'll have to cut your tongue out"

The leaked documents show certain "aspirational" or "conditional" statements that are permitted, because they're also considered to be generic or not credible, even if they're alarming.

Revenge Porn, Current Policy:

High-level: Revenge porn is sharing nude/near-nude photos of someone publicly or with people they didn't want to see them, in order to shame or embarrass them.

Abuse Standards: Sharing imagery as "revenge porn" if it fulfills all three conditions:

- Image produced in a private setting. AND

- Person in image is nude, near nude, or sexually active. AND

- Lack of consent confirmed by: Vengeful context (e.g. caption, comments, or page title); OR independent sources (e.g. media coverage, or LE record)

This partial excerpt comes from a slide on Facebook's revenge porn policy. Moderators told the Guardian that policies on sexual content are among the most complex and confusing.

The same goes for artwork. According to the documents, all "handmade" art showing nudity and sexual activity is allowed, but digitally made art showing sexual activity is not. A separate document showed that videos of abortion are allowed, so long as there is no nudity.

Graphic Violence (Animal Abuse):

- Generally, imagery of animal abuse can be shared on the site.

- Some extremely disturbing imagery may be "marked as disturbing."

- Sadism and celebration restrictions apply to all imagery of animal abuse.

According to another slide, Facebook's policies on animal abuse allow certain photos and videos for "awareness," although moderators can mark extremely disturbing content as "disturbing" to warn users before they take a look.

"Generally, imagery of animal abuse can be shared on the site," one slide says. "Some extremely disturbing imagery may be marked as disturbing."

In case Facebook forgot to include it in its long list of rules, here's an easy one: Be nice to people, and stop being horrible.



Topics: Facebook, Social Media
