
Facebook has been cracking down on fake news, but it's been missing one very important place of conversation.

Messenger, the standalone app for Facebook's private messaging feature, is not being included in the company's fact-checking program.

SEE ALSO: Facebook Messenger just made sharing photos way better

In March, Facebook recruited a team of third-party fact-checkers to flag fake news articles that were being shared on its network. It was part of an increased effort to limit the reach of misinformation on the site in the wake of Russian interference in the 2016 presidential election.

Fake news articles now receive a disputed tag when they are posted to the Facebook News Feed. But that's not the case if a user sends the same article to an individual or group through Messenger.
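To make that gap concrete, here is a minimal, hypothetical sketch (in Python, not Facebook's actual implementation) of how a disputed-link check could work: a shared URL is normalized and looked up in a set of URLs flagged by third-party fact-checkers, but the check only runs for News Feed posts. The function names, URL set, and "surface" parameter are illustrative assumptions.

```python
# Hypothetical illustration only -- not Facebook's real fact-checking pipeline.
# It shows the basic idea: flag a shared link if fact-checkers have disputed it,
# but only for News Feed posts, which is the gap described above.

from urllib.parse import urlparse

# URLs that third-party fact-checkers (e.g., Politifact) have marked as disputed.
DISPUTED_URLS = {
    "example-fake-news.com/shocking-story",
}

def normalize(url: str) -> str:
    """Strip the scheme and 'www.' so minor URL variations still match."""
    parsed = urlparse(url)
    host = parsed.netloc.lower().removeprefix("www.")
    return host + parsed.path.rstrip("/")

def label_for_share(url: str, surface: str) -> str | None:
    """Return a 'disputed' label for News Feed posts; Messenger is never checked."""
    if surface != "news_feed":          # Messenger shares skip the check entirely
        return None
    if normalize(url) in DISPUTED_URLS:
        return "Disputed by third-party fact-checkers"
    return None

print(label_for_share("https://www.example-fake-news.com/shocking-story", "news_feed"))   # flagged
print(label_for_share("https://www.example-fake-news.com/shocking-story", "messenger"))   # None
```

In this sketch the same link produces a warning in one surface and nothing in the other, which is exactly the inconsistency described below.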


However, one Facebook user claims they recently experienced the opposite. After sharing a Breitbart story with someone via Messenger, the person says they received a notification that read, "A link you shared contains info disputed by Politifact," meaning they were sharing a fake news story.

Image credit: screenshot

And yet, the individual told Mashable they had not posted or even tried to share the article to their News Feed. (The individual requested anonymity due to the polarizing nature of the story.)


Mashable reached out to Facebook for comment, and we were told by a company spokesperson that the person must have experienced a bug. If true, this is a case where a bug has revealed an inherent flaw in Facebook's fact-checking system.

If Facebook were actually committed to curbing fake news on its platform, it would address the problem in all areas of the site, not only the Facebook News Feed. According to the company, more than 1.3 billion people use Messenger every month, and we know at least some fake news articles have been shared on it.

One reason the company may not scan private Messenger conversations for fake news could be that it doesn't want to appear to be "creepy." For example, Facebook drew criticism when it initially announced it would use WhatsApp data to inform the company's expansive ad network (even though it wasn't pulling information from private messages). Facebook has repeatedly insisted it does not scan private conversations for advertising.

For the most part, the private areas of Facebook's products, which include services like Messenger and WhatsApp, remain untouched. An exception is that Facebook uses automated tools like PhotoDNA to scan for child exploitation images shared within Messenger. But there is no system currently used to detect fake news within Messenger or WhatsApp.
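As a rough illustration of the match-against-known-hashes pattern that tools like PhotoDNA rely on, the sketch below compares an image fingerprint against a database of known hashes. PhotoDNA itself is proprietary and uses a robust perceptual hash that tolerates resizing and recompression; the plain SHA-256 and placeholder database here are assumptions used only to show the shape of the check.

```python
# Illustrative sketch only: hash-matching in the spirit of PhotoDNA-style scanning.
# PhotoDNA uses a robust perceptual hash; a plain SHA-256 is used here just to
# demonstrate the match-against-known-hashes pattern.

import hashlib

# Placeholder database of hashes of known prohibited images.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Fingerprint the image bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Flag the image if its fingerprint matches a known entry."""
    return image_hash(image_bytes) in KNOWN_BAD_HASHES

print(should_block(b"test"))   # True: sha256(b"test") is in the placeholder set
print(should_block(b"other"))  # False: no match
```

Nothing comparable exists for links sent through Messenger or WhatsApp, which is the point the article goes on to make.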

One of Facebook's fact-checking partners, Poynter, recently explored how fighting fake news on WhatsApp remained difficult due to the closed nature of the network.

“WhatsApp was designed to keep people’s information secure and private, so no one is able to access the contents of people’s messages,” said WhatsApp's policy communications lead Carl Woog in an email to Poynter. “We recognize that there is a false news challenge, and we’re thinking through ways we can continue to keep WhatsApp safe."

Of course, that could change in the future.

A Facebook spokesperson told Mashable that the company is working on new and more effective ways to fight false news stories on all of its apps and services. But until then, it appears that Facebook users will have to do their own fact-checking on Messenger.



Topics: Facebook, Social Media
