Facebook is facing hard questions following news that data-mining firm Cambridge Analytica deceived users and harvested information from 50 million profiles. But we already have the answer to one: Has Facebook learned from its mistakes? Nope.
That's probably because Facebook doesn't have a dedicated staffer — or public editor — looking out for the people who use the company's service. Consider this: Cambridge Analytica's data-scraping reportedly occurred in 2014, the same year Facebook made another fateful decision that it might not have made if someone there had been looking out for people like you and me.
After touting an “Anonymous Login” feature at its 2014 F8 developers conference, Facebook quietly killed it. That tool, hyped as part of the company’s desire to put “people first,” would have let users log into other sites with Facebook credentials but without sharing their data with third-party developers. The downside for developers is as clear as the upside for users. Like Facebook itself, developers have an interest in accessing as much data as possible, while users quite rightly value their privacy and want to have control over their information.
As reported by Recode, it was developers' lack of interest in Anonymous Login that led Facebook to kill the feature.
But Facebook never bothered to ask the users themselves. By treating third-party developers as the "customers" for a product that didn’t serve them, Facebook gave too much weight to the wrong constituency. Users, for whom Anonymous Login would have been a boon, were overlooked and ignored.
Users built Facebook and continue to make it valuable, but Facebook consistently takes them for granted and ignores their interests. In the Anonymous Login decision, the groups with clear representation in the process – Facebook itself and the developers it works with – came to a consensus that worked for them. Users, the third integral leg of that stool, were left out in the cold. And this continued disregard for users happened after the Cambridge Analytica crisis.
Tempting as it may seem, deleting your account won't necessarily get you out of Facebook's clutches. The company has so much information on so many millions of people that it's better to use its vaunted network effect against it and demand inclusion in the process. There should be an advocate for users with direct access to the highest levels of Facebook: a public editor.
Calls for a public editor became more vocal after the removal of videos showing the shooting of Philando Castile and a standoff with a SWAT team in Baltimore. The company didn't have anyone in an ombudsman or "public editor" role in August 2016, when scrutiny of the social network really heated up thanks to the U.S. presidential election.
We reached out to Facebook to ask why there isn't a senior person whose job is to advocate for the best interests of Facebook users as a distinct cohort. A company spokesperson said in an email that "people should decide what sort of information they are comfortable sharing with an app," adding, "In 2014, after hearing feedback from the Facebook community, we made an update to ensure that each person decides what information they want to share about themselves, including their friend list."
The role of public editor originated decades ago in the journalism business – you know, the one that Facebook continues to pretend it isn’t in – as a way to hold media companies accountable to their readers. Public editors were given the authority to demand answers from decision makers about why certain stories were told and how choices were made. In the wake of high-profile plagiarism scandals, these roles went some way toward restoring reader faith in journalism institutions, faith those companies depended on for their business models to function.
Facebook should learn from that crisis of confidence.
Andrew “Boz” Bosworth, the company’s vice president of augmented and virtual reality, called the Cambridge Analytica fiasco a “breach of trust” and said that “if people aren't having a positive experience connecting with businesses and apps then it all breaks down.” The way to prevent that breakdown is to make sure that the interests of everyday users are continuously, actively solicited and included in every high-level strategy meeting.
There needs to be one person, invested with clout, operating outside the normal chain of command, to advocate for users and their experience and interests. She or he should have a public email address and give regular public updates on internal decision making. Senior Facebook officials, including Mark Zuckerberg and Sheryl Sandberg, should make themselves available to answer this person’s questions. Crucially, there should be no murkiness as to where this person’s loyalties lie: with users, not with Facebook the company.
In light of the news that Alex Stamos, Facebook’s head of security and an internal advocate for more disclosure around Russian interference on the platform, is leaving in August 2018, this role carries even more urgency. According to the New York Times, Stamos’s perspective was overruled by the company’s legal and business sides, which argued for prolonged silence.
“There was a perception of the news media as arrogant, paternalistic, and unresponsive. The news organizations that created positions of public editors and ombudsmen really reaped the benefit,” Esther Enkin, president of the Organization of News Ombudsmen and ombudsman for Canadian Broadcasting Corporation News, told Inverse in 2016.
Arrogant, paternalistic, and unresponsive — words that easily apply to Facebook.
Users, who make Facebook the valuable company that it is, deserve a voice and a seat at the table.