Photo: AFP PHOTO/JOSH EDELSON

Facebook is hiring ‘news credibility specialists’ — after saying it didn’t want to be in the business of judging news trustworthiness

The company plans to have the specialists create a list of credible news organisations that it could make use of for various features on its site

By Troy Wolverton

Facebook is getting back into the business of having employees make decisions about news organisations and stories — including about their credibility — something company leaders, including Mark Zuckerberg, have previously disavowed.

The company has open job listings for two “news credibility specialists,” one of which would be required to be able to speak, read, and write fluently in Spanish. The jobs, which Facebook listed within the last few weeks, would be contract positions, rather than full-time, but would be based in Menlo Park, California, site of Facebook’s headquarters.

“We’re seeking individuals with a passion for journalism, who believe in Facebook’s mission of making the world more connected,” one of the two listings reads. It continues: “As a member of the team, you’ll be tasked with developing a deep expertise in Facebook’s News Credibility Programme. You’ll be conducting investigations against predefined policies.”

Facebook would ask the specialists to help create a list of credible news organisations. That list could be used for various features on the site, from the newsfeed to its advertising system.

The move comes as Facebook tries to battle the spread of fake news and propaganda on its social network, and to shore up its damaged reputation. But in taking a bigger role appraising news organisations, Facebook risks opening itself up to charges of bias and censorship as it tries to make sense of an increasingly fragmented media landscape.

One example of how the credibility specialists might come into play involves Facebook’s plans to ensure transparency with political ads. Facebook has said it will require political ads to disclose who paid for them, restricting ads that don’t comply.

Since news organisations often pay to promote political articles on Facebook, the credibility specialists would help Facebook distinguish those articles from standard political ads, to ensure that the news articles don’t get caught in the net.

“We’re working to effectively identify and differentiate news and news sources across our platform,” company spokesman Adam Isserlis said.

Facebook had said it didn’t want to judge news trustworthiness

At least on its face, the fact that the social networking giant is hiring “news credibility specialists” appears to contradict a policy it announced earlier this year. Facebook officials said they didn’t want to be in the business of determining the credibility of news reports and outlets. Instead, the company said it would survey its users to figure out which sites and sources were trustworthy. It has also said it would rely on third-party fact-checking organisations, including the Associated Press, to judge the credibility of articles.

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” company CEO Mark Zuckerberg said in a post on his personal Facebook page in January. “We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

He continued: “We decided that having the community determine which sources are broadly trusted would be most objective.”

The new news credibility specialists will supplement, rather than replace, the third-party fact checkers and Facebook’s user surveys on the trustworthiness of sites. But the lines between what the specialists will do and what will be left to the user surveys could get blurry.

It makes sense for Facebook to have its own employees making decisions about the credibility of news organisations, said Daniel Kreiss, a professor of political communications at the University of North Carolina at Chapel Hill. The fact is that Facebook makes editorial decisions all the time about what it will and won’t allow on its site, and in keeping with the company’s mission, it needs to have a role in determining what content on its site is legitimate, he said.

But there are plenty of questions to be raised about this effort, Kreiss said. It’s unclear, for example, what kinds of qualifications Facebook will be looking for when trying to fill these roles; a journalist with 30 years of experience may have different ideas about what is a credible news organisation than someone just out of college, he said.

It’s also unclear what criteria Facebook will use to determine which entities are news organisations and which are credible, he said. It’s anyone’s guess whether Breitbart would qualify, or whether a news organisation run as a wing of a political organisation would.

“They’re always playing this editorial role,” Kreiss said. “My concern throughout has been that they haven’t acknowledged they’re playing this role, they aren’t transparent about how they’re making those decisions, and because of that, they aren’t accountable for the decisions they do make.”

Given Facebook’s importance in distributing news and its role in the democratic process, when the company decides an entity is not a credible news organisation, it will be crucial that the social networking giant is transparent about its rationale and allows the entity to appeal the decision, Kreiss said.

“There should be a way to ... hold Facebook accountable for the decisions they make,” he said.

The company is sending “mixed signals”

Facebook has a responsibility to be transparent about how it plans to evaluate news organisations, said Morgan Weiland, an attorney and PhD candidate at Stanford whose research focuses on how the big tech platform companies are handling their role in distributing news.

“If they’re going to build out a team like this, they need to be more explicit about how they understand their role or what kind of company they see themselves as,” Weiland said.

She continued: “They’re giving us a lot of mixed signals.”

The process of determining the credibility and newsworthiness of articles and publications has long been a fraught issue for Facebook.

Two years ago, the company drew criticism after its employees reportedly downplayed stories about right-wing politicians and conservative topics in its trending news section. In response, the company fired its human editors, relying instead on algorithms to determine trending topics. But the automated editing system ended up promoting fake and offensive stories. Earlier this month, Facebook announced it would discontinue the trending news feature.

But the company’s problems in figuring out the credibility of news go well beyond its trending news controversy.

Facebook has been trying to battle fake news

During the presidential election in 2016, Facebook’s systems were hijacked by Russian-linked groups to spread false stories and other propaganda in an alleged attempt to influence the vote, a development the company was slow to acknowledge. Facebook has also been used by ultra-nationalist groups in Myanmar to spread propaganda and hate speech that has helped incite violence against the country’s Rohingya minority, according to the United Nations.

In an attempt to curtail the spread of fake news, Facebook early this year said it would de-emphasise posts from news outlets and other organisations in its news feed. Then, it said it would poll users to figure out which sites to trust. The algorithms underlying its news feed would use those determinations about trustworthiness when deciding which articles to highlight or to suppress.
