Facebook’s day was consumed with the fallout from Wednesday’s New York Times story about its slow response to Russian interference, which generated a furor greater than anything the company has seen since the Cambridge Analytica data privacy scandal. The company answered its critics, put Mark Zuckerberg on the phone with reporters for an extended question-and-answer session, and moved to shift the conversation to some important moves it is making around content moderation. It’s not yet clear whether the moves will cause the outrage to subside — or whether, as happened with Cambridge Analytica, it will metastasize over the coming days and weeks.
Let’s take a look at the day’s most important developments, in chronological order.
First, Facebook responded to the Times in a point-by-point rebuttal. You can read the blog post here. The company’s main objection to the Times piece is the suggestion that it sought to downplay or cover up Russian interference on the platform before the election. Facebook also says no one discouraged its chief security officer, Alex Stamos, from investigating the Russia problem. (It did not dispute the story’s assertion that Sheryl Sandberg, the chief operating officer, had criticized Stamos for going somewhat rogue with his investigation and possibly leaving the company exposed legally.) The board had Zuckerberg’s back, issuing a statement touting the company’s progress in fighting misinformation.
Second, Facebook held a press call to discuss its second community guidelines enforcement report. The report, which is new as of this year, measures the amount of content policing that Facebook does across its network. It now plans to release such a report quarterly; you can read the new one here; or read Adi Robertson’s helpful gloss here. Big takeaways: governments continue to ask Facebook to take down more and more information; Facebook is reporting levels of bullying and harassment and child exploitation for the first time; and the company deleted 1.5 billion fake accounts in the past six months.
Third, Mark Zuckerberg posted a 4,500-word “blueprint” on the future of content moderation on Facebook. You can read that post here. The post had at least two highly consequential announcements. One, Facebook will once again move to reduce sensationalist content from the News Feed. What struck me was the language Zuckerberg used to discuss this issue — it’s different than anything he has said before. And it goes to the heart of social networks’ role in creating a polarized, destabilized electorate:
One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services.
Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content.
Zuckerberg says Facebook will “train AI systems to detect borderline content so we can distribute that content less.” It remains to be seen how effective AI will be at that task, or what tradeoffs are involved. But it may be among the most important things Facebook does in the next year.
The other major announcement: an independent oversight body to review appeals for content removals. Zuckerberg first discussed the idea of a “Facebook Supreme Court” with Ezra Klein in April; I wrote about why such a body was necessary in this space in August, during the Alex Jones imbroglio. I asked Zuckerberg today whether he thought the body should publish its opinions, creating a kind of case law; he told me that he did. The body won’t be up and running until the end of 2019 at the earliest, but when it arrives we can expect a growing body of social network jurisprudence, and it’s going to be fascinating to watch.
Fourth, Zuckerberg answered questions about the most damning elements of the Times’ report. Reporters focused on the company’s decision to hire Definers Public Affairs, a Washington, DC-based public relations and opposition research firm. Facebook fired the firm Wednesday night. Zuckerberg said that neither he nor Sandberg knew that Definers even worked for them. This strained credulity, as Facebook’s own blog post noted that “our relationship with Definers was well known by the media — not least because they have on several occasions sent out invitations to hundreds of journalists about important press calls on our behalf.” (I had an item about Definers here in February.)
The Definers issue went nuclear for two reasons. One, the company circulated a document that attempted to link criticism of Facebook — wrongly, it turns out! — to George Soros. Linking events to Soros, a liberal philanthropist who escaped the Holocaust, is a well-worn tactic of anti-Semites. And Zuckerberg and Sandberg, of course, are Jewish.
Two, Definers employs what one former employee told NBC News was “an in-house fake news shop” to push messages into the broader media ecosystem. Michael Cappetta, Ben Collins and Jo Ling Kent report:
Definers runs a website called NTK Network, which has a verified page on Facebook with more than 120,000 followers that publishes and promotes articles about the firm’s clients as well as their competitors.
A former employee of Definers, who asked not to be identified in order to protect professional relationships, told NBC News that NTK Network was “our in-house fake news shop.” Some clients would actively pay for NTK Network’s positive coverage, which the ex-employee said would then be pushed out through Facebook in the hopes of being picked up by larger conservative media outlets such as Breitbart.
And indeed, the Times found that NTK pushed dozens of pro-Facebook messages and attacks on Facebook’s competitors during Definers’ time working for the company, some of which were picked up by Breitbart. For a company that has spoken loudly and often over the past year about its commitment to reducing the spread of misinformation, the fact that it had hired a crisis communications agency to actively spread misinformation was hypocrisy of the rankest sort. Definers had to go.
Zuckerberg suggested this was some sort of rogue operation:
“We certainly never asked them to spread anything that is not true. That’s not how we want to operate. In general, I think a lot of DC-type firms might do this kind of work. I understand why other companies might want to work with them, but that’s not the way I want to run this company.”
It’s a line that would have been more credible had Facebook not run the same play in the past. In 2011, the company hired Burson-Marsteller to pitch scaremongering stories about Google’s privacy practices. (Microsoft had hired the firm to do the same thing.) Incredibly, Facebook got away with a “no comment” at the time.
The common thread in both episodes, beyond Facebook’s CEO and COO, is the company’s now-former head of communications, who would have been responsible for both: Elliot Schrage. Schrage stepped down in June. The next time he sits down with a reporter, I hope he’ll be asked about how he views the role of companies like Definers and Burson-Marsteller in promoting a company’s interests.
Fifth and finally, everyone is mad. George Soros called for an investigation. Sen. Richard Blumenthal is mad. Sen. Mark Warner is mad. Sen. Ben Sasse is mad. Sen. Ron Wyden is mad. The comptroller of New York is mad. Sen. Amy Klobuchar, who the Times story suggested had eased up on her criticisms after being personally lobbied by Sandberg, said she planned to ask the Justice Department to investigate potential violations of campaign finance laws.
In Silicon Valley, Kurt Wagner wonders who is going to be fired over this. (Zuckerberg was asked this question several times on the call, and demurred, other than to say it won’t be Sandberg.) Berkeley students say they won’t even consider working for Facebook. Alex Stamos is mad at the mass media for failing to examine its own role in the Russia story.
But I’ll end where I started: this particular Facebook scandal has gotten the attention of regular people. It’s the sort of scandal that has led friends from high school and college to text me asking what’s going on. Three of them in recent days have either deleted or deactivated their Facebook accounts. After two years of final straws, the events of this week have offered them another. “Facebook just filled with crazy idiots now,” The Onion said.
The headline was the whole story, in both senses of the phrase.
Anti-immigrant sentiment is spreading through social media and spilling onto the streets of Tijuana as the caravan makes its final approach to the border, Karla Zabludovsky reports:
For weeks, the growing presence of troops on the US border had worried members of the migrant caravan, which became a major rallying cry for President Donald Trump’s nationalist base and an unprecedented diplomatic and logistical challenge for the Mexican government. Migrants face a new threat: residents of Tijuana — the final stop on their 2,700-mile-long journey — who are organizing protests against the caravan and threatening them, or anyone who supports them, with violence.
Several Facebook and WhatsApp groups advocating for the caravan’s deportation have sprung up in the month since the migrants set out from Honduras, underscoring escalating anti-immigrant sentiment in northern Mexico. The violent language used against Central Americans in these groups echoes that used by Trump supporters in the US, referring to the caravan as an “invasion” and issuing a call to arms in defense of borders.
The BBC has a lengthy investigation into Facebook’s role in Nigeria. Nigerian police tell the outlet that false information and incendiary images on Facebook contributed to more than a dozen recent killings in Plateau State, which has recently seen a spike in ethnic violence.
The truth didn’t matter. The images landed in the Facebook feeds of young Berom men in the city of Jos, hours to the north of the rural district where the massacre was happening. Some of the Facebook posts suggested that the killings were happening right there in Jos, or that the inhabitants of the city were about to be attacked. Few stopped to question the claims, or to check the origin of the graphic pictures that were spreading from phone to phone.
“As soon as we saw those images, we wanted to just strangle any Fulani man standing next to us,” one Berom youth leader told the BBC. “Who would not, if they saw their brother being killed?”
Before today’s announcement of an independent advisory body, the Electronic Frontier Foundation, Human Rights Watch, and more than 70 other groups asked Mark Zuckerberg to adopt a clearer “due process” system for content takedowns. Adi Robertson reports:
“While Facebook is under enormous — and still mounting — pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform,” said the EFF in a press release. Other signatories include Article 19, the Center for Democracy and Technology, Ranking Digital Rights, PEN America and Canada, and the American Civil Liberties Union.
Ben Casselman examines the subsidies that have been promised to Amazon’s new regional office locations, noting that other nearby locations promised much more:
Indeed, in selecting New York and Virginia for its new locations, Amazon turned down seemingly richer offers just next door. Maryland and New Jersey each offered multibillion-dollar incentive packages that dwarfed the ones Amazon accepted.
“An additional $7.5 billion in subsidies wasn’t enough to get Amazon to move across the river,” said Michael Farren, an economist at the Mercatus Center, a libertarian think tank, referring to the difference between Maryland’s offer of $8.5 billion and Virginia’s of less than $1 billion. “That just says that subsidies were never what mattered in the first place.”
Chechen leader Ramzan Kadyrov used his brief time back on his favorite app to post a tribute to his gun, Hayes Brown reports. He was previously banned from the app due to US sanctions. Kadyrov’s post read in part:
PISTOL. How much I have to tell you, my friend. As if in this silence only you and me. How many difficult years have we lived with you? How many accurate lead ‘words’ you said to enemies and villains defending my honor, dignity, and life. You became my brother. Devoted and silent. Faithful and selfless.
There are currently too many ongoing crises at Facebook for General Counsel Colin Stretch to quit as planned, Kurt Wagner reports. He’s now planning to stay until 2019.
TikTok is having a moment in the United States; celebrities including Jimmy Fallon and Tony Hawk have recently joined, Julia Alexander reports.
Renee DiResta looks at the fascinating ways that online communities resemble cults:
The idea that “more speech” will counter these ideas fundamentally misunderstands the dynamic of these online spaces: Everyone else in the group is also part of the true believer community. Information does not need to travel very far to reach every member of the group. What’s shared conforms to the alignment of all of the members, which reinforces the group’s worldview. Inside Cult 2.0, dissent is likely to be met with hostility, doxing, and harassment. There is no counterspeech. There is no one in there who’s going to report radicalization to the Trust and Safety mods.
Fox News hasn’t tweeted for a week in protest of Twitter’s failure to swiftly take down tweets that posted Tucker Carlson’s address. This is a very interesting experiment and here’s hoping Fox News continues it for a very long period of time!
Facebook has filed a patent that would make it easier to target whole families with ads by analyzing the photos they post, Adi Robertson reports, using machine learning and cross-referencing it with other device and post data. Not every patent turns into a product, but the timing still doesn’t feel great.
Tap the Twitter search button and you’ll now see a variety of sections that you can browse.
Three months after it was said to be “rolling out,” the Instagram dashboard has now appeared. I’ve been averaging about 7 minutes a day.
Ahead of the launch of its standalone shopping app, Instagram is adding more features designed to turn the app into a catalog.
Here is the extremely rare day that the Wall Street Journal and New York Times op-ed pages share the exact same opinion. The Journal, like the National Review yesterday, finds itself in the even more unusual position of agreeing with Congresswoman-elect Alexandria Ocasio-Cortez:
We rarely agree with socialist Congresswoman-elect Alexandria Ocasio-Cortez, but she’s right to call billions of dollars in taxpayer subsidies for Amazon “extremely concerning.” These handouts to one of the richest companies in the history of the world, with an essentially zero cost of capital, is crony capitalism at its worst.
And here’s the Times:
It’s distressing that a mayor and governor who can’t come together for the sake of the subways or public housing somehow managed to find common ground by doing an end run around the City Council and steamrollering the land-use process.
We won’t know for 10 years whether the promised 25,000 jobs will materialize. We do know that for decades states and cities have paid ransoms in the tens of billions of dollars to attract or “keep” jobs only to find themselves at the losing end of the proposition when companies moved on after the taxpayer freebies ended.
And finally …
My most sincere condolences to BuzzFeed reporter Ryan Mac, who had hoped to ask Zuckerberg about this (incredibly timed) Kanye tweet from Wednesday evening. Zuckerberg extended the questioning period twice on Thursday, to his great credit, but sadly Mac was never called on.
Ain’t nothin’ but a heartache.
Talk to me
Send me tips, comments, questions, and your nominations to Facebook’s independent oversight body: email@example.com.