“Things happened on our platform in this election that should not have happened,” Facebook’s
Sheryl Sandberg acknowledged Thursday morning. “We’ve had some problems.” In recent days, the Facebook chief operating officer had embarked on a Capitol Hill charm offensive, which included a meeting with the Congressional Black Caucus and a face-to-face to smooth the ruffled feathers of the House Intelligence Committee. Now Sandberg,
Mark Zuckerberg’s No. 2 at the multibillion-dollar social-media behemoth, was on stage at D.C.’s Newseum for an interview with Axios’s
Mike Allen, where she was expected to provide some modicum of transparency around how and to what extent the company’s platform was used to sway the 2016 election. What Sandberg did instead, in spite of Allen’s pointed questions, was adhere to her talking points.
Asked three times, Sandberg refused to say whether Russia and
Donald Trump’s 2016 election campaign were targeting the same users, instead pivoting to discuss ad targeting on Facebook in general. “Targeting is broad. It’s used by everyone,” she said. “And I think targeting is worth talking about. Now, we’ve had some problems. We’ve had these write-in fields that never should have existed . . . There’s always the possibility for abuse.” Asked twice more about a hypothetical link, Sandberg (sort of) relented. “When the ads get released we will also be releasing the targeting for those ads,” she said. “We’re going to be fully transparent.”
She declined to specify when Facebook started investigating the thousands of ads purchased on its site by Russian operatives, saying that the company “heard rumors,” but didn’t know anything concrete until closer to the election. “If you think about 2015, 2016, the threats people were worried about were hacking into e-mail accounts,” she said. “We started to hear the rumors around the election itself of a different kind of attack,” she continued. “It wasn’t ‘I’m gonna get your e-mail, and I’m going to steal your data,’ it was more subtle—posting in an inauthentic way to try to be deceptive and divisive.”
And she was equally evasive when asked about Facebook’s so-called “filter bubbles,” which shield users from opposing viewpoints, covertly curating their feeds. Some, including
Barack Obama, have claimed that said bubbles, which encourage intellectual isolation, influenced the outcome of the election. But Sandberg brushed the question off. “It’s . . . such a common misunderstanding,” she said. “What people think incorrectly is that Facebook is what keeps you from seeing all the stuff that’s only from your friends and your point of view, so without Facebook you would see this broad range from all sides. And that’s not true. On Facebook, 23 percent of your friends have a stated ideology that’s opposite of yours.”
So Allen tried a softball: how would she define fake news? “That’s an interesting question,” she said. “There’s fake news that’s false and hoaxes, and then there’s stuff that someone thinks is wrong or is controversial but represents the other side. We know people want accurate information on Facebook, and it’s particularly important during elections, but it’s important all the time.” Asked again to define the phenomenon, she appeared frustrated. “I did define it,” she said. “Things that are false, hoaxes. But false news or fake news or misinformation is also applied to things people think is wrong. But what we’re going after is the false, the fake news, the hoaxes.”
What Facebook would not do, she said in an attempt to position the company as a benevolent defender of free speech, was restrict any type of expression. For instance, she told Allen that the site would have run
Marsha Blackburn’s Senate campaign video advertisement, which Twitter recently removed from its site for containing “an inflammatory statement.” In the video, Blackburn, a pro-life congresswoman who chaired a Republican-run panel that investigated Planned Parenthood, makes a misleading claim about the sale of fetal tissue. “The question is should divisive, political, or issue ads run . . . our answer is yes, because when you cut off speech for one person, you cut off speech for other people,” Sandberg said.
Sandberg has plenty of experience managing crises for Facebook; she consistently serves as the company’s public, adult face while Zuckerberg makes decisions behind the scenes. But commenters on the interview’s Facebook livestream seemed unimpressed, a bad sign for a platform attempting to soothe its 2 billion-odd users in the wake of what may be its biggest public-relations failure to date. As for the Capitol Hill contingent, it’s too soon to tell whether lawmakers will be satisfied with Facebook’s subservient cooperation or whether their disenchantment with the former Silicon Valley darling will manifest in restrictive new regulations.