
Twitter and Facebook have very different ideas about “fake news.” One of them is terribly wrong.


On Friday, Twitter published an update on its review of election-related activity. Here’s the short version: Over the course of its investigation, the company identified 1,062 additional accounts associated with the notorious Russian troll farm known as the Internet Research Agency, bringing the total number of IRA-linked accounts up to 3,814. During the ten weeks leading up to the 2016 election, these accounts posted 175,993 tweets, roughly 8.4 percent of which were election-related. Twitter also identified 13,512 additional Russia-based bot accounts, for a total of 50,258. In all (again, according to Twitter), 677,775 users in the United States interacted with one of these accounts during election season.



As juicy and quotable as these figures are, they’re not exactly news. We’ve known there was a pretty significant amount of malicious Russia-backed activity on Twitter for a while now, and while attaching concrete numbers certainly makes for a better sound bite, it doesn’t change the reality of the situation: Fake news was systematically spread on social media in an attempt to influence the 2016 election — and it did just that.

What’s interesting about Twitter’s announcement isn’t the numbers or percentages or tallies, but rather what it says about how the company views itself. Twitter could easily have followed in Facebook’s footsteps and published only scant data on the role its platform played in the 2016 election. The company could have chosen to bury the most disturbingly concrete details deep in some congressional testimony, or in an obscure corner of some awfully designed website. But it didn’t.
After months of statements from Facebook so awkwardly worded they seem like they were written by the world’s laziest neural network, “Update on Twitter’s Review of the 2016 U.S. Election” reads like poetry. It’s human and surprisingly apologetic. Even when delivering sobering statistics (like the fact that the 50,258 identified Russian bots were tweeting election-related content), Twitter’s researchers don’t fall back on weak qualifiers and legalese-padded excuses. They admit they fucked up, plain and simple.

What’s more, they acknowledge that fixing this mess isn’t something they can do alone. And this isn’t just spin. The statement goes on to detail their ongoing partnership with journalistic NGOs and a number of nonprofit organizations dedicated to improving media literacy. They realistically describe the issue they’re facing as something that’ll never be completely solvable, but publicly commit themselves to continuing to try.

Compare this to Facebook’s approach. In a post numbingly titled “Helping Ensure News on Facebook Is From Trusted Sources,” the company begrudgingly acknowledges the existence of a fake news problem, then immediately washes its hands of it by deflecting responsibility for the quality of information on its network back to the users.

“The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division,” writes Zuckerberg. “We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem...We decided that having the community determine which sources are broadly trusted would be most objective.”
Translation: We understand that there’s an issue here — and we totally could ask someone else for help — but we’re going to use a faux-democratic process instead, because, c’mon, when has the “wisdom of the crowd” ever been a problem for us? Oh wait.

Facebook doesn’t want help from partners and experts, because asking for assistance requires you to admit that you alone are not enough. That you’re not the smartest person in the room. That you got something seriously wrong. And that’s obviously not going to happen for a company that’s become an increasingly scary monopoly.

It’s the same pigheadedness that made Zuck turn a blind eye to the platform’s obvious issues during (and immediately after) the election. He’s so sure of himself, he can’t even pretend to see the problem. So they’re stuck doubling down on their bullshit — shifting the blame to anyone and everyone else.