Last week, Meta apologized after blocking links posted by a nonprofit newspaper and an independent journalist who published a report that criticized Facebook and accused it of suppressing posts related to climate change. Meta denied that it was censoring content and blamed an unspecified “security issue.”
Every single link — about 6,000 stories — that the Kansas Reflector had ever posted to Facebook disappeared from the platform on Thursday. For seven hours, anyone trying to post a Reflector link was met with a warning that the site posed a security risk.
That’s seven hours during which the Reflector staff had no idea why Meta — a tech behemoth no major publisher can ignore, given its grip on the world’s most popular social sites — had not only blown up years’ worth of digital labor but also dinged the credibility of the local paper, whose audience was told, erroneously, that its links contained potential malware.
By the end of the day, almost all of the Reflector’s links had come back online, save for one: an opinion piece that criticized Facebook’s policies around paid promotions.
To test the theory that the Reflector’s domain had some kind of security issue, a Brooklyn-based journalist, Marisa Kabas, asked for permission to republish the text of that column on her own website.
Yet when Kabas posted her own link to the column on Threads, Meta flagged it as malicious content and took it down anyway. Then Meta nuked everything her website had ever published on its platforms, a block that lasted at least two hours, Kabas told CNN.
Meta didn’t respond to CNN’s request for more information about the security issue. The editor-in-chief of the Kansas Reflector, Sherman Smith, wrote on Friday that Facebook spokesperson Andy Stone “wouldn’t elaborate on how the mistake happened and said there would be no further explanation.”
“What was the security error? We don’t know,” Kabas wrote in her recounting of the situation. “What caused the links to be blocked? We don’t know.” And while all of the links have been restored, “our trust has been undermined at a time when people need little reason to distrust the news.”
The Meta-verse
The debacle helps illustrate one of the more pernicious issues of our Extremely Online Era that’s often glossed over in a morass of boring regulatory jargon: the concentration of power in social media.
One of the ironies of the situation is that the first public statement from Meta came Thursday evening on X, the site formerly known as Twitter. Naturally, Kabas and the Reflector staff had to move their complaints to one of the few public platforms not operated by Meta, which left mostly X and its smaller rival Bluesky.
“Anyone involved this past week now understands that putting our civic conversation into the hands of a single for-profit business generates profound risks for society as a whole,” wrote Clay Wirestone, the Reflector’s opinion editor.
Of course, Meta is regularly accused of censoring content by people across the political spectrum — people who often misunderstand that Meta is a business and not the Free Speech Police. The difference here is that Meta acknowledged it made a mistake and ultimately fixed it, albeit in a frustratingly opaque way that left content creators with a lot of questions.
Meta controls a huge share of the social media ecosystem, and that means there aren’t many competitors to keep it honest. Meanwhile, much of the media relies on it, because content creators have to get in front of readers if they want to survive.
With nearly 4 billion monthly active users on its platforms — Facebook alone accounts for 3 billion — it’s not hard to see why some folks want to break Meta up, or at the very least create stronger regulations to keep it from elbowing everyone else out of the market.
Of course, others argue that breaking up Meta wouldn’t necessarily solve the biggest problems with social media — namely that it perpetuates misinformation at an unprecedented speed and scope, wrecks teens’ mental health and creates toxic echo chambers that undermine the promise of democracy. And if you bust up Meta, there may be no stopping another tech giant from filling the void.
We don’t know what happened within Meta to trigger an erroneous block of legitimate news sources last week. But what we do know is that the company’s control over what we see online can have profound effects on the real world. When Meta decides to dramatically reduce referral traffic to media outlets, as it did last year, there’s little anyone outside of Meta can do to push back.