Musk’s ‘Priority #1’ Disaster: CSAM Problem Worsens While ExTwitter Stiffs Detection Provider


from the not-such-a-priority-apparently dept

One of Elon Musk’s first “promises” upon taking over Twitter was that fighting child exploitation was “priority #1.”

He falsely implied that the former management didn’t take the issue seriously (they did) and insisted that he would make sure it was a solved problem on the platform he now owned. Of course, while he was saying this, he was also firing most of the team that worked on preventing the sharing of child sexual abuse material (CSAM) on the site. Almost every expert in the field noted that it seemed clear that Elon was almost certainly making the problem worse, not better. Some early research supported this, showing that the company was now leaving up a ton of known CSAM (the easiest kind to find and block through photo-matching tools).

A few months later, Elon’s supposed commitment to stomping out CSAM was proven laughable when he apparently personally stepped in to reinstate the account of a mindless conspiracy theorist who had posted a horrific CSAM image.

A new NBC News investigation now reveals just how spectacularly Musk has failed at his self-proclaimed “priority #1.” Not only has the CSAM problem on ExTwitter exploded beyond previous levels, but the company has now been cut off by Thorn—one of the most important providers of CSAM detection—after ExTwitter simply stopped paying its bills.

At the same time, Thorn, a California-based nonprofit organization that works with tech companies to provide technology that can detect and address child sexual abuse content, told NBC News that it had terminated its contract with X.

Thorn said that X stopped paying recent invoices for its work, though it declined to provide details about its deal with the company citing legal sensitivities. X said Wednesday that it was moving toward using its own technology to address the spread of child abuse material.

Let’s pause on this corporate-speak for a moment. ExTwitter claims it’s “moving toward using its own technology” to fight CSAM. That’s a fancy way of saying they fired the experts and plan to wing it with some other—likely Grok-powered—nonsense they can cobble together.

Now, to be fair, some platforms do develop effective in-house CSAM detection tools, and while Thorn’s tools are widely used, some platforms have complained that they are limited. But these types of systems generally work best when operated by specialized third parties who can aggregate data across multiple platforms—exactly what organizations like Thorn (and Microsoft’s PhotoDNA) provide. The idea that a company currently failing to pay its bills to anti-CSAM specialists is simultaneously building superior replacement technology is, shall we say, optimistic.
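To make concrete what that third-party model actually does, here is a minimal, deliberately simplified sketch in Python of hash-based matching: each upload is hashed and checked against a shared list of hashes of previously identified CSAM. The hash value, file path, and function names are hypothetical; production tools such as PhotoDNA use perceptual hashes that survive re-encoding and resizing, and the hash lists are curated and shared by organizations like Thorn and NCMEC, which is precisely why a specialized third party adds value.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list for illustration only. Real deployments use
# vetted, shared hash sets maintained by third parties, and perceptual
# hashes that survive re-encoding; a plain SHA-256 only catches
# byte-identical copies of known files.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_sha256(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """True if the upload's hash appears in the shared hash list."""
    return file_sha256(path) in KNOWN_HASHES

if __name__ == "__main__":
    sample = Path("upload.jpg")  # hypothetical upload path
    if sample.exists():
        # In a real pipeline, a match would block the upload and file a
        # report, not just print a boolean.
        print(is_known_match(sample))
```

The point of the sketch is the lookup step: a platform running this alone only knows about hashes it has seen itself, while a third party like Thorn can pool identified material across many platforms, which is what ExTwitter is walking away from.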

The reality on the ground tells a very different story than Musk’s PR spin:

The Canadian Centre for Child Protection (C3P), an independent online CSAM watchdog group, reviewed several X accounts and hashtags flagged by NBC News that were promoting the sale of CSAM, and followed links promoted by several of the accounts. The organization said that, within minutes, it was able to identify accounts that posted images of previously identified CSAM victims who were as young as 7. It also found apparent images of CSAM in thumbnail previews populated on X and in links to Telegram channels where CSAM videos were posted. One such channel showed a video of a boy estimated to be as young as 4 being sexually assaulted. NBC News did not view or have in its possession any of the abuse material.

Lloyd Richardson, director of information technology at C3P, said the behavior being exhibited by the X users was “a bit old hat” at this point, and that X’s response “has been woefully insufficient.” “It seems to be a little bit of a game of Whac-A-Mole that goes on,” he said. “There doesn’t seem to be a particular push to really get to the root cause of the issue.”

NBC’s investigation found that Musk’s “priority #1” has become a free-for-all:

A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute.

Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that were identified in 2023 by NBC News as hosting the child exploitation advertisements are still being used for the same purpose today.

That seems bad! Read it again: hashtags that were flagged as CSAM distribution channels in 2023 are still active and being used for the same purpose today. This isn’t the kind of mistake that happens when you’re overwhelmed by scale—this is what happens when you simply don’t give a shit.

Look, I’m usually willing to defend platforms against unfair criticism about content moderation. The scale makes perfection impossible, and edge cases are genuinely hard. But this isn’t about edge cases or the occasional mistake—this is about leaving up known, previously identified CSAM distribution channels. That’s not a content moderation failure; that’s a policy failure.

As the article also notes, ExTwitter sought praise for all the work it was doing with Thorn, in an effort to show how strongly it was fighting CSAM. This post from just last year looks absolutely ridiculous now that they’ve stopped paying Thorn and the org has had to cut them off.

But the real kicker comes from Thorn itself, which essentially confirms that ExTwitter was more interested in the PR value of their partnership than actually using the technology:

Pailes Halai, Thorn’s senior manager of accounts and partnerships, who oversaw the X contract, said that some of Thorn’s software was designed to address issues like those posed by the hashtag CSAM posts, but that it wasn’t clear if they ever fully implemented it.

“They took part in the beta with us last year,” he said. “So they helped us test and refine, etc, and essentially be an early adopter of the product. They then subsequently did move on to being a full customer of the product, but it’s not very clear to us at this point how and if they used it.”

So there you have it: ExTwitter signed up for anti-CSAM tools, used the partnership for good PR, then perhaps never bothered to fully implement the system, and finally stopped paying the bills entirely.

This is what “priority #1” looks like in Elon Musk’s world: lots of performative tweets, followed by firing the experts, cutting off the specialized tools, and letting the problem explode while pretending you’re building something better. I’m sure that, like “full self-driving” and Starships that don’t explode, the tech will be fully deployed any day now.

Filed Under: child safety, csam, elon musk, prevention

Companies: thorn, twitter, x



