Mastodon has gained popularity over the past year as Twitter users looked for alternatives following Elon Musk's takeover. Part of its appeal is its decentralized structure, which insulates it against the whims of billionaires who speak before they think. Unsurprisingly, though, the same quality that makes it so appealing has also proven to be a headache, making content moderation all but impossible.

A study from Stanford's Internet Observatory found 112 matches of known child sexual abuse material (CSAM) over a two-day period, along with almost 2,000 posts using common hashtags associated with abusive material. Researcher David Thiel says, "We got more PhotoDNA hits in a two-day period than we've probably had in the entire history of our organization of doing any kind of social media analysis, and it's not even close." We've reached out to Mastodon for comment and will update this story once we've heard back.