Leftist Social Media Platform Mastodon Has Child Abuse Problem, Say Researchers


(Photo by JOEL SAGET/AFP via Getty Images)

OAN’s Shawntel Smith-Hill
4:34 PM – Monday, July 31, 2023

Mastodon, a left-leaning social media platform, has a massive child sexual abuse material (CSAM) problem, according to a July report from Stanford researchers.


The platform has gained immense popularity in the last few months as an alternative to other free speech sites, like Twitter.

Enthusiasm for the platform first exploded back in 2017 and has seen a resurgence in recent months, particularly among left-leaning individuals who have been unhappy with Elon Musk’s recent changes to Twitter, now called “X.”

Reports from TechCrunch indicated that Mastodon has seen nearly 2.1 million new users join in the past couple of months.

Researchers at the Stanford Internet Observatory conducted a two-day test and found over 600 pieces of confirmed or suspected child abuse material across Mastodon’s network.

In a review of 325,000 posts, researchers found 2,000 examples of CSAM-related hashtags in only five minutes of searching. A vast amount of the flagged material was reportedly circulating through Japanese networks, which are known to have “significantly more lax laws” regarding such content.

The prevalence of CSAM on the platform has been linked to its structure: unlike most centralized platforms, where content moderation is controlled across the entire network users connect to, Mastodon is a decentralized social media platform on which anyone can set up their own “instance.”

The server administrator then takes on the responsibility of managing content and enforcing the instance’s rules. Each instance is different and comes with its own set of rules.
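For readers unfamiliar with how a federated platform exposes this, the sketch below shows one way to retrieve the moderation rules an individual Mastodon server publishes about itself through its public API (GET /api/v1/instance/rules). This is an illustrative sketch only, assuming a reasonably recent Mastodon version; the instance URL is a placeholder.

```python
# Illustrative sketch: fetch the moderation rules that a single Mastodon
# instance publishes about itself. Each server sets and enforces its own
# rules; there is no central rulebook shared across the network.
# The instance URL below is a placeholder, not a real recommendation.
import requests

INSTANCE = "https://mastodon.example"  # hypothetical instance URL


def fetch_instance_rules(base_url: str) -> list[dict]:
    """Return the list of rules the instance advertises via its public API."""
    resp = requests.get(f"{base_url}/api/v1/instance/rules", timeout=10)
    resp.raise_for_status()
    # Each rule object looks roughly like {"id": "1", "text": "No illegal content"}.
    return resp.json()


if __name__ == "__main__":
    for rule in fetch_instance_rules(INSTANCE):
        print(rule["id"], rule["text"])
```

Because every instance answers this endpoint with its own list, two servers on the same network can apply entirely different standards to what they remove.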

Stanford’s report, titled “Child Safety on Federated Social Media,” explained how decentralized platforms like Mastodon “offer new challenges for trust and safety” due to the lack of a “central moderation team” dedicated to “removing imagery of violence or self-harm, child abuse, hate speech, terrorist propaganda or disinformation.”

David Thiel, one of the report’s authors, stated that the findings were “an unprecedented sum.”

“We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said Thiel.

“While Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism to report CSAM to the relevant child safety organizations,” the study stated. “Bad actors tend to go to the platform with the most lax moderation and enforcement policies.”
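For context on what “user reports and moderator tools” means in practice, the sketch below shows roughly how a report against a post could be filed through Mastodon’s reports endpoint (POST /api/v1/reports). The instance URL, access token, and IDs are placeholders, and the report only lands in that instance’s own moderation queue, which is the gap the study highlights.

```python
# Illustrative sketch: file a user report against a post on a Mastodon
# instance. The report goes to that instance's own moderators; as the
# Stanford study notes, nothing in this flow forwards material to an
# external child-safety organization.
# The instance URL, token, and IDs passed in are placeholders.
import requests

INSTANCE = "https://mastodon.example"   # hypothetical instance URL
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"      # placeholder OAuth token


def report_post(account_id: str, status_id: str, comment: str) -> dict:
    """Submit a report to the instance's local moderation queue."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/reports",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        data={
            "account_id": account_id,    # account being reported
            "status_ids[]": status_id,   # offending post
            "comment": comment,          # free-text note for moderators
            "category": "violation",     # report as a server-rule violation
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```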

For years now, concerns over content moderation and safety on social media platforms have been on the minds of policymakers, who have urged platforms like Instagram, TikTok, and Twitter to ensure that CSAM is not viewable on their services.
