Bluesky’s 2024 moderation report shows how quickly harmful content grew as new users flocked in

MT HANNACH

Bluesky has seen explosive growth over the last year, which has required the platform to step up its moderation efforts. In its recently published moderation report for 2024, Bluesky said it grew by about 23 million users, from 2.9 million to almost 26 million. Its moderators received 17 times more user reports than in 2023: 6.48 million in 2024, compared with 358,000 the previous year.

The bulk of these reports concerned “harassment, trolling, or intolerance,” spam, and misleading content (including impersonation and misinformation). Impersonation accounts surged following Bluesky’s spike in popularity, prompting the platform to take a “more aggressive” approach to suppressing them. At the time, it announced that it had quadrupled its moderation team. The new report states that Bluesky’s moderation team now numbers around 100 people and that recruiting is underway. “Some moderators specialize in particular policy areas, such as dedicated agents for child safety,” it notes.

Bluesky says it also received numerous reports in other categories, including “illegal and urgent issues” and unwanted sexual content. Another 726,000 reports were marked as “other.” Bluesky says it responded to 146 of the 238 requests it received last year from “law enforcement, governments, and law firms.”

The platform plans changes this year to how reports and appeals are handled, which it says will “streamline communication with users,” such as giving users updates on actions taken on the content they reported and, later, letting them appeal takedown decisions directly in the app. Moderators removed 66,308 accounts in 2024, while automated systems removed 35,842 spam and bot profiles. “Looking ahead to 2025, we are investing in stronger proactive detection systems to complement user reporting, as a growing network needs multiple detection methods to quickly identify and address harmful content,” Bluesky said.
