On Friday, Bluesky released its moderation report for last year, noting the social network's massive growth in 2024 and how that affected the workload of its Trust and Safety team. The report also noted that the largest number of reports came from users flagging accounts or posts for harassment, trolling, or intolerance, a problem that has plagued Bluesky as it has grown and has at times even led to mass protests over individual moderation decisions.
The company's report did not explain why it did or did not take action against individual users, including those on the most-blocked lists.
The company added more than 23 million users in 2024 as Bluesky became a new destination for former Twitter/X users for a variety of reasons. Throughout the year, the social network benefited from several changes at X, including its decision to change how blocking works and to train AI on user data. Other users left X after the results of the U.S. presidential election, put off by how X owner Elon Musk's politics began to dominate the platform. The app also gained users while X was temporarily banned in Brazil back in September.
To meet the demand caused by this growth, Bluesky said it increased its moderation team to roughly 100 moderators and continues to hire. The company also began offering psychological counseling to team members to help them cope with the difficult job of constant exposure to graphic content. (An area we hope AI will one day address, as humans are not built to handle such work.)
In total, Bluesky's moderation service received 6.48 million reports, 17 times more than in 2023, when there were only 358,000 reports.
Starting this year, Bluesky will accept moderation reports directly from its app. Similar to X, this will allow users to more easily track actions and updates. Later, it will also support in-app appeals.
When Brazilian users flooded Bluesky in August, the company was seeing as many as 50,000 reports per day at its peak. That led to a backlog of unresolved moderation reports and required Bluesky to hire more Portuguese-speaking staff through a contract vendor.
In addition, Bluesky began automating more report categories beyond just spam to help it combat the influx, though this sometimes led to false positives. Still, automation helped cut processing time to just "seconds" for high-certainty cases; before automation, most reports were handled within 40 minutes. Human moderators are now kept in the loop to resolve false positives and appeals, even when they don't make the initial decision.
Bluesky said 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most of these reports, 3.5 million, were about individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo. Lists were reported 45,000 times; DMs were reported 17,700 times; and feeds and starter packs received 5,300 and 1,900 reports, respectively.
Most of the reports were about antisocial behavior like trolling and harassment — a signal from Bluesky users that they want to see a less toxic social network than X.
Other reports were for the following categories, Bluesky said:
The company also offered an update on its labeling service, which adds labels to posts and accounts. Human labelers added 55,422 "sexual figure" labels, followed by 22,412 "rude" labels, 13,201 "spam" labels, 11,341 "intolerant" labels, and 3,046 "threat" labels.
In 2024, 93,076 users submitted a total of 205,000 appeals against Bluesky’s moderation decisions.
There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky additionally fielded 238 requests from law enforcement, governments, and legal firms. The company responded to 182 of these and complied with 146. Most of the requests were law enforcement requests from Germany, the U.S., Brazil, and Japan, it said.
Bluesky's full report also covers other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing and Exploited Children (NCMEC).