
Bluesky Transparency Report Highlights Rising User Reports and Platform Enforcement
The Bluesky transparency report reveals how a fast-growing social network is managing safety, compliance, and scale. The company released its first comprehensive transparency report, detailing actions taken by its Trust and Safety team. The report covers moderation, regulatory compliance, account verification, and enforcement trends across 2025.
During the year, Bluesky grew nearly 60%, increasing from 25.9 million users to 41.2 million users. This growth includes accounts hosted on Bluesky infrastructure and those running independent infrastructure under its decentralized AT Protocol. User activity also surged, with 1.41 billion posts made in 2025 alone. That figure represents 61% of all posts ever created on the platform.
As platforms scale, transparency becomes a governance signal. The Bluesky transparency report positions the company’s internal controls and reporting practices as central to sustaining trust at higher volumes.
User Growth and Content Volume Drive Moderation Pressure
Rapid growth directly influenced moderation activity. In 2025, 235 million posts included media, accounting for 62% of all media posts ever shared on Bluesky. As activity increased, user-submitted moderation reports also rose.
The platform recorded 9.97 million user reports in 2025, up 54% from 6.48 million reports in 2024. However, Bluesky noted that this increase closely tracked its 57% user growth during the same period. Approximately 3% of users, or 1.24 million accounts, submitted reports during the year.
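The report's framing is that reporting volume scaled with the user base rather than outpacing it. A minimal sketch of that comparison, using the rounded figures quoted above (the exact percentages Bluesky cites likely come from unrounded internal data, and the normalization below is illustrative rather than the company's own methodology), might look like this:

```python
def pct_growth(prev: float, curr: float) -> float:
    """Year-over-year growth as a percentage."""
    return (curr - prev) / prev * 100

# Rounded figures quoted in the transparency report coverage above.
users_2024, users_2025 = 25.9e6, 41.2e6        # total accounts
reports_2024, reports_2025 = 6.48e6, 9.97e6    # user-submitted moderation reports

print(f"user growth:   {pct_growth(users_2024, users_2025):.1f}%")      # ~59% from rounded inputs; report cites ~57%
print(f"report growth: {pct_growth(reports_2024, reports_2025):.1f}%")  # ~54%, matching the reported increase

# Normalizing by audience size: reports per 1,000 users stay roughly flat
# (~250 in 2024 vs ~242 in 2025), consistent with the claim that report
# volume tracked user growth rather than exceeding it.
print(f"reports per 1,000 users, 2024: {reports_2024 / users_2024 * 1000:.0f}")
print(f"reports per 1,000 users, 2025: {reports_2025 / users_2025 * 1000:.0f}")
```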
The most common report category was misleading content, including spam, which represented 43.73% of all reports. Harassment followed at 19.93%, while sexual content accounted for 13.54%. Other categories formed 22.14% of reports.
Breakdown of Report Categories and Enforcement Signals
Within misleading content, spam alone accounted for 2.49 million reports out of 4.36 million. Harassment reports totaled 1.99 million, with hate speech representing the largest defined share. Other harassment-related activity included targeted harassment, trolling, and doxxing.
Bluesky stated that many harassment reports involved antisocial behavior that did not meet stricter policy categories. This distinction influenced how enforcement decisions were applied.
Sexual content reports totaled 1.52 million, primarily related to mislabeling adult content. Smaller volumes involved nonconsensual imagery, abuse content, and deepfakes. Violence-related reports reached 24,670, spanning threats, glorification of violence, and extremist content.
In parallel, automated systems flagged 2.54 million potential violations, reinforcing the platform’s hybrid approach to moderation.
Platform Design Changes Reduce Antisocial Behavior
The Bluesky transparency report highlighted one operational success. After introducing a system that reduced the visibility of toxic replies by placing them behind an extra click, daily reports of antisocial behavior dropped by 79%.
In addition, reports per 1,000 monthly active users declined 50.9% from January to December. These outcomes suggest that interface-level interventions can materially affect reporting volumes without removing users outright.
For organizations evaluating governance frameworks, such measures offer insight into balancing speech, safety, and user autonomy. Teams exploring similar strategies can review related service models at https://uttkrist.com/explore/, where global solutions support governance, compliance, and platform operations across digital ecosystems.
Rising Legal Requests and Account Takedowns
Beyond moderation, Bluesky reported a more than fivefold increase in legal requests in 2025. The company received 1,470 requests from law enforcement, regulators, and legal representatives, compared with 238 requests in 2024.
Enforcement actions also increased. Bluesky removed 2.44 million items in 2025, including accounts and content. This marked a sharp rise from prior years. The platform issued 3,192 temporary suspensions and enforced 14,659 permanent removals for ban evasion. Most permanent actions targeted inauthentic behavior, spam networks, and impersonation.
Despite higher removals, the company emphasized labeling over expulsion. In 2025, Bluesky applied 16.49 million labels to content, a 200% year-over-year increase. Account takedowns grew 104% over the same period.
This approach reflects a moderation philosophy focused on visibility control rather than blanket exclusion. For enterprises navigating similar trade-offs, https://uttkrist.com/explore/ outlines service pathways that enable scalable policy execution without undermining platform integrity.
Transparency as a Strategic Operating Signal
This is the first time Bluesky has published a report covering moderation, compliance, and verification in a single document. Earlier reports focused narrowly on moderation metrics. The expanded scope signals a shift toward broader accountability as regulatory and user scrutiny increase.
The Bluesky transparency report shows how growth, automation, and policy enforcement intersect at scale. It also underscores the operational costs of decentralization when paired with mainstream adoption.
As digital platforms mature, transparency reporting may evolve from a compliance artifact into a strategic requirement. How will emerging networks balance openness, enforcement, and user trust as activity volumes continue to rise?
Explore Business Solutions from Uttkrist and our Partners: https://uttkrist.com/explore


