Public Block and Report Metrics for User Accounts
Many social media platforms struggle with harassment and toxic behavior, which often goes unchecked because consequences for offenders are invisible to others. While users can block or report abusive accounts, these actions happen privately, allowing repeat offenders to continue their behavior without social accountability. One way to address this could be by publicly displaying how often an account has been blocked or reported, creating a visible deterrent against antisocial behavior.
How It Could Work
This idea would introduce two new metrics on user profiles:
- Block Count: The total number of unique users who have blocked the account.
- Report Count: The total number of times the account has been reported, with optional context (e.g., "reported for harassment").
These metrics could appear alongside follower counts but would include a visibility toggle, letting users hide them if desired. To prevent bias, counts would only reflect actions taken after the feature's launch.
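As a rough illustration, the sketch below models these metrics in Python. All names (ProfileMetrics, FEATURE_LAUNCH, and so on) are assumptions for this sketch, not a real platform API: blocks are stored as a set of user IDs so each blocker counts once, reports from before the launch date are ignored, and the owner's visibility toggle gates the public view.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical launch date: only events after this count toward public metrics.
FEATURE_LAUNCH = datetime(2024, 1, 1)

@dataclass
class ProfileMetrics:
    """Illustrative per-account metrics; all field names are assumptions."""
    blocked_by: set[str] = field(default_factory=set)      # unique blocker user IDs
    report_timestamps: list[datetime] = field(default_factory=list)
    metrics_public: bool = True                             # owner-controlled toggle

    def record_block(self, blocker_id: str) -> None:
        self.blocked_by.add(blocker_id)                     # set => one block per user

    def record_report(self, when: datetime) -> None:
        if when >= FEATURE_LAUNCH:                          # ignore pre-launch history
            self.report_timestamps.append(when)

    def public_view(self) -> dict | None:
        if not self.metrics_public:
            return None                                     # owner has hidden the metrics
        return {
            "block_count": len(self.blocked_by),
            "report_count": len(self.report_timestamps),
        }
```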
Potential Benefits and Challenges
Regular users might benefit from clearer signals about which accounts to avoid, while platforms could see improved community health with less moderation effort. However, there are risks:
- False reporting: Groups might mass-report a user unfairly. One way to counter this could be weighting reports by reporter credibility or requiring moderator review for suspicious spikes (see the sketch after this list).
- Stigmatizing controversial users: Legitimate but polarizing accounts might be unfairly targeted. Allowing users to add context or exclude disputed reports could help mitigate this.
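To make those countermeasures concrete, here is a hedged sketch of credibility-weighted scoring and spike detection. The function names, the default credibility of 0.5 for unknown reporters, and the 24-hour window with a 25-report threshold are all illustrative assumptions, not tuned values.

```python
from datetime import timedelta

def weighted_report_score(reports, credibility):
    """Sum reports weighted by each reporter's credibility (0.0 to 1.0).

    `reports` is a list of (reporter_id, timestamp) pairs; `credibility`
    maps reporter_id -> float. Both structures are assumptions.
    """
    return sum(credibility.get(reporter_id, 0.5) for reporter_id, _ in reports)

def spike_needs_review(reports, window=timedelta(hours=24), threshold=25):
    """Flag for moderator review if any single window contains too many reports.

    The window and threshold are illustrative tuning parameters.
    """
    times = sorted(t for _, t in reports)
    for i, start in enumerate(times):
        in_window = sum(1 for t in times[i:] if t - start <= window)
        if in_window >= threshold:
            return True
    return False
```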
Execution Strategy
A minimal version could start by showing block/report counts only to the account owner, gathering feedback before making them public. If effective, the feature could roll out as opt-in, then become the default (with an opt-out). Adding explanatory tooltips (e.g., "Blocked by 50+ users in the last 6 months") could help users interpret the data meaningfully.
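The tooltip copy could be generated by bucketing raw counts to round thresholds, so the display suggests scale without implying false precision. A minimal sketch, assuming illustrative bucket edges:

```python
def block_tooltip(block_count: int, months: int = 6) -> str:
    """Render a bucketed tooltip like 'Blocked by 50+ users in the last 6 months'.

    The bucket edges (10, 50, 100, 500, 1000) are illustrative assumptions.
    """
    for edge in (1000, 500, 100, 50, 10):
        if block_count >= edge:
            return f"Blocked by {edge}+ users in the last {months} months"
    return f"Blocked by fewer than 10 users in the last {months} months"
```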
Unlike existing reputation systems (e.g., Reddit karma or LinkedIn endorsements), this approach focuses on accountability rather than popularity. By making moderation data transparent, it could encourage better behavior without requiring major platform changes.
Project Type: Digital Product