Many social media platforms struggle with harassment and toxic behavior, which often goes unchecked because consequences for offenders are invisible to others. While users can block or report abusive accounts, these actions happen privately, allowing repeat offenders to continue their behavior without social accountability. One way to address this could be by publicly displaying how often an account has been blocked or reported, creating a visible deterrent against antisocial behavior.
This idea would introduce two new metrics on user profiles:
- Block count: how many users have blocked the account.
- Report count: how many times the account has been reported.
These metrics could appear alongside follower counts but would include a visibility toggle, letting users hide them if desired. To avoid retroactively penalizing accounts for past behavior, counts would only reflect actions taken after the feature's launch.
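As a rough sketch of how these counts might be stored and derived, the TypeScript below models block/report actions as an event log and computes the two metrics, only counting events after a placeholder launch date. The type and function names (ModerationEvent, ProfileMetrics, profileMetrics) and the launch date are illustrative assumptions, not an existing platform's API.

```typescript
// Hypothetical data model for the two profile metrics, assuming the platform
// keeps an event log of block/report actions. All names are illustrative.

interface ModerationEvent {
  targetAccountId: string;   // account being blocked or reported
  actorAccountId: string;    // account taking the action
  kind: "block" | "report";
  createdAt: Date;
}

interface ProfileMetrics {
  blockCount: number;        // distinct accounts that have blocked this profile
  reportCount: number;       // total reports filed against this profile
  metricsVisible: boolean;   // owner-controlled visibility toggle
}

// Counts only include actions taken after launch, per the proposal above.
const FEATURE_LAUNCH = new Date("2025-01-01"); // placeholder launch date

function profileMetrics(
  events: ModerationEvent[],
  accountId: string,
  metricsVisible: boolean
): ProfileMetrics {
  // Keep only post-launch events aimed at this account.
  const relevant = events.filter(
    (e) => e.targetAccountId === accountId && e.createdAt >= FEATURE_LAUNCH
  );
  // Count each blocker once, however many times they re-block.
  const blockers = new Set(
    relevant.filter((e) => e.kind === "block").map((e) => e.actorAccountId)
  );
  const reportCount = relevant.filter((e) => e.kind === "report").length;
  return { blockCount: blockers.size, reportCount, metricsVisible };
}
```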
Regular users might benefit from clearer signals about which accounts to avoid, while platforms could see improved community health with less moderation effort. However, there are risks: coordinated blocking or false reporting could be weaponized to inflate an innocent account's counts, and raw numbers shown without context could be misinterpreted or fuel pile-ons.
A minimal version could start by showing block/report counts only to the account owner, allowing for feedback before making them public. If effective, the feature could roll out as opt-in, then default (with opt-out). Adding explanatory tooltips (e.g., "Blocked by 50+ users in the last 6 months") could help users interpret the data meaningfully.
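To make the staged rollout concrete, here is a small sketch of the visibility check and tooltip text, assuming three stages (owner-only, opt-in, default-on with opt-out). The stage names, the settings shape, and the 6-month/"50+" bucketing are assumptions for illustration, not a specification.

```typescript
// Staged rollout: owner-only first, then opt-in, then default-on with opt-out.
// Stage names and settings fields are hypothetical.

type RolloutStage = "owner-only" | "opt-in" | "default-on";

interface MetricsSettings {
  optedIn: boolean;   // relevant during the opt-in stage
  optedOut: boolean;  // relevant once metrics are on by default
}

function canViewMetrics(
  stage: RolloutStage,
  viewerId: string,
  ownerId: string,
  settings: MetricsSettings
): boolean {
  if (viewerId === ownerId) return true; // owners always see their own counts
  switch (stage) {
    case "owner-only":
      return false;
    case "opt-in":
      return settings.optedIn;
    case "default-on":
      return !settings.optedOut;
  }
}

// Explanatory tooltip, e.g. "Blocked by 50+ users in the last 6 months".
function blockTooltip(recentBlockCount: number): string {
  if (recentBlockCount < 10) {
    return "Blocked by fewer than 10 users in the last 6 months";
  }
  const bucket = Math.floor(recentBlockCount / 10) * 10; // coarse bucket
  return `Blocked by ${bucket}+ users in the last 6 months`;
}
```

Bucketing the count ("50+") rather than showing an exact figure is one way to keep the tooltip interpretable without implying false precision.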
Unlike existing reputation systems (e.g., Reddit karma or LinkedIn endorsements), this approach focuses on accountability rather than popularity. By making moderation data transparent, it could encourage better behavior without requiring major platform changes.
Project Type: Digital Product