Take the recent discourse around the Diddy case. It exposed serious fault lines: how unchecked misogynoir, banter culture, and meme-driven abuse flourish even in spaces created by and for Black folks. So we have to ask: are platforms and regulators actually equipped to cope with that complexity? Are AI tools and human moderators being properly trained to spot harmful narratives without reinforcing bias? Are we building systems that understand Black communities, or simply ones that monitor us? Until we stop treating Black women’s safety as a side note, both online and offline, we’ll keep mistaking control for care.
What do joy and justice look like online?
Joy and justice online happen when trust and safety are treated as business fundamentals, not afterthoughts. When users feel safe to express themselves, to be fully human, to make mistakes, to be inspired by healthy role models and uplifting conversation, they stay longer, spend more, and become brand advocates.
Platforms that optimise for time well spent rather than raw watch-hours tend to see higher retention and lower churn. Pinterest, for example, reduced self-harm searches by 80% after shifting to wellbeing-first metrics, and daily active users increased. The takeaway? Healthier feeds mean healthier audiences that stay.
In 2021, as Founder of Glitch, I partnered with BT Sport and EE to launch the award-winning Draw the Line campaign. We combined real-time abuse detection with practical user actions: Spot. Report. Support. The campaign generated nearly £1 million in earned media and led to a 25% drop in abusive tweets within 48 hours of launch.
Justice online means moderating for both context and beliefs. AI models trained on dialect-rich datasets can reduce false positives for Black British vernacular, safeguarding freedom of expression while still tackling harmful content and negative stereotypes. I personally long for the day algorithms stop perpetuating misogynoir and misogynistic content, whether by failing to take it down or, worse, by amplifying the “strong, angry Black woman” trope. Balanced systems avoid over-policing marginalised communities and help brands steer clear of the reputational crises that follow when safety misfires.
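To make the dialect point concrete, here is a minimal sketch of the kind of bias audit it implies. The `model.predict` interface, the dialect labels, and the evaluation set are all hypothetical placeholders; the core idea is simply to compare how often benign posts from each dialect group are wrongly flagged as abusive.

```python
# A minimal bias audit: per-dialect false positive rates for a moderation model.
# `model`, the dialect tags, and `examples` are hypothetical placeholders.
from collections import defaultdict

def false_positive_rates(model, examples):
    """examples: iterable of (text, is_abusive, dialect) triples.

    Returns, per dialect group, the share of benign posts wrongly flagged.
    """
    flagged = defaultdict(int)  # benign posts the model flagged as abusive
    benign = defaultdict(int)   # all benign posts seen, per dialect
    for text, is_abusive, dialect in examples:
        if not is_abusive:
            benign[dialect] += 1
            if model.predict(text):  # truthy => model calls it abusive
                flagged[dialect] += 1
    return {d: flagged[d] / benign[d] for d in benign}

# A large gap between, say, "black_british_vernacular" and "standard_english"
# is the over-policing signal to close before a system ships.
```

Whatever the exact tooling, the design choice is the same: measure the disparity explicitly rather than trusting an aggregate accuracy number, because aggregates hide exactly the communities that get over-policed.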
How do we protect wellbeing?
Shift the success metrics. Track “positive engagement” like saves and meaningful comments, not just rage clicks (one way to score this is sketched after this list).
Budget for care. Ring-fence at least 5% of campaign spend for safety tooling, moderator training, and community education.
Design deliberate pauses. Introduce friction points like daily scroll limits and wellbeing nudges that encourage rest without harming revenue.
Co-create with users. Bring those most affected by online harms into your product labs. Their insight reduces rework and reputational risk.
Audit. Publish. Improve. Treat wellbeing metrics like carbon data: disclose them, benchmark progress, and iterate transparently.
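On the first point above, a “positive engagement” score can be computed from an ordinary analytics event log. This is a rough sketch under stated assumptions: the event names, the weights, and the word-count proxy for a meaningful comment are illustrative, not a standard.

```python
# A rough "positive engagement" score: saves and substantive comments
# weigh more than reflexive clicks. Weights and event names are illustrative.
from collections import Counter

WEIGHTS = {"save": 3.0, "meaningful_comment": 2.0, "share": 1.5,
           "comment": 0.5, "click": 0.2}
MIN_COMMENT_WORDS = 5  # crude proxy for a "meaningful" comment

def positive_engagement(events):
    """events: iterable of (event_type, payload) pairs; payload is the
    comment text for "comment" events and ignored otherwise."""
    counts = Counter()
    for kind, payload in events:
        if kind == "comment" and len(payload.split()) >= MIN_COMMENT_WORDS:
            kind = "meaningful_comment"
        counts[kind] += 1
    return sum(WEIGHTS.get(k, 0.0) * n for k, n in counts.items())

# Example: one save and one thoughtful comment outscore a burst of rage clicks.
events = [("save", None), ("comment", "This helped me set better boundaries"),
          ("click", None)] + [("click", None)] * 9
print(positive_engagement(events))  # 3.0 + 2.0 + 10 * 0.2 = 7.0
```

The precise weights matter less than the direction of travel: once saves and substantive comments count for more than rage clicks, teams start optimising for the feed people actually want to stay in.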
When brands treat digital wellbeing as a growth lever, joy and justice stop being nice-to-haves. They become strategic advantages, powering more resilient platforms, stronger user communities, and a healthier internet for the next generation.
What needs to change for women, especially Black and marginalised women, to participate as freely as men?
First, I think we need to be curious about the question itself. If the benchmark is to be as free as men, we’re aiming too low. Patriarchy doesn’t just restrict women; it also distorts how masculinity and gender-expansive identities are expressed online. It punishes vulnerability, narrows self-expression, and fuels harmful dynamics across the board.





