How natural language processing helps promote inclusivity in online communities

Nov 16, 2022 | Technology


Presented by Cohere

To create healthy online communities, companies need better strategies to weed out harmful posts. In this VB On-Demand event, AI/ML experts from Cohere and Google Cloud share insights into the new tools changing how moderation is done.


Game players experience a staggering amount of online abuse. A recent study found that five out of six adults (ages 18–45) have experienced harassment in online multiplayer games, which amounts to more than 80 million gamers. Three out of five young gamers (ages 13–17) have been harassed, which amounts to nearly 14 million more. Identity-based harassment is on the rise, as are instances of white supremacist rhetoric.


It’s happening in an increasingly raucous online world, where roughly 2.5 quintillion bytes of data are produced every day. That volume makes content moderation, always a tricky, human-driven undertaking, a bigger challenge than it has ever been.
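At that scale, much of the first pass has to be automated. As a rough illustration only, and not the specific tooling discussed in the event, the sketch below uses a publicly available toxicity classifier from the Hugging Face `transformers` library to flag posts for human review; the model choice and the score threshold are assumptions made for the example.

```python
# Illustrative sketch: automated first-pass toxicity triage for community posts.
# Assumes the Hugging Face `transformers` library and the publicly available
# `unitary/toxic-bert` model; the 0.5 threshold is an arbitrary example value.
from transformers import pipeline

# Load a pretrained text-classification pipeline for toxicity detection.
toxicity_classifier = pipeline("text-classification", model="unitary/toxic-bert")

def triage_posts(posts, threshold=0.5):
    """Split incoming posts into those flagged for human review and the rest."""
    flagged, passed = [], []
    for post in posts:
        result = toxicity_classifier(post)[0]  # e.g. {"label": "toxic", "score": 0.97}
        if result["score"] >= threshold:
            flagged.append((post, result["score"]))
        else:
            passed.append(post)
    return flagged, passed

if __name__ == "__main__":
    sample = [
        "Great match, everyone played really well!",
        "You are trash, uninstall the game and never come back.",
    ]
    flagged, passed = triage_posts(sample)
    print("Flagged for review:", flagged)
    print("Passed:", passed)
```

In practice, a pipeline like this would sit in front of human moderators, surfacing the highest-risk posts rather than replacing judgment calls.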

“Competing arguments suggest it’s not a rise in harassment, it’s just more visible because gaming and social media have become more popular — but what it really means is that more people than ever are experiencing toxicity,” says Mike Lavia, enterprise sales lead at Cohere. “It’s causing a lot of harm to people and it’s causing a lot of harm in the way it creates negative PR for gaming and other social communities. It’s also asking developers to balance moderation and monetization, so now develop …

