
Twitch builds toward a ‘layered’ safety approach with new moderator tools

  • Posted on 21 Jul 2022, 19:03


Moderating an online community is hard, often thankless work — and it’s even harder when it happens in a silo.

On Twitch, interconnected channels already informally share information on users they’d prefer to keep out. The company is now formalizing that ad hoc practice with a new tool that lets channels swap ban lists, inviting communities to collaborate on locking serial harassers and otherwise disruptive users out before they can cause problems.

In a conversation with TechCrunch, Twitch Product VP Alison Huffman explained that the company ultimately wants to empower community moderators by giving them as much information as possible. Huffman says that Twitch has conducted “extensive” interviews with mods to figure out what they need to feel more effective and to make their communities safer.

Moderators need to make a ton of small decisions on the fly, and the biggest one is generally figuring out which users are acting in good faith — not intentionally causing problems — and which ones aren’t.

“If it’s somebody that you see, and you say ‘Oh, this is a slightly off color message, I wonder if they’re just new here or if they are bad faith’ — if they’ve been banned in one of your friend’s channels, it is easier for you to go, ‘yeah, no, this is probably not the right person for this community,’ and you can make that decision easier,” Huffman said.

“That reduces the mental overhead for moderators, as well as more efficiently gets someone who’s not right for the community out of your community.”

Within the creator dashboard, creators and channel mods can send requests to other channels they’d like to trade lists of banned users with. The tool is bidirectional, so any channel that requests another streamer’s list will be sharing theirs in return. A channel can accept all requests to share ban lists or only allow requests from Twitch Affiliates, Partners and mutually followed channels. Any channel can swap ban lists with up to 30 other channels, making it possible to build a fairly robust list of users it would prefer to keep out, and channels can stop sharing their lists at any time.

*Image: Twitch shared ban list*

Channels can choose to either automatically monitor or restrict any account they learn about through these shared lists; restriction is the default. Users who are “monitored” can still chat, but they’ll be flagged so their behavior can be watched closely, and their first message will be highlighted with a red box that also displays where else they’ve been banned. From there a channel can opt to ban them outright or give them the all-clear and switch them to “trusted” status.
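The flow described above — bidirectional list sharing capped at 30 channels, with flagged users defaulting to “restricted” until a moderator reviews them — can be modeled in a short sketch. This is purely illustrative: the class and status names below are assumptions for the example, not Twitch’s actual API or data model.

```python
# Hypothetical model of the shared-ban-list flow described in the article.
# All names (Channel, FlagStatus, etc.) are illustrative, not Twitch's API.
from dataclasses import dataclass, field
from enum import Enum

MAX_SHARED_CHANNELS = 30  # per the article: lists can be swapped with up to 30 channels


class FlagStatus(Enum):
    RESTRICTED = "restricted"  # default for users found on a shared ban list
    MONITORED = "monitored"    # can chat, but first message is highlighted
    TRUSTED = "trusted"        # cleared by a moderator
    BANNED = "banned"          # banned outright in this channel


@dataclass
class Channel:
    name: str
    banned: set = field(default_factory=set)        # this channel's own bans
    sharing_with: set = field(default_factory=set)  # channels it trades lists with
    flags: dict = field(default_factory=dict)       # user -> FlagStatus

    def share_bans(self, other: "Channel") -> bool:
        """Accepting a request is bidirectional: both channels trade lists."""
        if (len(self.sharing_with) >= MAX_SHARED_CHANNELS
                or len(other.sharing_with) >= MAX_SHARED_CHANNELS):
            return False
        self.sharing_with.add(other.name)
        other.sharing_with.add(self.name)
        # Users learned about via sharing start as RESTRICTED by default.
        for user in other.banned:
            self.flags.setdefault(user, FlagStatus.RESTRICTED)
        for user in self.banned:
            other.flags.setdefault(user, FlagStatus.RESTRICTED)
        return True

    def review(self, user: str, status: FlagStatus) -> None:
        """A moderator can switch a flagged user to monitored, trusted, or banned."""
        self.flags[user] = status
```

For example, if channel A has banned a user and channel B accepts A’s sharing request, that user is immediately restricted in B until one of B’s moderators reviews them and decides otherwise.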

Twitch’s newest moderation tools are an interesting way for channels to enforce their rules against users whose behavior, while disruptive, may stop short of breaking the company’s broader guidelines prohibiting overt bad behavior. It’s not hard to imagine a scenario, particularly for marginalized communities, where someone with bad intentions could deliberately harass a channel without explicitly running afoul of Twitch’s rules against hate and harassment.

*Image: Twitch ban evasion and shared ban list*

Twitch acknowledges that harassment has “many manifestations,” but for the purposes of getting suspended from Twitch that behavior is defined as “stalking, personal attacks, promotion of physical harm, hostile raids, and malicious false report brigading.” There’s a gray zone of behavior outside of that definition that’s more difficult to capture, but the shared ban tool is a step in that direction. Still, if a user is breaking Twitch’s platform rules — and not just a channel’s local rules — Twitch encourages a channel to report them.

“We think that this will help with things that violate our community guidelines as well,” Huffman said. “Hopefully, those are also being reported to Twitch so we can take action. But we do think that it will help with the targeted harassment that we see impacting, in particular, marginalized communities.”

Last November, Twitch added a new way for moderators to detect users trying to skirt channel bans. That tool, which the company calls “Ban Evasion Detection,” uses machine learning to automatically flag anyone in a channel who is likely to be evading a ban, allowing moderators to monitor that user and intercept their chat messages.

The new features fit into Twitch’s vision for “layered” safety on its platform, where creators stream live, sometimes to hundreds of thousands of users, and moderation decisions must be made in real-time at every level.

