
Twitch Releases Its First Transparency Report, But Streamers Still Have Questions

The report contains a lot of data, but leaves many questions unanswered.


Twitch has released the first of what is set to be many Transparency Reports, with the streaming company promising to publish them twice a year. The report gives a top-down look at Twitch's safety and moderation systems before digging into more granular data about enforcement on the platform.

The report breaks safety into multiple tiers, with the broadest level being Twitch's community guidelines, followed by site-wide moderation, then channel-specific moderation, then the small suite of safety tools that are given to viewers.

The role of channel mods and creators is emphasized in the site's overall plan, and Twitch brags that 95% of channels now have either human moderators or AutoMod enabled--up from 93% in the first half of last year. The report also shows an increase in message deletions, with the rate rising from 3.2 to 4 deletions per 1,000 messages.

Twitch attributes much of the increase to changes it has made in the last year, including the rollout of the ModView dashboard in March 2020 and a change in the second half of last year that enabled AutoMod by default for new channels without a human moderator assigned.

While these changes have made life a little easier for creators and their mods, it's the Twitch-side moderation that most of the site's users have questions about. Twitch says it has quadrupled its number of "content moderation professionals" in the last year, though it still doesn't say how many it currently employs--or whether they are in fact Twitch employees rather than contractors from an outside company, as other social media platforms are known to use.

Twitch says that these professionals "work across multiple locations, and support over 20 languages, in order to provide 24/7/365 capacity to review reports as they come in across the globe," and also devotes a small part of the report to outlining some of the systems in place to reduce the harm on those workers of constantly viewing harmful content for moderation.

The report goes into granular detail on Twitch-side enforcements, including warnings issued, temporary suspensions, and permanent bans for infringements including hateful conduct, harassment, violent conduct, nudity, adult conduct, spam, and terrorist activity. Overall, it says enforcements have increased from 0.099 to 0.114 actions per thousand hours watched.

While the report offers a huge amount of data, it leaves unanswered many of the questions streamers and viewers most wanted addressed. Replies to Twitch's tweet of the report showed many users didn't think it went far enough. Streamers asked why Twitch's Community Guidelines are so often unevenly applied, while others called for transparency about why high-profile streamer Dr Disrespect was banned--with even the streamer himself reportedly never having been given a reason for the ban.

The report is just one of a number of measures Twitch introduced last year, after members of the streaming community complained of abusers using the platform to harass others.

"This report is the first of its kind for Twitch so we will look closely at the feedback we receive to inform how we can refine these reports moving forward," the blog post accompanying the report reads, directing users to UserVoice to submit their feedback. It looks like Twitch will get a lot of it on this report, so we'll have to wait and see what changes are made in time for 2021's half-year report.

