Creator Moderation

Image credit: Unsplash

Being a content creator, such as a YouTuber, can nowadays be a profitable job for people who aim to earn money from creating and sharing content (primarily advertising income from YouTube or the Creator Fund from TikTok). But not all video content is advertiser-friendly or acceptable to communities. When YouTube deems a video unacceptable, it may demonetize 💵 💲 the video or the entire channel, placing limited or no ads on the content and thereby cutting the creator off from future ad revenue.

Content moderation originally referred to mechanisms for governing abuse and facilitating cooperation on online platforms; it has now also become a source of socioeconomic punishment on video-sharing platforms like YouTube and TikTok, going beyond the suppression of online expression.

This creator moderation consists of multiple governance mechanisms that manage content creators’ visibility, identity, revenue, labor, and more. Given the platformization and monetization of creative labor, video-sharing platforms like YouTube and TikTok tend to practice creator moderation through an assemblage of algorithms (e.g., monetization, content moderation, and recommendation algorithms). Creators may correspondingly experience moderation such as demonetization or shadowbanning. There is thus a need to understand creators’ moderation experiences.