Content removed? TikTok now explains why

TikTok wants to make the app more transparent and is rolling out a notification system that tells creators why their content has been removed from the platform.

TikTok is introducing a new process so that creators will know in the future why their content was removed. The short video app is now rolling out a notification system that informs TikTokers which of the platform's guidelines they have violated. TikTok wants to make the app more transparent for all users. The social media platform explains:

For the past few months, we’ve been experimenting with a new notification system to bring creators more clarity around content removals. Our goals are to enhance the transparency and education around our Community Guidelines to reduce misunderstandings about content on our platform, and the results have been promising.

TikTok also reports that new guideline violations decreased after users received a notification. The company's website states:

We’ve also seen a 14% reduction in requests from users to appeal a video’s removal. We believe this helps foster greater understanding of the kind of positive content and welcoming behavior that makes our community thrive.

Violations of TikTok’s guidelines are now explained

If a clip violates one of the short video app's guidelines, the creator receives a message. It explains which TikTok guideline the content violated and why it was therefore removed. TikTokers are also shown a link to the platform's full guidelines, so users can learn more about the rules in the app. Creators can also appeal a removal directly within the new notification system by tapping the "Submit an appeal" button. In addition, TikTok offers further support if a user posts clips involving, for example, self-harm: these users receive a separate message showing them who they can contact.

TikTok’s new notification system shows which guideline has been violated, © TikTok

TikTok removes content to protect users – but is it always right?

TikTok has been criticized several times in the past for its moderation guidelines. Initially, the platform did not only remove potentially dangerous content: clips from older, overweight, disabled, or poor users were also deleted or had their reach restricted. TikTok has continuously revised its moderation guidelines over the past two years and tries to be as transparent as possible about them. The introduction of the new notification feature is intended as a further step in this direction.
