TikTok’s testing a way for users to toss tomatoes at comments they don’t like.

The ByteDance-owned platform announced today that it’s introducing a private dislike button for users to flag comments they think are “inappropriate or irrelevant” beyond the scope of egregious stuff TikTok’s Community Guidelines already bans, such as hate speech and harassment.

Like Twitter downvotes, dislikes are private. If you dislike a comment, only you will see that dislike. Users won’t be notified when their own comments are disliked, and no one will be able to see if another user’s comment has been disliked.

Also similar to Twitter downvotes, disliking a comment doesn’t mean TikTok will remove it. (Which is why, when it comes to Community Guidelines-violating comments that need to be removed, users should still go the route of TikTok’s report function.)

TikTok says it considers dislikes “community feedback” that “will add to the range of factors we already use to help keep the comment section consistently relevant and a place for genuine engagement.”

The company told TechCrunch the dislike button is currently being tested in some regions, but not in the U.S.

TikTok dropped its quarterly Community Guidelines Enforcement Report today, too

Along with unveiling dislikes, TikTok said it’s additionally experimenting with a reminder system that will “guide creators to our comment filtering and bulk block and delete options.”

Reminders will be deployed to “creators whose videos appear to be receiving a high proportion of negative comments,” TikTok added.

These announcements are not randomly timed: TikTok today released its latest Community Guidelines Enforcement Report. The report showed that in the fourth quarter of 2021, TikTok removed 85,794,222 videos for violations of its Community Guidelines (around 1% of all videos uploaded to TikTok in that timeframe, it says).

5.7% of those removals were for violations of TikTok’s policies against harassment and bullying. Another 7.4% were due to violations related to suicide, self-harm, and dangerous acts; 1.5% were removed for “hateful behavior”; and 0.8% for “violent extremism.” The largest proportion of removals (45.1%) were due to violations of TikTok’s minor safety policies.

You can read the full report here.

Source: TubeFilter.com