Mourning parents asked TikTok for age verification, got maturity ratings instead

(Image credit: Sol de Zuasnabar Brebbia | Moment)

TikTok’s safety features recently became the focus of a lawsuit filed by parents who claim that the app’s addictive design is responsible for the deaths of at least seven children, six of whom were too young to be on TikTok. Those parents urged the platform to better protect young users by adding an age verification process that could restrict content for, or terminate the accounts of, users under 13, the minimum age required to join TikTok.

That’s not the direction TikTok has decided to go, though. At least, not yet. Instead, TikTok announced on Wednesday that it is adding new safety measures for all users designed to limit exposure to harmful content and give users more control over what shows up in their feeds. That includes giving users the power to block content containing certain words, hashtags, or sounds.

Focusing specifically on its “teenage community members,” TikTok is also “working to build a new system to organize content based on thematic maturity”: essentially, maturity ratings for TikTok videos, like the ratings you see on movies or video games.
