Millions of people around the world come to TikTok to create, share, discover, and connect. We are committed to creating a safe and inclusive platform for our community, and believe this work is especially important for our teenage users, ages 13-17. Our comprehensive approach to youth safety and well-being includes robust policies, innovative technologies, in-app features, and educational resources. We have a dedicated team of experts in child and adolescent development who help ensure that our Youth Safety and Well-Being policies are age-appropriate and developmentally suitable. We also implement stringent advertising policies to ensure that ads on our platform are age-appropriate.
You must be at least 13 years old to have an account on TikTok (or other ages as indicated in our Privacy Policy and Terms of Service). We work collaboratively with industry partners, non-profits, academics, and governments to identify, implement, and share innovative solutions to better protect teens online. To ensure our work is informed by the insights and recommendations of young people themselves, we also launched TikTok's pioneering global Youth Council in partnership with specialist online safety agency Praesidio Safeguarding.
These are some of the deliberate decisions we've taken to build age-appropriate and developmentally suitable experiences that enable teens to have a safe space to create, share, discover, and connect:
Safeguards for all teen accounts (ages 13-17)
- Daily screen time is set to 60 minutes by default, which helps young people be intentional about the time they spend on TikTok.
- Accounts are private by default. This limits who can follow the account, view their videos, read their bio, and interact with their videos.
- Hosting LIVE content is not permitted. While teens can watch LIVE content, they can't host a LIVE.
- No financial transactions. Teens can't send or receive virtual gifts or buy or sell products on TikTok Shop.
- No push notifications at night. Push notifications are disabled from 9pm for teens aged 13-15, and from 10pm for teens aged 16-17.
- "Suggest Your Account to Others" is turned off by default. Even if it is turned on, teen accounts are not recommended to adults.
- Age-restricted effects. The use of some appearance effects is restricted for teens under 18.
Additional safeguards for younger teens (ages 13-15)
- No content in the For You feed. Content created by those under 16 is not eligible to appear in the For You feed.
- No direct messaging. Direct messaging is only available to accounts aged 16 and older.
- No Duet and Stitch. No one can Duet or Stitch videos published by an account belonging to someone under 16.
- No downloading of their videos. This control is set to Off by default and can't be changed, even if a teen chooses to have a public account.
- Friends-only comment settings. Even with a public account, teens under 16 can't change their comment permissions to allow 'Everyone' to comment on their content.
Our Content Levels system builds on these protections and is designed to prevent certain content with more mature or complex themes from reaching younger audiences. When we detect content that contains overtly mature or complex themes, a content level is allocated to the post to help prevent those under 18 years old from viewing it across the TikTok experience.
We’re invested in supporting our community's well-being and empowering families and young people to manage their own experience.
Family Pairing is designed to empower parents, guardians, and young people to customize their safety and content settings based on individual needs. It is part of our continued work toward providing parents and guardians with a better ability to guide their teen's online experience and to educate them about online safety and digital citizenship.
Family Pairing lets parents and guardians link their TikTok account to their teen's account to enable a variety of content and privacy settings.
We encourage caregivers to discuss the Family Pairing features with their teens and explain why they choose to turn them on. Even without Family Pairing enabled, parents and guardians can help their teens enable our app’s screen time offerings, including Daily Screen Time and Restricted Mode settings.
Learn more about Family Pairing on TikTok.
For everyone, regardless of age, we offer screen time management tools.
Learn more about screen time on TikTok.
The goal of the For You feed is to provide original content that honors our mission of inspiring creativity and bringing joy. Our recommendation system is designed with user safety as a primary consideration, meaning that some content is not eligible for recommendation.
We continually invest in tools that help people create the best TikTok experience for them. We offer a feature that lets people refresh their For You feed if their recommendations no longer feel relevant. If someone refreshes their feed, our recommendation system will then begin to surface more content based on new interactions.
People can also choose to automatically filter out content that uses specific hashtags or phrases from their For You feeds, or select "Not interested" to skip future videos from a particular creator or videos that use a particular sound.
Learn more about how TikTok recommends content.
TikTok offers an abundance of resources to support young people. Many of these can be found in our Safety Center, including the dedicated Well-being guide and Teen Safety Center guide, as well as in our Help Center and Privacy Center. We also aim to provide parents and guardians with resources they can use to have conversations with their family about digital safety and to decide on the most appropriate experience for them, such as the Guardian's Guide and Family Pairing features.
As stated, you must be at least 13 years old to have an account on TikTok. Our neutral, industry-standard age gate requires people to fill in their complete date of birth without any pre-populated minimum age. TikTok has a 12+ rating in the App Store (Apple) and is listed as Parental Guidance Recommended in Google’s Play Store. These ratings mean parents can use device-level controls to block their child from downloading the app.
In the US, if someone enters a birthday that confirms they are under 13 years old, they will be directed to a curated viewing experience with additional safeguards and privacy protections designed specifically for this audience.
Age assurance, which encompasses a range of methods for estimating or verifying age, continues to be one of the most complex policy areas that online platforms, policymakers and regulators grapple with. To advance industry-wide discussions, we launched a global Multi-Stakeholder Dialogue with WeProtect Global Alliance and the Centre for Information Policy Leadership, along with industry peers. This project brings together online platforms, regulators, policymakers, privacy and child rights organizations to discuss industry-wide approaches to age assurance that help make the online world safer for young people while also respecting their fundamental rights.
While most people understand the importance of being truthful about their age, some do not provide the correct information, a challenge many online services face. This is one reason why our commitment to enforcing our minimum age requirements does not end at the age gate; we take a number of additional approaches to identify and remove suspected underage account holders. We use signals such as keywords and in-app reports from our community to help detect potential underage accounts, and our safety team is trained to identify accounts that may belong to an underage person. If we believe an account belongs to a person under the age of 13, we remove it, and the account holder has an opportunity to appeal if they believe we have made a mistake. In addition, if we determine that an account with a different stated age in fact belongs to a young person under 18, we tag the account so that it benefits from additional age-appropriate safeguards. Information about removals of suspected underage accounts is shared as part of our quarterly transparency reporting.
TikTok does not tolerate child sexual exploitation and abuse (CSEA), which includes child sexual abuse material (CSAM), grooming, pedophilia, sextortion, sexual solicitation, and sexual harassment of young people under the age of 18. This includes content that is AI-generated. When we become aware of suspected CSEA, whether through our own proactive detection methods, community reporting, or industry partnerships, we take immediate action to remove it, permanently ban accounts, and submit reports to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit that refers cases to law enforcement globally. TikTok publishes bi-annual reports on the number of CSEA reports we make to NCMEC through their dedicated CyberTipline.
For more information on our comprehensive approach and extensive partnerships, please see Combating child sexual exploitation and abuse.