TikTok is being sued by DOJ and FTC for violating children’s privacy laws.

The Federal Trade Commission (FTC) and the U.S. Department of Justice (DOJ) have filed a lawsuit against the video-sharing platform TikTok for “flagrantly violating” the country’s children’s privacy laws.

According to the agencies, the company knowingly allowed children to sign up for TikTok accounts and view and send short videos and messages to adults and other users.

They also said that, in violation of the Children’s Online Privacy Protection Act (COPPA), the company unlawfully collected and retained extensive personal information about these children without notifying their parents or obtaining their consent.

They added that TikTok’s practices also violated a consent order from 2019 between the company and the government, in which the company promised to notify parents before collecting data from children and remove videos from users under the age of 13.

Under COPPA, online platforms must obtain parental consent before collecting, using, or disclosing personal information about children under the age of 13. Additionally, upon request from parents, it requires businesses to delete all collected information.

“The defendants unlawfully collected and retained children’s email addresses and other types of personal information,” the Department of Justice stated, “even for accounts that were created in ‘Kids Mode,’ a pared-down version of TikTok intended for children under 13.”

“Further, even when parents discovered their children’s accounts and asked the defendants to delete the accounts and the information in them, the defendants frequently failed to honor those requests.”

The ByteDance-owned company, according to the complaint, collected vast amounts of data on millions of children under the age of 13, enabling targeted advertising and allowing those children to interact with adults and access adult content.

It also accused TikTok of failing to exercise adequate due diligence during the account creation process, building backdoors that let children bypass the age gate meant to screen out users under 13 by signing in with third-party services such as Google and Instagram, with such accounts classified as “age unknown.”
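To make the alleged loophole concrete, here is a purely hypothetical sketch (not TikTok’s actual code; all names and behaviors are invented for illustration) of an age gate that screens native signups on a self-reported birthdate but lets third-party sign-ins through as “age unknown” accounts:

```python
# Hypothetical illustration of an age-gate bypass of the kind described in the
# complaint: native signups are checked against a birthdate, while third-party
# sign-ins skip the check entirely and are labeled "age unknown".
from datetime import date
from typing import Optional

MIN_AGE = 13


def age_from_birthdate(birthdate: date, today: Optional[date] = None) -> int:
    """Whole years elapsed since the given birthdate."""
    today = today or date.today()
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )


def create_account(signup_method: str, birthdate: Optional[date] = None) -> dict:
    """Create an account record, applying the age gate only to native signups."""
    if signup_method == "native":
        # Native signups are screened on the self-reported birthdate.
        if birthdate is None or age_from_birthdate(birthdate) < MIN_AGE:
            return {"status": "rejected", "reason": "under 13"}
        return {"status": "created", "age_status": "13 or older (self-reported)"}
    # Third-party sign-ins never reach the birthdate check, so the account is
    # created anyway and labeled "age unknown" -- the gap regulators say let
    # under-13 users slip through.
    return {"status": "created", "age_status": "age unknown"}


if __name__ == "__main__":
    print(create_account("native", date(2015, 6, 1)))  # blocked by the age gate
    print(create_account("third_party"))               # bypasses the age gate
```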

“TikTok human reviewers allegedly spent an average of only five to seven seconds reviewing each account to determine whether it belonged to a child,” the FTC said, adding that it will take steps to protect children’s privacy from firms that deploy “sophisticated digital tools to surveil kids and profit from their data.”

The lawsuit is the latest setback for the video platform, which is already the subject of a law that would force a sale or ban of the app by early 2025 over national security concerns. TikTok has more than 170 million active users in the United States. The company has disputed the allegations and has petitioned a federal court to overturn the law.

TikTok stated, “We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed. We offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screen time limits, Family Pairing, and additional privacy protections for minors.”

The platform’s handling of children’s data has also drawn global scrutiny. In September 2023, European Union regulators fined TikTok €345 million for violating data protection laws in its processing of children’s data. The U.K.’s Information Commissioner’s Office (ICO) fined it £12.7 million in April 2023 for improperly processing the data of 1.4 million underage users of its platform without parental consent.

Separately, the ICO has announced that it has asked 11 media and video-sharing platforms to improve their children’s privacy practices or face enforcement action. The names of the offending services were not made public.

“Eleven out of the 34 platforms are being asked about issues relating to default privacy settings, geolocation or age assurance, and to explain how their approach conforms with the [Children’s Code],” it said. “We are also speaking to some of the platforms about targeted advertising to set out expectations for changes to ensure practices are in line with both the law and the code.”
