TikTok violated FTC children’s privacy orders, complaint says



A coalition of child and consumer privacy advocates filed a complaint Wednesday evening with the Federal Trade Commission alleging that TikTok continues to illegally collect and store children’s personal information despite a prior settlement requiring it to delete that data.

Last year, the FTC struck a deal with Musical.ly, the company now known as TikTok, which included a $5.7 million fine – its biggest ever for a violation of children’s privacy. The agency accused the company of illegally collecting sensitive personal data from children through its app – such as email addresses, names, photos and locations – without parental permission, and of denying parents’ requests to delete their children’s data.

In the new complaint, 20 groups, including the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, accuse the company of violating the terms of that agreement with the FTC, as well as the Children’s Online Privacy Protection Act more generally. The organizations allege that TikTok failed to comply with the FTC’s order to destroy all personal information it stores about users under 13 – including data on users who were under 13 when it was collected but who have since grown older.

Owned by Chinese company ByteDance, TikTok allows users to create and share short videos of themselves with millions of users. In a statement, a spokesperson for TikTok said, “We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users.”

TikTok offers accounts for users under 13 that collect less data and prevent kids from sharing videos with others. But these accounts still collect persistent identifiers and data on user activity, the advocacy groups say, and kids can simply lie about their age to access TikTok without restrictions.

The groups said in the complaint that they found many regular TikTok accounts run by young children, with videos uploaded as early as 2016. Regular accounts collect a wide range of data that is shared with third parties and used for targeted advertising.

“It is now clear that, a year later, they have not done the things the FTC told them to do,” said Michael Rosenbloom, a member of the Institute for Public Representation, a technology law clinic at Georgetown University that represents the advocacy groups in the FTC complaint.

The Children’s Online Privacy Protection Act of 1998 requires online services that are aimed at children, or that have “actual knowledge” that children use them, to obtain parental permission before collecting personal data from users under the age of 13. But even in its limited accounts for young users, TikTok hasn’t created a real mechanism for obtaining parental consent, Rosenbloom said.

A Dutch privacy watchdog said last week it would investigate how TikTok handles children’s data and whether parental consent is needed to collect and use that data. The company said it is cooperating with the Dutch authorities, according to Reuters.

The company has announced new features that could change the way some teenage users navigate the platform. On April 15, TikTok announced that it would deploy a new set of parental controls to its platform. The “Family Pairing” tool lets parents link their accounts to those of users aged 13-16 and manage how much time teens can spend on the app each day, limit content that might be inappropriate and restrict who can send direct messages to an account. The company said that starting April 30, it would automatically turn off direct messages for users under the age of 16.


