Three years after its launch, TikTok has grown rapidly to accumulate more than two billion downloads, making the video-sharing social network one of the most popular apps of the moment, beating even Instagram and YouTube in consumption time in the United States, the United Kingdom and Spain, particularly among younger age segments.
TikTok might be described as a repository of canned content to be conveniently remixed with user-created videos, a viral meme-generating machine that makes users feel like rock stars. It has even been used to coordinate protests against Donald Trump, sinking his COVID-19 comeback rally last week in Tulsa, as well as launching denial-of-inventory attacks on the US president’s merchandising site.
And of course if you dare to criticize TikTok, you’ll simply be told that you’re too old to understand it, that young people have different criteria to yours. Which is all well and good, but what is TikTok? How reliable are the criteria of young people?
TikTok’s story is like many others on the web: an app for recording short music videos using simple editing tools, offering all kinds of incentives and competitions, goes viral and is downloaded in record numbers in several countries, becoming the most valuable startup in the world, ahead of Uber.
Along the way came an acquisition, consolidation and a great deal of expectation: from competing in some areas with the all-powerful WeChat super-app to becoming, according to some, the future of the music industry.
So far, so good. Any problem? Well, there is the small matter that many TikTok users are girls aged 13 or younger who record themselves dancing and lip-syncing to their favorite songs, often trying to be as provocative or daring as possible. Sure, data is hard to find: these apps publish very little about their demographics, and under-13s tend to lie when they register, don’t have their own profiles in the app stores, or use devices registered in their parents’ or siblings’ names. But in the case of TikTok, all you have to do is look at many of its videos. Now, if you put a whole bunch of videos of pubescent girls dancing to their favorite music on a social network with recommendation tools, it’s going to become a magnet for sexual predators, who are likely to try to contact them through the app’s chat feature; what’s more, the app even helps users find videos of a certain type.
The authorities have finally reacted: the FTC has already fined TikTok $5.7 million for storing profiles and personal information of children under thirteen without their parents’ consent, as well as for making those profiles public and even, until October 2016, for allowing them to share their location. What could possibly go wrong? Now the UK authorities have begun to investigate TikTok for the same reasons. The app has already been banned in Indonesia and India.
Right after being fined by the FTC, the company published a note and updated the app with an age gate: all users will need to verify their age, and under-13s will then be directed to a separate, more restricted in-app experience that protects their personal information and prevents them from publishing videos to the platform. Guess what’s going to happen when kids realize that the app does not allow them to publish their videos? Easy as one, two, three: they will simply lie about their age.
TikTok, or its Chinese parent company, ByteDance, probably isn’t worried about this, or even about the fines: it makes vast amounts of money from in-app purchases of the emojis users share on other users’ videos.
What are parents supposed to do? Force their children to uninstall TikTok from their smartphones? In all honesty, I think that’s going too far. It’s important to be aware of the possible risks involved with these kinds of apps, but even more important is the need to educate our children about responsibility and these new communication tools, about the need to respond rapidly to any kind of unwanted communication and to report anything that could constitute a criminal offense.
When it comes to our children, safety is the watchword, but trying to lock them away from the world will only make them more curious about what’s out there: instead, we must treat the internet and social networks in the same way we do other potential risks, in an even-handed way. At the same time, we have the right to expect the owners of the platforms our children use to be responsible for any possible misuses, which in some cases, they encourage.
Since then, the US armed forces have forbidden personnel from using it and describe it as a threat to cybersecurity. Israeli cybersecurity company Check Point has investigated it and concluded it has backdoors and major vulnerabilities, as well as overall security issues. The US government is also investigating it. Meanwhile, Reddit CEO and co-founder Steve Huffman describes it as a “fundamentally parasitic app that is always listening” and warns against installing what he calls “spyware”. Several child advocacy groups say it poses a clear risk to children. Apple claims it has caught TikTok using clipboard-capture mechanisms to spy on millions of users.
Other investigations reveal that its content censorship standards are decided by the Chinese government and are clearly discriminatory. A cybersecurity expert who has reverse engineered the app warns people to stay away from it. In short, it’s not hard to find evidence of the problematic nature of TikTok. And yet it thrives, a time bomb in the making.
Why are so many people and institutions attacking a seemingly innocent app? Are we critics just a bunch of out of touch old fogeys?
I don’t believe so: as I said a year ago, TikTok is a lesson in irresponsibility, dangerous by design. And not simply through carelessness, mistake or default: this is a deep and patent irresponsibility, a philosophy focused on the constant capture of all kinds of user data. In short, not recommendable for children or adults, particularly thanks to its sinister and addictive content recommendation system. And now it operates under the benevolent guise of a Western CEO formerly at Disney.
All tools can be adapted to almost any use. Many young and not-so-young people who use TikTok today consider it fun, a fad, a way of expressing themselves, or even a vehicle for activism. But it’s not that, or at least it’s not just that. It’s the application of a philosophy on the internet — we want to see everything, know everything, analyze everything without limits. It’s taken many years to recognize Facebook for what it is and to try to bring it into line through boycotts: we should act now to limit TikTok and any malevolent activities.
Have no illusions: beneath its seemingly innocent exterior, TikTok at this stage could very well pose a public danger. If you know nothing about cybersecurity, trust the many analysts who have been saying so for a while. Or ask the Indian government.
About the Author
This article was written by Enrique Dans, professor of Innovation at IE Business School and blogger at enriquedans.com.