
Dealing with data probes is never pleasant. It can be stressful and confusing, but it is a necessary part of doing business, especially for a tech company: you have to contend with data privacy laws and regulations, the potential for fines, and the obligation to ensure that your data partners are doing what they can to keep your users safe. The problem is that many tech companies have yet to take these important steps. Many still rely on Standard Contractual Clauses (SCCs) to guarantee that their data partners are taking appropriate measures, yet they do not provide equal protection to users across the world.
China’s ownership threatens to boil over
Among the major questions surrounding TikTok are whether the Chinese government has tapped into the app’s user data and whether Chinese companies are cooperating with government surveillance requests. The answers to these questions will bring new urgency to regulators on both sides of the Atlantic.
TikTok, a short-form video service with around 700 million users globally, is owned by the Chinese company ByteDance. According to reports, the company has been in negotiations with Oracle Corp. and plans to store American users’ data on Oracle’s servers in the U.S.
Chinese law could compel tech companies to turn over user data, including individual users’ data, to the Chinese government, which could then use it to fuel misinformation campaigns.
TikTok’s executives have said that they are under no pressure to hand over user data to the Chinese government, that they operate independently of ByteDance’s Beijing headquarters, and that decisions about data security are made by executives in Dublin.
TikTok has long denied any ties to the Chinese government. The company has repeatedly told users that their data is secure, and has said that it would never censor content even if the Chinese government asked it to.
However, a BuzzFeed report revealed that TikTok engineers and moderators are located in China, and Chinese government officials have been known to back-channel surveillance requests to Chinese companies.
A former TikTok employee has alleged that the Chinese government asked TikTok to censor videos discussing Tiananmen Square, and an internal company report revealed that moderators were instructed to censor such videos. This stands in stark contrast to TikTok’s claim that it is not involved in the Chinese government’s surveillance efforts.
It’s not giving children around the world equal levels of protection
Despite TikTok’s ubiquity, its safety measures have not kept pace, and the protections a child receives depend on where they live. A child in the UK, for example, is handled under one set of rules, while users in other countries, such as Ireland, are treated differently.
TikTok has also been the target of numerous regulatory investigations, ranging from its role as a social network to its role as a repository for children’s sensitive data. ByteDance’s valuation has risen dramatically over the past year, with Chinese investors retaining firm control of the company. TikTok has seen its share of controversy and has faced a gauntlet of legal actions, including a class-action suit and some of the aforementioned regulatory investigations. This has prompted some European lawmakers to point to the higher standards of child protection in their own countries. In the long run, TikTok has to tread a fine line between giving kids what they want and keeping them out of harm’s way.
TikTok has also received several accolades, including an award for its video-streaming technology, a patent for its facial recognition feature, and a mention in Forbes as one of the world’s biggest tech companies.
Age-gating technology can endanger the physical and mental health of young users
eSafety, Australia’s online safety regulator, has identified age-gating technology as one of its five domains of interest. This is a laudable endeavor: eSafety has a proven track record of protecting and educating the most vulnerable members of the community, and is in the business of fostering responsible technology and content consumption among Australians of all ages. Its mission is to ensure that all Australians enjoy the benefits of a modern digital society. The regulator has launched a number of initiatives in the past year and continues to be a leader in digital safety, with a remit over regulated electronic services spanning the gamut from online gambling to social media.
eSafety recently launched what it describes as the world’s first e-Safety Research Center, which will provide a forum for sharing online safety research findings across a range of industry sectors. It has already begun a series of consultation sessions with key industry players and will be kicking off a number of new projects aimed at furthering the objectives of its research division.
It uses SCCs to guarantee data partners are taking measures
Since the implementation of the EU’s General Data Protection Regulation (GDPR), many companies have used Standard Contractual Clauses (SCCs) to guarantee that data partners are taking appropriate measures to protect users’ data. The European Commission recently published a Frequently Asked Questions document on SCCs, offering practical guidance on their application across 44 questions and answers. The document covers transfers within the European Economic Area (EEA) as well as transfers outside it, and the Commission plans to update it on an ongoing basis in response to stakeholder feedback.
SCCs are legally binding agreements, but they carry no formal signature requirements: the parties may sign them electronically, for example by signing the annexes to the SCCs, or incorporate them into commercial contracts by reference. It is important to note that SCCs may not include a general exculpation from liability, although parties may add supplementary terms in accordance with the civil-law requirements of the jurisdiction where the contract is made. It is also worth noting that click-through buttons alone cannot be used to conclude SCCs, since the emphasis is on signing.
In addition to using SCCs, TikTok must provide appropriate safeguards to meet the EU standard; these can take the form of binding corporate rules or Standard Contractual Clauses. It is also worth noting that the Commission is developing an additional set of transfer SCCs for data importers that are themselves subject to the GDPR.
It faces fines if found at fault
Over the last year, TikTok has attracted a lot of attention. Its video app has gained a huge following, but it has also been criticized for censoring creators, and has received bad press over viral challenges that have gotten people into trouble, some of which have been blamed for starting fires.
TikTok has announced a family safety mode that allows parents to control the content their kids are watching. Parents can link their accounts to their children’s, limit direct messaging, and apply content restrictions. TikTok will also let parents set screen-time limits for their teens, such as how long they can use the app each day.
TikTok has also had trouble dealing with sexual predators, according to a BuzzFeed report. An investigation found that TikTok was failing to take steps to protect its users from predators, many of whom were older men leaving creepy messages on the app. Users began turning to other social media platforms to warn one another about these predators.
A new report from the UK’s Information Commissioner has raised questions about the safety of TikTok’s data practices. According to the report, TikTok may have violated rules governing data privacy for children. The commissioner told a parliamentary committee on Tuesday that children’s data could have been misused by the company, and warned that TikTok could face fines of up to $22.6 million if found at fault. TikTok has responded to the concerns by stating that it is “deeply committed” to protecting its users and that it has “zero tolerance for child abuse or sexual exploitation.”
The Guardian reached out to TikTok for comment.