Lax security at TikTok puts children's privacy, safety at risk: report

TikTok seriously lacked security measures affecting users whose videos were moderated by the social media company’s Dutch and German teams, NRC reports. The office was notorious for drug and alcohol use by moderators on duty, as well as their supervisor, and private information, including deleted videos, was easily accessible at their WeWork shared space in Berlin.

Friends and family of the moderators were also welcome at the office and were allowed to view private user data with little or no supervision, according to the newspaper.

“I would never allow my child to use TikTok,” one former employee told NRC. There are some 700,000 users in the Netherlands on TikTok, which focuses on children from 10 to 12 years of age.

One former staffer told NRC that grooming of potential victims by pedophiles had become a rampant issue for moderators. A mounting workload and growing internal pressure have made it harder and harder for moderators to block contact between adults acting inappropriately, or possibly unlawfully, and the site’s child users, whose accounts are public by default. Temporary bans are possible, but permanent bans and account deletions are decisions ultimately made at TikTok headquarters in China.

TikTok told NRC it was addressing some of those issues at a new office in Dublin, dubbed the Trust and Safety Center. Protecting its young user base “has the highest priority,” the platform said. Its inadequate approach has drawn the ire of Watch Nederland, an organization that works to stop child sexual abuse, assault and exploitation, as well as law enforcement in Belgium.

The platform did not state whether staffing increases and changes were necessary to better address the issue. “TikTok does not tolerate sexually inappropriate behavior toward children or grooming,” the platform stated, according to the newspaper.

Censorship has also been a problem on the platform, with videos from users with mental or physical disabilities frequently blocked or hidden from the public, NRC wrote. Videos about Sinterklaas, Zwarte Piet, or blackface could only be seen in the Netherlands, for example.

Videos relevant to the LGBT community were also heavily restricted geographically, even if a user in a video was simply wearing rainbow-colored jewellery.

TikTok said it used to hide users from the public whom it deemed more at risk of bullying. It said its policy was blunt in the past, but has since been revised.

Former workers allege that the security problems at TikTok, owned by Chinese firm ByteDance, developed from a fortune-seeker mentality that pushed user growth over safety and security, according to NRC. Promised exciting jobs with developing career opportunities, employees ended up reviewing about a thousand videos a day, choosing either to allow them to be viewed publicly, hide them from everyone but the uploader, or censor them regionally.