TikTok acts on unhealthy eating videos

Image caption: TikTok has faced scrutiny over how some of its short videos affect younger viewers' wellbeing (image: Getty Images)

TikTok has announced changes to its community guidelines, aimed at cracking down on content promoting "disordered eating".

It will remove videos which encourage over-exercising and short-term fasting.

It will also expand its system that detects and removes videos that include adult nudity, illegal activities or risk the safety of minors.

The changes attempt to address concerns raised by politicians and regulators.

Social media platforms have been under increased scrutiny over their approaches to the wellbeing and safety of younger users in recent months.

The UK government plans to introduce new legislation, the Online Safety Bill, which would allow huge fines to be imposed on social media platforms deemed not to be doing enough to crack down on harmful content.

And the Chinese-owned video app faced questions in the US in October over platform safety. Senators at the hearings suggested that eating disorder content was prevalent on the platform.

Eating patterns

In September, TikTok reported that around one billion people were using the app each month.

It said that of the 91 million videos removed during the third quarter of 2021, 88% were taken down before anyone had viewed them.

Content promoting hateful ideologies, including misgendering, misogyny and the promotion of conversion therapy, was already prohibited, but TikTok says it has now updated its community guidelines to make this explicit.

TikTok says it has expanded its approach to target videos promoting broader disordered-eating content as well.

"We understand that people can struggle with unhealthy eating patterns and behaviour without having an eating disorder diagnosis," the company said in a post on its website.

"Our aim is to acknowledge more symptoms, such as over-exercise or short-term fasting, that are frequently under-recognised signs of a potential problem."

TikTok said it is also developing a system to identify certain types of content and restrict its teenage users from accessing them.

The company is currently testing ways in which its users can tag their own content, depending on the age of the audience it's aimed at.

The company will open cyber-incident monitoring and investigative response centres in Washington, Dublin and Singapore this year.

The move forms part of its expanded effort to prevent unauthorised access to TikTok content, accounts, systems and data.