UK tells platforms to 'tame algorithms' to protect kids

By Paul Sandle
Updated May 8 2024 - 8:35pm, first published 8:30pm
Regulators in the UK want social media platforms to filter out harmful content to protect children. (AP PHOTO)

Social media platforms such as Facebook, Instagram and TikTok will have to "tame" their algorithms to filter out or downgrade harmful material to help protect children under proposed British measures.

The plan published by regulator Ofcom on Wednesday is one of more than 40 practical steps tech companies will need to implement under Britain's Online Safety Act, which became law in October.

The platforms must also have robust age checks to prevent children from seeing harmful content linked to suicide, self-harm and pornography, the regulator said.

Ofcom chief executive Melanie Dawes said children's experiences online had been blighted by harmful content they could not avoid or control.

"In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms," she said.

"They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that's right for their age."

Social media companies use complex algorithms to prioritise content and keep users engaged.

However, because these algorithms amplify similar content, children can be exposed to increasing amounts of harmful material.

Technology Secretary Michelle Donelan said introducing the kind of age checks that young people experienced in the real world and addressing algorithms would bring about a fundamental change in how children in Britain experienced the online world.

"To platforms, my message is engage with us and prepare," she said.

"Do not wait for enforcement and hefty fines - step up to meet your responsibilities and act now."

Ofcom said it expected to publish its final Children's Safety Codes of Practice within a year, following a consultation period that ends on July 17.

Once the codes are approved by parliament, the regulator said it would begin enforcement, backed by measures including fines for non-compliance.

Australian Associated Press