
UK regulator demands tech firms shield children from harmful content – Times of India


LONDON: Social media platforms will be hit by fines of up to $22.5 million unless they take action to ensure their algorithms do not direct children towards harmful content, the UK communications regulator Ofcom said Wednesday.
A new British online safety law imposes new legal responsibilities on platforms to protect children, and Ofcom published a draft code of practice that establishes how they should meet them.
“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Ofcom chief executive Melanie Dawes.
“They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age,” she added.
The report outlines 40 practical measures that will deliver a "step-change in online safety for children", Dawes said.
“Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account,” she warned.
The new measures are due to come into force next year, with rule-breakers facing fines of up to £18 million ($22.5 million) or 10 percent of their revenue.
Dawes said rogue platforms would be "named and shamed", and could even be banned from use by children.
As well as robust age-checks, the code requires firms to put in place content moderation systems and ensure harmful content is removed quickly.
Peter Wanless, chief executive of children’s charity the NSPCC, called the draft code a “welcome step in the right direction”.
“Tech companies will be legally required to make sure their platforms are fundamentally safe by design for children when the final code comes into effect,” he added.
