
Social media companies must ‘tame toxic algorithms’ under new measures to protect children

British tech companies have been told to “tame toxic algorithms” and take practical steps to keep children safe online.

The requirement forms part of Ofcom’s new draft measures, the Child Safety Code of Practice, which social media sites, apps and search engines must adhere to.

Ofcom is the government-approved regulatory and competition body for the UK broadcasting, telecommunications and postal industries.

One of the first measures listed is age checks, with greater use of effective age assurance required. Any content that promotes suicide, self-harm, eating disorders, or pornography is classified as harmful.

Dangerous challenges, harmful material, incitement to hatred against people with certain characteristics, instructions for serious violence, and real or serious violence against people or animals are also classified as harmful under the UK Online Safety Act.

This will affect all services that do not already ban such content, as they will now need to implement age checks to prevent children from seeing it.

Dame Melanie Dawes, chief executive of Ofcom, said the move goes “well beyond current industry standards” but aims to “bring about significant change to the online safety of UK children.

“We want children to enjoy life online. But for too long, their experience has been shaped by seriously harmful content that they cannot avoid or control. Many parents are frustrated and worried about how to keep their children safe. This has to change.”

She added that regulators “will not hesitate to use the full range of our enforcement powers to hold platforms accountable.”

The draft also includes measures to ensure strong accountability for child safety within tech companies, including naming a person specifically responsible for complying with child safety duties.

OnlyFans is under investigation by Ofcom

The measures come just days after Ofcom announced an investigation into OnlyFans on 1 May.

The regulator is examining whether the company is doing enough to prevent children from accessing pornographic content on its site.

While OnlyFans does have age measures in place, Ofcom said it had “reviewed the submissions we received” and had “reasons to suspect that the platform has not implemented age verification measures to adequately protect under-18s from sexually explicit material”.

An update on the investigation is expected to be released in due course.

