Social media platforms like Facebook, Instagram and TikTok will have to “tame” their algorithms to filter out or downgrade harmful material to help protect children, under newly published British proposals.
The plan by regulator Ofcom is one of more than 40 practical steps tech companies will need to implement under Britain’s Online Safety Act, which became law in October.
The platforms must also have robust age checks to prevent children from seeing harmful content linked to suicide...