The government has claimed that its newly introduced Online Safety Bill will make the UK “the safest place in the world to be online”, but some have criticised the bill, warning that it doesn’t go far enough to combat things like cyber-flashing, child abuse or violence against women and girls.
The BBC reported that MPs said the bill’s definition of illegal content must be re-framed, and more should be done to define the risk around activities that fall below the threshold of criminality but still form part of the sequence for online abuse.
They also cautioned that violent images, including child abuse material, could still be shared through “breadcrumbing” – the practice of carefully editing images to subvert content moderation – so that pictures stay online.
Just a few weeks ago, the EU announced plans to make tech giants adhere to more formal rules, rather than the previously voluntary agreements, when it comes to detecting, reporting and removing child sexual abuse from their platforms. In fact, Comparitech recently released updated research showing just how large the problem has become.
The report, entitled “The rising tide of child abuse on social media”, shows how much content social media platforms flagged under “child nudity and sexual exploitation” in the first three quarters of 2021, compared with previous years.
Among the key findings were:
- Facebook flagged a staggering 55.6 million pieces of content under “child nudity and sexual exploitation” – 20 million more than 2020’s overall total of 35.6 million
- Instagram had 5 million pieces of content or accounts removed in the first three quarters of 2021
- TikTok rivalled Facebook for the most content removed, with 56 million
- Overall, reporting of content was up 19 million on 2020, with 126 million pieces reported in the first three quarters of 2021 compared with 107 million in the whole of 2020
“As of now, it’s up to social media platforms and messaging services whether or not they report or follow up on offences,” said Brian Higgins, security specialist at Comparitech. “With the first three quarters of 2021 already significantly topping previous years for exploitative content, it demonstrates a very large, potentially out-of-control problem. And because these companies aren’t mandated to notify or send data to prosecutors or police in the country an offender originates from, it places an enormous amount of pressure on not-for-profit organisations like the Internet Watch Foundation or NCMEC, which typically deal with these cases and are at risk of being severely under-resourced to deal with the scale of the problem.”
The UK’s Online Safety Bill remains in draft, and an update is expected within a few months.