Moves for criminal sanctions of social media firms as online sex crimes against Scots children surpass five a day

MOVES are being made for criminal sanctions to regulate social media as official figures revealed online sex crimes recorded against children in Scotland during lockdown have surpassed five a day.

Child protection charity NSPCC has laid out six tests the Government’s regulation of social media will be judged on if it is to achieve bold and lasting protections for children online - including holding tech firms such as Facebook, Twitter and Instagram to account with criminal and financial sanctions.

The charity says that government proposals will see companies set their own rules on legal but harmful content but the NSPCC says that is not good enough.

It says the law must "compel" firms to respond to the harms caused by algorithms targeting damaging suicide and self-harm posts at children.

It said: "The danger of harmful content should rightly be balanced against freedom of expression, but focus on the risk to children."

The report comes as Scottish police figures revealed a 20% rise in online sex crimes against children in Scotland between April and June.

There were 530 crimes recorded in the first quarter of this year compared with 438 in the same quarter in 2019.

The NSPCC says police-recorded crime statistics only show the number of reported crimes and do not show "the whole picture", so the true scale of abuse is feared to be much higher.

The charity’s How To Win The Wild West Web report, released today, sets out how the upcoming Online Harms Bill must set the global standard in protecting children on the web.

In expressing concern over a surge in online sex crimes involving children in Scotland, it urged the UK Government to ensure it levels the playing field for children and that new laws finally force tech firms to tackle the avoidable harm caused by their sites.

The pandemic is likely to result in long-term changes to the online child abuse threat, with high-risk live-streaming and video chat becoming more popular, the campaign group says. Changes to working patterns, meaning more offenders working at home, could result in a greater demand for sexual abuse images and increased opportunities for grooming.

The NSPCC has routinely highlighted the growing levels of abuse and harm caused to children on social media platforms, and believes the problem has been exacerbated by the fallout from coronavirus.

At the UK Government’s Hidden Harms summit earlier this year, the Prime Minister signposted his determination to legislate for ambitious regulation that successfully combats child abuse.

But the NSPCC says it is worried the landmark opportunity to change the landscape for children online could be missed if this is not translated by the UK Government into law.

It said the six tests also involve the creation of an "expansive, principles-based duty of care" and putting legal but harmful content on an equal footing with illegal material.

It said the regulation should lead to robust transparency and investigatory powers. The charity believes that, if done correctly, regulation could set a British model that "leads the world in child protection online".

But in a stark warning, NSPCC chief executive Peter Wanless said that "failing to pass any of the six tests will mean that rather than tech companies paying the cost of their inaction, future generations of children will pay with serious harm and sexual abuse that could have been stopped".

Mr Wanless added: “Industry inaction is fuelling sex crimes against children and the fallout from coronavirus has heightened the risks of abuse now and in the future.

“The Prime Minister has the chance of a lifetime to change this by coming down on the side of children and families, with urgent regulation that is a bold and ambitious UK plan to truly change the landscape of online child protection.

“The Online Harms Bill must become a Government priority, with unwavering determination to take the opportunity to finally end the avoidable, serious harm children face online because of unaccountable tech firms.”

The six tests are backed by Ian Russell, who has campaigned for regulation since the death of his daughter, Molly, by suicide, after she was targeted with self-harm posts on social media.

Molly, 14, from Harrow, North West London, killed herself in 2017 after viewing graphic content on Facebook-owned Instagram and Pinterest.

Mr Russell, who is due to be made an honorary member of council for the NSPCC this week, said: “Today, I can’t help but wonder why it’s taking so long to introduce effective regulation to prevent the type of harmful social media posts we now know Molly saw, and liked, and saved in the months prior to her death.

“Tech self-regulation has failed and, as I know, it’s failed all too often at great personal cost. Now is the time to establish a regulator to protect those online by introducing proportionate legislation with effective financial and criminal sanctions.

“It is a necessary step forward in trying to reclaim the web for the good it can do and curtail the growing list of harms to be found online.”

In January, Mr Russell said his daughter had entered a “dark rabbit hole of depressive suicidal content” and claimed the algorithms used by some online platforms “push similar content towards you” based on what you have been previously looking at.

Instagram said that, between April and June 2019, it removed 834,000 pieces of content, 77% of which had not been reported by users.

Mr Russell urged parents to speak with their children about what they are viewing online and how they are accessing it.

In May, after Boris Johnson hosted a Hidden Harms summit, the government said representatives from government, law enforcement, victims’ charities, front line practitioners and the private sector "will drive forward action" to support victims of crimes such as domestic abuse, sexual violence, child sexual abuse and modern slavery.

Around £10 million was to be given to boost the National Crime Agency’s ability to tackle paedophiles operating on the dark web and a further £3.36 million was allocated to projects to understand the threat posed by the most serious criminals.

This followed what the UK government described as an "unprecedented" £76m in extra funding for vulnerable people from the government’s £750 million package of support for charities.

This included £34.1 million to safeguard vulnerable children.

Prime Minister Boris Johnson said after the summit: "I am acutely aware that for some people home is not a safe space, and that coronavirus has brought with it additional dangers.

"Just as I am committed to tackling the virus, we have to support the most vulnerable and keep them safe from harm and exploitation. That is why it is vital that we come together and bring all our collective expertise to ensure we are doing everything we can to support those at risk, and to help them rebuild their lives."

A Home Office spokesman said: "Child abuse is a truly sickening crime and the government shares NSPCC’s concerns around online child safety.

“We have hosted the Hidden Harms summit, collaborated with Five Eyes partners and convened a global conference to drive the response as well as invested in law enforcement and boosted funds to charities, including the NSPCC which received £1.6 million towards its helpline.

“We are firmly committed to making the UK the safest place in the world to be online and will introduce Online Harms legislation when parliamentary time allows."
