Urgent action is being demanded to stop sex offenders using social media to communicate with children, as new figures show a frightening rise in cases across Cumbria and the North East.

The NSPCC has obtained new figures revealing that police forces across the North East and Cumbria recorded 836 offences of sexual communication with a child between April 2017 and October last year.

The NSPCC has warned that the number of offences is “accelerating.”

Across England and Wales, more than 10,000 offences of sexual communication with a child have been recorded since the offence was brought into law in April 2017, following an NSPCC campaign to give police the power to intervene in cases of online grooming.

While the most recent figures obtained by the NSPCC run to October 2019, the charity stresses that the coronavirus lockdown could have led to a sharp increase in this offence, given the even more central role technology now plays in children's day-to-day lives.

In February, the Government committed to publishing an Online Harms Bill, but the NSPCC now says "frustration is growing" at delays in its publication, and is concerned that the online regulator the bill would create will not arrive until 2023.

The NSPCC’s chief executive Peter Wanless had scathing words for the social media industry, calling on the Prime Minister to “stand up to Silicon Valley” by ensuring the passage of an Online Harms Bill through Parliament within the next 18 months.

“Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids,” he said.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

Figures obtained by the NSPCC show that Facebook-owned apps, including Facebook Messenger, WhatsApp and Instagram, were used in 55 per cent of the recorded cases of sexual communication with a child between April 2017 and October 2019.

The NSPCC has set out a number of criteria it wants the new law to meet.

One is to impose a "duty of care" on technology companies to "identify and mitigate reasonably foreseeable risks on their platforms... to proactively protect users from harm."

The charity also wants to see the creation of an independent regulator with the powers to hand out fines for breaches of up to four per cent of a company's global turnover, and the power to hold named directors criminally accountable for the most serious breaches of their duty of care.