HUNDREDS of online sexual messages have been sent to children in the North-East since a law was introduced making it illegal.

The NSPCC says police forces in the North-East and Cumbria recorded 836 offences of sexual communication with a child between April 2017 and October 2019, with 23.4 per cent of those taking place in the six months up to October last year.

The charity has warned that there could be a sharper increase this year due to the "unique threats" caused by coronavirus, and is calling for a new law putting a duty of care on tech companies like Facebook and WhatsApp.

Chief executive Peter Wanless said: “Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.

“Last week the Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety. He can do this by committing to an Online Harms Bill that puts a legal Duty of Care on big tech to proactively identify and manage safety risks.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

Nationally, 10,119 offences of sexual communication with a child were recorded by police in England and Wales in the two and a half years since the law came into force.

The NSPCC says Facebook-owned apps were used in 55 per cent of the cases where police recorded information about how a child was groomed.

Where the means of communication were provided, there were more than 3,200 instances of Facebook-owned apps (Facebook, Facebook Messenger, Instagram and WhatsApp) being used, of which half involved Instagram. Snapchat was used over 1,060 times.

Emily, whose name has been changed to protect her identity, was 13 when she exchanged messages and photos on Facebook and Snapchat with a man she believed to be 15. The man turned out to be 24 and sexually abused her.

Her mum said: “It’s important for social media to be regulated and for Facebook and Instagram to take more responsibility to keep the people who use their platform safe.

"All other businesses have a duty of care to keep children safe, so why not them?”

What the NSPCC says the Online Harms Bill should do:

• Enforce a Duty of Care on tech companies to identify and mitigate reasonably foreseeable risks on their platforms, including at the design stage, to proactively protect users from harm

• Create a regulator that can hand out GDPR-equivalent fines – up to 4% of global turnover – and hold named directors criminally accountable for the most serious breaches of their Duty of Care

• Give the regulator robust powers to investigate companies and request information

• Create a culture of transparency by legally compelling tech firms to disclose any breaches of the Duty of Care and major design changes to their platforms