A new initiative aims to stop the rise in child sex abuse online.

Police report an 84% increase in online grooming crimes, which is why Natterhub is launching its #HaveTheConversation campaign.



Main Highlights:

Natterhub, an online safety platform, has started a campaign to combat the growing problem of child sex abuse online.

Following a report from the Internet Watch Foundation (IWF) stating that instances of abuse imagery captured via a webcam have increased by 374% since the pandemic, the initiative urges parents and schools to “Have the conversation before someone else does.”


Natterhub: What is it?

Natterhub is a secure, password-protected social media site.

Developed with online safety in mind, Natterhub offers more than 350 interactive lessons that teach pupils what they need to know about using screens: how to cultivate positive screen behaviours, manage their well-being, and develop digital resilience. It is designed for children in school Years 1 to 6.

Natterhub’s weekly lessons and assessment programme offer a thorough, enjoyable, interactive solution for your school’s Online Safety, Media Literacy, and Digital Intelligence Development, with a scheme of work aligned to the entire RSE curriculum and the UKCIS Education for a Connected World.

Through its social media-style platform, students can engage with the lessons, a badge system, and quizzes on fake news.

The IWF took action to remove 38,424 URLs in 2019 that contained images or videos of “self-generated” content; by 2021, that number had increased to 182,281.

These images of child sexual abuse now make up nearly a third (32%) of all the online content the IWF works to remove.

The charity's figures show that young girls are particularly at risk: they made up 60% of the children depicted in child sexual abuse imagery in 2011, and that figure now stands at 97%.

The number of online grooming crimes reported by the police has increased by 84% in four years, according to statistics released by the NSPCC in July of this year, which show a similarly alarming rise in internet-based abuse.

“This situation ought to be completely avoidable if we can help children have timely and interesting conversations.” – Susie Hargreaves, IWF

Susie Hargreaves, CEO of IWF, which is supporting Natterhub’s initiative, said, “Child sexual abuse, which is facilitated and captured by technology using an internet connection, does not require the abuser to be physically present, and most often takes place when the child is in their bedroom – a supposedly safe space in the family home.”

“This situation should be completely avoidable if we can encourage timely and interesting conversations with children, at school and at home.”


The #HaveTheConversation campaign

Hence the campaign, which draws on a collection of sobering, real-life stories about the online sharing of self-generated content.

The platform is also asking schools to point parents of students to a free online safety workshop on October 6 at 7 p.m., during which they can learn more about the #HaveTheConversation campaign.

As we mentioned in our previous article, Natterhub has started a programme called Cybersmart in Seven Minutes to help give primary school-aged kids the tools and skills they need to navigate the internet safely.

Manjit Sareen, co-founder and CEO of Natterhub, stated that “the reality is that parents are unaware [that potential abuse is happening in their own homes] and children are not equipped with the knowledge and support they need to navigate these situations.”

Given the increase in this kind of activity, these conversations need to happen now.

Alongside the campaign, proposed new EU regulations will help protect children from further abuse, prevent abusive material from reappearing online, and bring offenders to justice.

These rules will include:

Mandatory risk assessment and mitigation: providers of hosting or interpersonal communication services must assess the risk that their services will be misused to disseminate child sexual abuse material or to groom children, and must propose measures to mitigate that risk.

Targeted detection obligations, based on a detection order: Member States must designate national authorities responsible for reviewing the risk assessments. Where these authorities determine that a significant risk remains, a court or independent national authority may issue a detection order for known or new child sexual abuse material or for grooming.

Detection orders are time-limited and target a specific type of content on a specific service.

Clear reporting obligations: Providers who detect online child sexual abuse must report it to the EU Centre.

Effective removal: if child sexual abuse content is not removed swiftly, national authorities can issue removal orders. Internet service providers will also be required to block access to images and videos that cannot be taken down, for example because they are hosted outside the EU in non-cooperating jurisdictions.

Strong detection safeguards: companies served with a detection order may identify content only using indicators of child sexual abuse that have been verified and supplied by the EU Centre. Detection technologies may be used solely for the purpose of detecting child sexual abuse.

In line with the current state of the art in the industry, providers must use technologies that are the least privacy-intrusive and that minimise the rate of false positives.

Reducing exposure to grooming: under the rules, app stores must ensure that children cannot download apps that pose a significant risk of being used for the solicitation of children.

Strong regulatory framework and legal recourse: Courts or independent national authorities will issue detection orders.

To reduce the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse submitted by providers before passing them on to law enforcement authorities and Europol. Both providers and users will be guaranteed the right to challenge in court any decision affecting them.