Switalskis Solicitors have recently represented three victims of on-line sexual abuse at the Independent Inquiry into Child Sexual Abuse (IICSA). The Inquiry was set up in 2015 to investigate institutional failures to protect children from sexual abuse, and in May 2019 a two-week hearing took place at which evidence was heard about how the internet is used to facilitate the grooming and abuse of children.
The Inquiry heard evidence from our clients, known as A1, A2 and A3, about the abuse that they suffered on-line, and also from the mother of two of them. They told the Inquiry how they had been groomed while using the internet and how this led to more serious offences taking place. They spoke about the impact this had on them and their families, and were also asked what suggestions they had for improving child safety on-line.
This provided the backdrop for the Inquiry to take evidence from the large technology companies such as Apple, Facebook and Google. They were questioned over four days about the systems they have in place to protect children, the resources they devote to this, and how they work with law enforcement to help trace and prosecute offenders. In the second week of the hearing, evidence was taken from the National Crime Agency, as well as the Internet Watch Foundation and other law enforcement agencies.
In representing victims of on-line abuse, it was our task to hold all those giving evidence to account, to identify why there has been such a large increase in on-line grooming and abuse, and to establish what can be done to better protect children.
We identified that there was no true understanding of the extent of the problem. While all of the available data suggested significant increases year on year, none of the technology companies could produce accurate data showing the actual scale of the abuse. This lack of reliable information makes it very difficult to know how best to tackle the problem.
We exposed a tension between the technology companies' desire to create and develop new products, markets and customers as quickly as possible on the one hand, and the need to create safe platforms on the other. This was most clearly highlighted when Facebook told the Inquiry that it has a strict policy of not allowing anyone aged under 13 to have a Facebook account, but under closer scrutiny it emerged that in reality there was nothing to prevent a child under that age claiming to be 13 and creating an account.
We also established that, despite many of the largest technology companies measuring their profits in the tens of billions, they did not have a dedicated budget to fight on-line abuse on their platforms. The National Police Chiefs' Council suggested that all technology companies should ring-fence a proportion of their profits to pay directly for on-line safety measures, and also for therapy and counselling for those affected by such abuse.
We formed the impression throughout the Inquiry that the priority the large technology companies give to on-line child sexual abuse is too low. The impression was that they were most concerned with increasing their user numbers as quickly as possible. We felt it was not in the wider public interest for social media platforms to create huge unregulated networks of individuals throughout the world when there is such a large risk that children will be exposed to paedophiles looking to groom and exploit them. It was also of concern that they were increasingly using end-to-end encryption for their services, which was said to be a significant problem for law enforcement when investigating complaints of grooming and sexual abuse.
Our view was that too much reliance was placed on self-regulation and artificial intelligence to solve the problem. We felt that too few people and resources were deployed to tackle such a large problem that has devastating consequences for those affected.
We suggested to the Inquiry that the time had come to regulate the internet, specifically in relation to child protection, grooming and indecent images of children. There is a wider debate about regulating the internet in terms of, for example, free speech and privacy. But that is not a debate which needs to be had about obviously illegal activity. Quite simply, the internet companies need to do much more to prevent their platforms being abused, and since self-regulation has failed they must be compelled to act by legislation and regulation. We submitted to the Inquiry that one important aspect of any regulation is that safety must be central to all internet companies, and that they should not be able to operate without tried and tested protective technology at the very heart of all that they do. Like any company setting up a business that will attract children, they should be required to carry out risk assessments of their business. There must also be significantly more information available about what is happening on the internet and the risks posed to children. Finally, systems must be put in place to verify the age of all users; it will be impossible to protect children if we cannot at the very least know whether an adult is communicating with a child.
We also told the Inquiry that technology companies should face civil liability for failing to keep their platforms and the services they provide safe. This would not only give victims the financial redress they deserve; the financial consequences would push the industry to invest more heavily in its safety systems. There is now so much money to be made that a major overhaul of the way the internet is regulated is needed, and that must in part include financial incentives.
The Inquiry will consider all submissions and report back with recommendations in the first quarter of 2020.
Anyone who has been the victim of on-line abuse and wishes to speak in confidence should call 01924 882000 and ask to speak with David Greenwood or Kieran Chatterton.