The Independent Inquiry into Child Sexual Abuse (IICSA) published its latest report on 12 March 2020. The report investigated how the internet has facilitated the online grooming and abuse of children and, importantly, what can be done to stop this from happening in the future. The Inquiry heard three weeks of evidence from law enforcement, the internet companies and people who had been groomed and abused online. Switalskis were the only solicitors representing victims of abuse.
The report noted that the scale of online-facilitated abuse is enormous, with millions of indecent images of children in circulation. An eye-watering statistic provided in evidence to the Inquiry was that BT alone found over 30,000 attempts to view child sexual abuse material online every 24 hours. What is particularly alarming is that the vast majority of images are not on the so-called 'dark web', but on services such as Facebook, Instagram and Snapchat. In addition, the NSPCC estimated that around 500,000 men in the UK have at some point used or viewed child sexual abuse images. The National Police Chiefs' Council Lead for Child Protection, Chief Constable Simon Bailey, told the Inquiry that reporting was increasing and that the seriousness of the images being viewed was getting worse; he spoke of how babies were sometimes the subject of abuse in some images. He was clear that it was simply not possible to arrest everyone and that something had to be done to stop these images from circulating in the first place.
The Inquiry report was critical of the technology industry, saying that at times its response was reactive and seemingly motivated by the desire to avoid reputational damage. It was said that the transparency reports prepared by the technology companies did not provide the full picture. The overall impression that Switalskis formed was that, because technology moves so quickly, new services were simply released without consideration of their wider impact, since pausing to do so would hand competitors an advantage.
There was a particular focus on age verification. Many of the technology companies which gave evidence said they had a policy whereby users must be at least 13 years of age. However, what became clear was that there was in fact nothing to prevent a user lying about their age. So an 11-year-old can claim to be 16 and, equally worrying, a 40-year-old can claim to be 20. The effect of this is that often it is not possible to know with certainty who a child is communicating with.
It also became clear in the course of evidence that the technology companies were not pre-screening material before it was uploaded to their websites. Material was screened only once it was already on the website, which allowed child sexual abuse images to be shared and circulated before they were removed. No clear reason was given for why images were not checked before upload, even though it was possible for this to be done.
The Police were also particularly concerned about the increasing move towards encryption, particularly end-to-end encryption. Encryption has obvious advantages in terms of privacy and security, but the Police said it makes their task more difficult when investigating online child abuse offences, leaving a gap which can be exploited by those wishing to view and distribute child abuse images online. The Inquiry was not able to provide any immediate recommendation or answer to this problem, but it is clear that a solution is needed which respects privacy while also allowing the Police to access communications which may assist in detecting and preventing crime.
There was, inevitably, a significant focus on technological solutions. What is probably less widely recognised is that there is an army of human moderators who review content online to check whether it is child abuse before removing it. Facebook, for example, employs over 15,000 moderators. What was not clear from the Inquiry was whether that is enough, and the Inquiry concluded that the technology companies need to understand the scale of the problem much better. Only once the scale of the problem is known, the Inquiry said, can it be established whether the resources devoted to it are adequate.
The Inquiry made four key recommendations:
- The Government should require industry to pre-screen material before it is uploaded to the internet;
- The Government should lead international cooperation to ensure that those countries hosting indecent images of children take stronger action to have them removed;
- The Government should introduce legislation to require more stringent tests to ensure that users are the age they say they are;
- The Government should publish the interim code of practice in respect of child sexual abuse, as proposed in the online harms white paper.
These are all sensible recommendations which should help to reduce the scale of the problem. However, there were no recommendations which would have held the technology companies to account for the damage caused to victims of this abuse. We asked the Inquiry to make the internet companies accountable, either by paying compensation or through a levy on income, to ensure that victims of abuse receive the help and compensation they deserve. There was a recommendation that the Criminal Injuries Compensation Scheme should be reconsidered to ensure that these offences are brought within the definition of a 'crime of violence', but that is a taxpayer-funded scheme. It would have been better if the Inquiry had concluded that the multi-billion-dollar internet companies should be accountable and responsible for such payments.