Childnet is opposed to Meta’s move to make Messenger conversations end-to-end encrypted (E2EE), as well as its intention to take the same step with Instagram direct messaging. Meta outlined this intention several years ago and it remains controversial, even though E2EE services are popular, valued and provide a high level of privacy to their users.
Meta has been busy developing tools to make the move of these two services as safe as possible, and these measures will undoubtedly have an impact: there are additional ways to detect bad actors even on E2EE services, and existing tools to block and report, for example, remain available to users. However, when moving an existing popular service into an E2EE environment, it is essential that safety is at least matched in the new environment, and ideally improved, and our concern is that this shift falls below that mark. At Childnet, we are against moving these popular services to an E2EE environment because it weakens a key child protection mechanism relating to the dissemination of known child sexual abuse material (CSAM).
After the switch to E2EE, Meta will no longer be able to scan messages sent between users of these two services for known CSAM images. Meta’s ability to be alerted to this activity on its platforms has led it to make millions of reports of illegal image and video sharing to law enforcement, and this is at risk in the new E2EE environment.
There are several reasons this ability to scan for such content is important to retain, not least that the resulting reports to law enforcement enable the identification, rescue and support of victims and the apprehension of their abusers. Scanning for such content also makes it possible to support victims and survivors of online child sexual abuse by combatting the circulation and re-circulation of the visual record of their abuse and the re-victimisation that this causes. Any steps to prevent the wider circulation of this known illegal content must be taken.
Vital new services have been developed in recent years to do just this: to help child and adult victims concerned about existing indecent or illegal images of themselves being circulated on online platforms. Three brilliant services – Report Remove, Take It Down and Stop NCII – work in different ways to enable such victims to take steps towards closure, allowing online services to scan for, detect and prevent the circulation of this known abusive content. Meta has been a key player in the development of these important services and understands well the value of the support they provide, as they offer a vital degree of control to some extremely vulnerable people. The problem is that these services don’t work in E2EE environments, and so will no longer work on Messenger and Instagram DMs.