Online platforms have banded together to raise concerns about the pending collapse of EU regulations relating to the monitoring and reporting of child sexual abuse content, with the European Parliament and European Council unable to come to an agreement on how to manage the process.
Under the current regulations set out in the EU ePrivacy Directive, online platforms are allowed to scan their services for CSAM. Some view this as a violation of a user’s right to privacy, because without due cause, this type of scanning can lead to expanded monitoring, which runs counter to the EU push for enhanced privacy.
As a result, regulatory groups have failed to come to an agreement on whether platforms should be allowed to continue to voluntarily monitor their services in this way. The current laws are set to expire on April 3, at which point platforms will lose the legal basis for the systems they have been using to detect CSAM.
That has prompted a coalition of online platforms to call for a solution that would avoid restricting their CSAM detection efforts.
In a joint statement on March 19, Google, LinkedIn, Snapchat, Meta, Microsoft and TikTok called on EU regulators to establish a better way forward, or agree to an extension of the current regulations.
As per the statement: “As technology companies, we are deeply concerned about the breakdown in EU negotiations to secure the continued protection of minors against child sexual abuse. Allowing the legal basis of the ePrivacy derogation to expire on April 3, in place since 2021, is irresponsible. It must be extended.”
Failure to act would “reduce the legal clarity that has enabled companies for nearly 20 years to voluntarily detect and report known child sexual abuse material (CSAM) in interpersonal communication services,” per the statement. In addition, the companies said it would leave children around the world with fewer protections.
“Voluntary detection of CSAM through hash matching is an established tool central to law enforcement investigations,” the statement said, adding that “[l]ong-standing industry-wide hash matching utilizes irreversible digital fingerprinting to identify known CSAM. By matching these unique hashes against a secure database of previously identified material, the system ensures high-precision detection while adhering to privacy principles.”
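The hash-matching process the statement describes can be illustrated with a minimal sketch. This is a simplified, hypothetical example using an exact cryptographic hash (SHA-256); production systems such as Microsoft’s PhotoDNA instead use perceptual hashes that survive resizing and re-encoding, but the principle is the same: an irreversible fingerprint of the content is compared against a database of fingerprints of previously identified material, so the scanner never needs to inspect or retain the content itself.

```python
import hashlib

# Hypothetical database of fingerprints of previously identified material.
# In practice this would be a secure, vetted database maintained by
# organizations such as NCMEC; the sample values here are placeholders.
KNOWN_HASHES = {
    hashlib.sha256(b"known-sample-1").hexdigest(),
    hashlib.sha256(b"known-sample-2").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute an irreversible digital fingerprint of the content.

    The hash is one-way: it reveals nothing about the underlying file,
    which is how the scheme limits its privacy impact.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_material(upload: bytes) -> bool:
    """Match the upload's fingerprint against the known-hash database."""
    return fingerprint(upload) in KNOWN_HASHES
```

Note that an exact hash only matches byte-identical files; a single changed pixel produces an entirely different digest. That limitation is why deployed systems rely on perceptual hashing, which tolerates minor transformations while still operating only on fingerprints rather than raw content.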
The platforms urged EU lawmakers to “swiftly agree” on a way forward for voluntary CSAM detection, noting that “failure to do so would be irresponsible.”
Last Monday, Euractiv reported that the EU Council proposed an extension of the current rules for another two years, in order to allow more time for a permanent framework to be finalized. However, the proposal failed to gain the required agreement.
And with limited time remaining, EU regulators need to act quickly to establish a way forward before the legal basis for the current systems lapses.