Hollywood actor Ashton Kutcher sees the protection of children in the EU at risk and is making an urgent appeal: he wants the filters used to track down photos and videos of abused children on the Internet to remain in use, despite data protection concerns. “These children who are abused, who are sexually abused, and whose content is spreading on the Internet, they also deserve privacy,” Kutcher said in a recent video on Twitter. In 2012 he co-founded the child protection organization Thorn.
Voluntary scanning for depictions of abused children
So far, some US companies such as Facebook, Microsoft and Google have voluntarily scanned messages sent via their services for depictions of abused children. The services search for a kind of digital fingerprint that already known photos and videos carry. But the use of these filters could soon be banned in the EU.
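The fingerprint matching described above can be illustrated with a small sketch. Note that PhotoDNA itself is a proprietary *perceptual* hash designed to match images even after resizing or re-encoding; the toy code below uses an exact cryptographic hash and invented sample data purely to show the workflow of comparing fingerprints of uploads against a database of known material.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the file's 'digital fingerprint'.

    Simplification: SHA-256 only matches byte-identical files, unlike
    the robust perceptual hashing a real scanning system would use.
    """
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of already known illegal images.
known_hashes = {fingerprint(b"known-image-bytes")}

def is_known_image(upload: bytes) -> bool:
    """Flag an upload if its fingerprint matches a known entry."""
    return fingerprint(upload) in known_hashes

print(is_known_image(b"known-image-bytes"))  # matches -> True
print(is_known_image(b"harmless-photo"))     # no match -> False
```

The key design point is that only fingerprints of *already known* material are compared; the system does not interpret the content of a message itself, which is why detecting “grooming” (discussed below) would require far deeper inspection.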
The EU states have to implement the new European Electronic Communications Code by December 21st. Among other things, it defines which services are covered by digital confidentiality of communications; in future these will also include Facebook Messenger and Google’s e-mail service. Facebook, for example, would then no longer be allowed to scan its Messenger users’ communications with the PhotoDNA program as it has done so far.
Facebook and Co. are a great help
EU Home Affairs Commissioner Ylva Johansson therefore warns that there would then no longer be any obstacles for paedophiles to upload and share images, as she told the German Press Agency. That is why the Commission proposed a temporary solution in September that would allow the filtering to continue.
For Facebook and Co. are a great help to investigators. According to the Federal Criminal Police Office (BKA), “most of the references to files with child pornographic content” come from the US National Center for Missing and Exploited Children. The center works “with American Internet providers and service providers such as Facebook, Microsoft, Yahoo or Google, who use the latest filter technologies to permanently scan their databases and the data distributed via their services for images of abuse.” In 2019 alone, the BKA is said to have received more than 62,000 such reports, which resulted in around 21,600 cases.
Filter use for another five years
Facebook uses the PhotoDNA program in all of its apps “to find known child abuse material and delete it quickly,” a spokesman said when asked.
So will investigators soon have to forgo tips from US corporations because of the digital confidentiality of correspondence? Not if the EU Commission has its way. The Brussels authority wants companies to be able to keep using their filters for another five years. In addition, so-called “grooming”, the approach of adults to children via the Internet, is also to be tracked down. “In my opinion, as adults we have an obligation to protect children from sexual exploitation online,” says Johansson. The EU states have agreed, among other things, to temporarily allow the use of the filters. But the European Parliament also has to agree, and it has reservations.
Temporary solution – without “grooming”
In principle, the SPD MEP Birgit Sippel, who is in charge of the issue in Parliament’s Interior Committee, is in favour of an interim solution. Child sexual abuse is a serious crime that justifies restrictions on other fundamental rights, she says. However, these restrictions would have to be legally sound and proportionate. Sippel therefore calls for safeguards such as the possibility of lodging a complaint if one’s account has been wrongly blocked. In addition, the interim solution should be limited to one year, and the companies would have to report regularly on their work.
In addition, “grooming” should be removed from the law, because detecting it would require not just comparing digital fingerprints but reading the entire communication between users.
Prevention instead of surveillance
The European Data Protection Supervisor also has concerns about the EU Commission’s proposal. And Alexander Hanff, himself a victim of abuse, is against it: the proposal enables the monitoring of all private communication, he recently wrote on LinkedIn. Moreover, there is no evidence that the measures are actually effective and that the activity is not simply pushed underground, where it is even harder to detect. Instead, far more preventive work should be done.
“A badly written law would very likely end up before the European Court of Justice and thus create no legal certainty. And that would not help anyone – not the children, not the parents and not the authorities and providers,” says Sippel.
The privacy of child victims
EU Commissioner Johansson counters: “I will never accept that the privacy of users is more important than the privacy of child victims.” She also defends possible action against “grooming”: after all, the tools used look only for certain indicators of possible child abuse.
Julia von Weiler of the child protection organization Innocence in Danger says she understands the concerns of privacy advocates. But from her point of view it is completely incomprehensible that existing, tried-and-tested means should suddenly become illegal because legislators were not careful. “We are taking the perhaps naive but efficient position of maintaining the status quo until a permanent solution has been agreed.” The dignity of the children already affected weighs heavily enough for her to say: “These filters are acceptable.”
Obligation to Scan for Child Abuse Content
Because the EU Commission’s proposal did not come until September, time for the interim solution is running out. Parliament’s Interior Committee wants to adopt a position by the beginning of December, and the plenary could then vote on it in the middle of the month. After that, Parliament and the EU states still have to agree on a common line.
The EU Commission is already working on a permanent solution, to be presented in June 2021. According to Johansson, online services should in future even be required to scan content for known depictions of child abuse and to report it.