Child Protection vs Privacy. Do the ends justify the means?

ABBREVIATIONS.

CSAR – Child Sexual Abuse Regulation

CSAM – Child Sexual Abuse Material

AI – Artificial Intelligence

EU – European Union

ePD – EU Directive on Privacy and Electronic Communications 2002 (ePrivacy Directive)

Charter – Charter of Fundamental Rights of the European Union

ECHR – European Convention on Human Rights

GDPR – General Data Protection Regulation

ECJ – European Court of Justice

CSEA Directive – Child Sexual Exploitation and Abuse Directive

SMEs – Small and medium-sized enterprises

E2EE – End-to-end encryption

CSS – Client-side scanning

UN – United Nations

I.         Introduction.

In May 2022, the Commission proposed the final version of the Child Sexual Abuse Regulation (henceforth referred to as the “CSAR” or the “Regulation”) to combat the dissemination of child sexual abuse material (“CSAM”) online. The proposal triggered a heated discussion among legal scholars and technology experts, as it was expected to employ AI-powered tools to detect unknown CSAM and “solicitation”/“grooming” material.

On the one hand, the EU is determined, through the new law, to adopt more effective measures to stop the possession, distribution and hosting of CSAM on digital platforms and online communication services. On the other hand, privacy advocates argue that the new Regulation will have disproportionately far-reaching effects, subjecting private communications to mass surveillance to the point that no one will feel safe communicating online. Throughout this paper, the author will therefore try to answer whether the proposed measures are proportionate to their stated aim.

II.         Organisation and Disposition.

In order to answer the main research question, we will use the legal dogmatics method, which allows us to study the normative legal material in depth. In other words, we will analyse not only the specific articles of the draft law but also other important related legal documents, in order to determine whether the new Regulation would conflict with any of them.

For these purposes, in Chapter I, we will briefly summarize the main legal documents relevant to the scanning of electronic communications. For the sake of clarity, those legal texts will be presented in chronological order.

Chapter II will be dedicated to the scope of the new Regulation. In this part, we will analyse the main actors of the proposed law, who will bear substantial obligations under the CSA Regulation. This will also enable us to answer whether the scope of the covered digital platforms and services is proportionate to the draft law’s stated aim.

In Chapter III, the analysis of content will unpack the issue of false alarms and their likelihood, since more errors mean more unnecessary intrusion into people’s private lives.

Chapter IV is where we start analysing the obligations that digital service providers must undertake in order to comply with the CSA Regulation. In this part, we will try to put ourselves in the lawmakers’ shoes and interpret what those obligations imply in reality. We will focus in particular on the main articles and try to reveal the problems associated with them.

In Chapter V, we will discuss the effects of the new Regulation on end-to-end encryption (“E2EE”) and the possible knock-on effects on society. The conclusion will follow Chapter V, in which we will answer: given all the shortcomings of the new law, is it still a good idea?

III.         How did the EU end up scanning private communications?

In 2021, the Ministers of the EU Member States agreed with the European Parliament to adopt a temporary derogation from the 2002 ePrivacy Directive (“ePD”).[1] The purpose of the derogation was to enable providers of electronic communications networks to scan and filter everybody’s private and personal messages, instead of limiting these activities to certain groups of suspects. The derogation sparked a heated debate among legal scholars, as it would result in the mass surveillance of online messaging and pose a great risk to people’s fundamental rights by placing everyone under suspicion.

It was initially planned that this exception would expire in 2024, by which time the EU would have come up with a permanent solution. Given the increasing volume of child sexual abuse material (CSAM) circulating via private messaging and the pressure to prevent it, the EU aimed to create room for e-communication companies to manoeuvre and detect such content. Contrary to what was expected, however, the EU opened the path for these companies to scan all messages sent and received, rather than restricting this conduct to cases of genuine, legitimate suspicion. It even made clear that the long-term solution would make this kind of mass surveillance of electronic communications mandatory for the companies.[2]

However, the lawmakers seem to have failed to take into account the serious risks the new law could pose to the fundamental rights the EU has always been proud of. What the EU and the Member States were missing was the question of the derogation’s compliance with the fundamental principles embedded in different sources of EU law, in addition to the question of proportionality. A brief overview of the relevant legal documents is therefore set out below.

A.    EU Charter of Fundamental Rights and ePrivacy Directive (2002).

Article 7 of the EU Charter of Fundamental Rights (the “Charter”), which corresponds to Article 8 of the European Convention on Human Rights (“ECHR”), guarantees everyone the right to respect for private and family life, home and communications.[3] However, the ePrivacy Directive (“ePD”) is the only legal instrument that specifically deals with the privacy of communications.

The adoption of the ePrivacy Directive in 2002 was widely praised and considered an enormous step towards safeguarding the confidentiality of private electronic conversations and eliminating the possibility of subjecting them to mass surveillance without a legitimate cause.[4] The Directive gave the privacy and confidentiality of communications concrete substance, in contrast to the more general language of the Charter and the ECHR. It was important in many ways: firstly, being able to rely on the protection of their messages enabled certain professions, such as journalists, doctors, lawyers and human rights defenders, to do their jobs safely; secondly, it allowed people to vote freely, to blow the whistle on bribery and corruption, and to chat with friends and family members without fearing that the content of their messages would be exposed.[5]

The ePrivacy Directive also complements the GDPR, in the sense that metadata in electronic communications falls specifically under the protection provided by the Directive. Accordingly, the processing of such data is only permitted where allowed by the Directive itself or by national or EU law, per Article 15.1 ePD.[6] Furthermore, any processing of electronic communications data beyond its transmission is treated by the ePrivacy Directive as a limitation of the confidentiality of communications under Article 5.1.[7] The ECJ has also interpreted such restrictions strictly.[8]

Even so, neither the inclusion of the right to privacy in the EU Charter nor the adoption of the ePrivacy Directive means that there can be no restriction of the said fundamental right. As stated in Article 52.1 of the EU Charter, any restriction of the fundamental rights enshrined therein must be legitimate in a democratic society, and necessary and proportionate to its stated aim.[9] Similarly, Article 15.1 of the ePrivacy Directive allows the confidentiality of communications to be restricted, in accordance with the principles of necessity and proportionality, in order to prevent, detect and investigate criminal offences in compliance with the Charter.[10]

Throughout this paper, however, we will argue that the draft law meets none of these requirements.

B.    Child Sexual Exploitation and Abuse (CSEA) Directive (2011).

In a different vein, in 2011 the EU adopted the Child Sexual Exploitation and Abuse (CSEA) Directive, addressing the online storage, dissemination and amplification of CSAM as a vital issue. It reaffirmed that protecting young people and children is one of the priorities of the EU legislative bodies.[11] However, this first attempt by the EU to step in and regulate CSAM cannot be considered a success.[12]

Only a limited number of EU countries have transposed the Directive into their domestic legal systems with strong measures to take down web pages containing online CSAM under Article 25 of the CSEA.[13] An investigation conducted in Germany revealed that the authorities failed to take appropriate measures to remove CSAM even when they had been warned of its presence.[14] It was reported that photos and videos of child abuse remained available online for a long time after the relevant German bodies had acquired information about their existence.[15] This omission was attributed to a lack of human resources in the German law enforcement authorities.[16]

With the permanent solution, the EU aims to widen the scope of scanning from storage and dissemination in online environments such as web pages to each and every chat between individuals. The question then arises how the EU plans to achieve this with its limited police forces, which cannot even properly handle the comparatively smaller volume of online CSAM already reported. One could also ask why the EU does not focus on the existing problem of CSAM not being removed once it has been notified to the relevant authorities, rather than expanding the scope to ever more channels.

C.    The Temporary ePD Derogation (2020-21).

With the introduction of changes to the European Electronic Communications Code, the scope of electronic communications services was expanded. This meant that from December 2020, certain provisions of the ePrivacy Directive would also apply to electronic messengers such as Facebook and WhatsApp, to web mail, and even to chats in dating apps. These changes to the Code were welcomed, since they stopped electronic communication companies from prying into their users’ private conversations. This followed the logic of the physical world: just as the police cannot search our belongings without a lawful basis or suspicion, these companies were barred from reading messages without a reasonable cause.

Following this, the EU faced a wave of pressure from different groups to adopt a temporary derogation from the ePrivacy Directive to enable private companies to scan the content of electronic communications. The plan was a temporary derogation in 2021, followed by the long-term solution of adopting the new CSA Regulation by 2024. Accordingly, the temporary derogation was put to a vote and passed in July 2021, allowing companies to carry on scanning private messages on a voluntary basis. This in itself may sound shocking to people in the EU, since it was not widely known that our messages were being scanned and filtered even before the actual derogation from the ePrivacy Directive came into force.

In the next Chapter, we turn to a thorough analysis of the final proposal for the Child Sexual Abuse Regulation of May 2022.

IV. Which providers and services fall under the scope of the CSA Regulation?

The new Regulation, if passed, will cover virtually all digital platforms and services. According to Articles 1.1 and 1.2 of the current proposal, the range of services subject to scanning covers:

  • interpersonal communication services, such as online and app-based chat services, emails, dating apps and other chat-based services, telephone calls and messages, and the chat functions of gaming apps;
  • application stores, such as the Apple App Store and the Google Play Store;
  • hosting platforms and services, including information-storage platforms such as cloud infrastructure services, file-sharing services such as WeTransfer and Dropbox, blog and podcasting services, and social media services such as Twitter, LinkedIn, Facebook and YouTube, including the content of posts shared on these platforms; and
  • internet access providers.[17]

All of these services and platforms can be subject to risk mitigation measures and detection orders under the CSA Regulation.

It is understandable that the high-tech companies who will be the main providers within the scope of the CSA Regulation can comply with these costly new rules; however, there is no exception for SMEs, which lack sufficient resources. This implies that the new CSA Regulation may also cause market concentration and, eventually, an inevitable decrease in the contestability of the market.

Even though the Regulation itself, in Article 1.1, recognizes the need to tackle the abuse of children in online services by CSA perpetrators, the inclusion of such a large number of platforms will affect the lives of many innocent users who are there for legitimate reasons.[18] The scope of the CSAR therefore raises proportionality questions.

It is plausible that some of these services may be utilised by CSAM perpetrators, but it is still open to dispute whether most of them really need to be covered. Take the example of phone calls and messages: issuing detection orders with respect to calls literally means wiretapping, and for messages, detection orders amount to interception. It is not clear how these rules comply with the rule-of-law requirement of individual warrants and investigations, as would apply in the physical world.[19] The author therefore argues throughout this paper that while additional rules and practices are needed to fight CSA material in some of these services, including such a wide range of them in this proposal is disproportionate to its stated objective.

It is true that much CSA material is spread in Europe; however, given that the reported numbers do not necessarily reflect the real scale of the problem, such a far-reaching regulation cannot be justified on a proportionality basis. It cannot and should not open the path to subjecting all interpersonal communications to surveillance; we will therefore argue that the proposal fails to meet the criteria of proportionality.

V. What is the content of the CSA Regulation?

The content for which online providers will be liable is classified into three categories in the CSA Regulation:

  • “Known” CSAM. According to Article 2.m, “known” CSAM means “potential child sexual abuse material detected using the indicators contained in the database of indicators referred to in Article 44(1), point (a)”.[20] Under Article 36.1 of the CSA Regulation, the indicators refer to “material previously detected and identified” as CSAM by any national authority or the relevant administrative or judicial body.[21] For this purpose, the EU Centre is to create three databases of indicators in accordance with Recital 62 and Article 44.1. For unknown CSAM, by contrast, artificial intelligence will be deployed to single out CSAM from harmless material.
  • “New” CSAM. According to Article 2.n, this is CSAM for which no indicators yet exist in the database of known material; under Article 44.1.b, it is likewise defined by reference to “material previously detected and identified” as CSAM. National authorities will therefore send the indicators of any new CSAM to the EU Centre, which will maintain the database.
  • ‘Solicitation’ (or ‘grooming’). This is defined as “solicitation of children for sexual purposes” under Article 2.o.[22] Furthermore, as stated in Article 44.1.c, there will be another database of indicators for “solicitation material”, such as “language indicators”.[23] The identification of known material is acceptable to some extent; for unknown CSAM, however, the effects of the CSA Regulation appear more detrimental, as more intrusive measures will be adopted.

A.    Analysis of Known CSAM.

When it comes to the detection of known CSAM, the problems get even bigger. The EU is planning to use PhotoDNA to detect known CSAM, and the Impact Assessment accompanying the Proposal mentions Thorn’s app for detecting abusive material.[24] With regard to the accuracy of these tools, the EU relies solely on the precision rates provided by their developers rather than conducting its own assessment.[25] Moreover, the High-Level Expert Group of the EU Commission itself points out that PhotoDNA needs to be updated to keep up with new developments, which makes it rather vulnerable to manipulation. The EU thus seems set to deploy PhotoDNA on the basis of the precision rate claimed by its developers, without conducting its own due diligence to verify that accuracy. It is noteworthy that, contrary to what the private entities developing these tools claim, LinkedIn reports that only 41% of all material flagged by PhotoDNA turned out to be real CSAM.[26]
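
To make the mechanism concrete, hash-based detection of known material works roughly as sketched below. PhotoDNA itself is proprietary, so this illustration substitutes a generic perceptual hash (the open-source Python imagehash library); the matching threshold and the in-memory “database” are assumptions made purely for the sake of the example, not a description of the actual system.

    # Illustrative sketch only: a generic perceptual hash stands in for the
    # proprietary PhotoDNA algorithm. The "database of indicators" (Article
    # 44.1.a) is modelled as a plain list of hashes of previously identified
    # material -- a deliberate simplification.
    from PIL import Image
    import imagehash

    HAMMING_THRESHOLD = 8  # hypothetical tolerance for near-duplicate images

    def load_indicator_db(paths):
        """Hash previously identified material into an in-memory 'database'."""
        return [imagehash.phash(Image.open(p)) for p in paths]

    def matches_known_material(image_path, indicator_db):
        """Flag an image whose hash is close to any indicator hash."""
        candidate = imagehash.phash(Image.open(image_path))
        return any(candidate - known <= HAMMING_THRESHOLD for known in indicator_db)

The near-match threshold is exactly where the trouble starts: set it too tight and slightly altered copies slip through; set it too loose and innocent images are flagged. This trade-off is consistent with the low real-world precision reported above.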

B.    Analysis of New CSAM.

As for new CSAM, the situation is even more troubling. AI-driven technology is planned to be deployed for the detection of unknown CSAM. However, as the copyright-filter debates of the last decade made clear, the accuracy of this technology is even lower, with more cases of false alarms.[27]

It is natural that detecting new CSAM is harder than detecting material already identified and stored in the database. Even in 2022, law enforcement bodies have immense difficulty distinguishing lawful expression from abusive conduct; expecting such accurate differentiation from machine learning is far-fetched. The expected outcome of deploying these tools is therefore so many false alarms that law enforcement will not have enough time to investigate the actual cases of harmful content circulating on digital platforms. This was exactly the main reason why the previous CSEA Directive failed in the fight against child sexual abuse material: the authorities did not have sufficient human resources. Loading them with more work not only makes no sense but also puts virtually everybody’s private life at stake as a result of high error rates.
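
A back-of-the-envelope calculation shows why even a seemingly accurate classifier drowns investigators in false alarms. Every figure below is an assumption chosen purely to illustrate the base-rate effect, not an estimate of the real numbers:

    # Hypothetical illustration of the base-rate problem: when scanned content
    # is overwhelmingly innocent, even a highly accurate classifier produces
    # far more false alarms than real hits. Every number here is an assumption.
    messages_scanned = 1_000_000_000    # messages scanned per day (assumed)
    prevalence = 1e-6                   # share of messages that are actually CSAM (assumed)
    true_positive_rate = 0.99           # classifier sensitivity (assumed)
    false_positive_rate = 0.01          # 1% of innocent messages wrongly flagged (assumed)

    actual_abuse = messages_scanned * prevalence
    true_positives = actual_abuse * true_positive_rate
    false_positives = (messages_scanned - actual_abuse) * false_positive_rate

    precision = true_positives / (true_positives + false_positives)
    print(f"Real hits flagged:   {true_positives:,.0f}")      # ~990
    print(f"False alarms:        {false_positives:,.0f}")     # ~10,000,000
    print(f"Share of flags that are real: {precision:.4%}")   # below 0.01%

Under these assumptions, roughly 990 genuine cases would be buried under about ten million false alarms per day; no police force could triage such a stream.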

These are well-founded concerns, as data provided by Meta shows that AI-driven technology is prone to false alarms.[28] According to Meta’s 2021 report, over a period of only two months it had to restore 207 Facebook and Instagram accounts that had been falsely flagged for spreading CSAM and whose owners appealed.[29] Moreover, many more accounts whose deletion has been disputed are still under review.[30] In a recent case, a parent was falsely flagged and investigated for seeking medical advice for his child.[31] All of this is further undisputable evidence of the high rate of false positives produced by AI-based tools, and of their gravity.

C.    Analysis of Grooming.

Before analysing the detection of CSAM in the form of “grooming”, it is worth defining the term. According to Articles 3.4 and 5.6 of the CSEA Directive, grooming occurs when an adult contacts a child in order to involve them in sexual activity or child pornography.[32] Further, the act qualifies as a crime only when the targeted child is under the age of consent in the home State. To prevent grooming, the CSAR plans to rely on language analysis and predictive tools that examine behavioural patterns, in the hope of stopping the abuse before it takes place.
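
The proposal does not publish what its “language indicators” would look like, so the sketch below is purely hypothetical: an invented pattern-and-weight scoring scheme, included only to make concrete why this kind of text analysis is error-prone.

    # Purely hypothetical sketch of a "language indicator" check. The patterns
    # and weights are invented for illustration; no real indicator set is public.
    import re

    INDICATORS = {                       # assumed pattern -> weight mapping
        r"\bhow old are you\b": 0.4,
        r"\bdon'?t tell your parents\b": 0.8,
        r"\bsend (me )?a (photo|pic)\b": 0.5,
    }
    FLAG_THRESHOLD = 0.7                 # assumed score above which a chat is flagged

    def grooming_score(message: str) -> float:
        """Sum the weights of every indicator pattern found in the message."""
        text = message.lower()
        return sum(w for pattern, w in INDICATORS.items() if re.search(pattern, text))

    # A grandparent writing "How old are you now? Send me a pic from the trip!"
    # scores 0.4 + 0.5 = 0.9 and would be flagged -- a plain false positive.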

However, the legality of this method is questionable. In other words, the CSAR aims to detect and accuse people of a crime that has not yet happened, or may never have been intended. By the same analogy as before: just as law officers need probable cause for a warrant to search a home and personal belongings, there should be a clear, strong indication before the proposed technique can be lawful. And as with new CSAM, the accuracy of the AI-based technology to be used for detecting grooming also portends a high number of errors.

D.    The different approach to children above the age of consent.

The CSAR has adopted the definition of CSAM from the CSEA Directive of 2011. According to Articles 2.c and 2.e, CSAM constitutes “child pornography” and/or “pornographic performance”.[33] Member States must criminalize activities involving the acquisition, distribution and production of such materials.

Having said that, under Article 8 of the CSEA Directive, Member States have discretion over whether to criminalize the possession and production of such materials involving children who have reached the age of sexual consent, provided that no abuse takes place.[34] The age of sexual consent varies from 14 to 17 across the Member States. In other words, children who are mature enough according to the laws of their Member State are allowed to produce, possess and share sexual photos with their friends or partners without fearing prosecution. However, if those photos are shared with third parties in breach of the producers’ trust, or obtained by hacking, such acts are subject to criminalization. This is the approach adopted by the CSEA Directive, through which legislators aimed to safeguard children from probable abuse while also giving them space to exercise their sexuality.

However, even though the CSA Regulation has followed in the CSEA Directive’s footsteps on many occasions and adopted identical definitions, it seems to have forgotten to incorporate the exemptions to the main rules. Oddly enough, the private exercise of sexuality by children who are above the age of consent has not been carried over into the new CSA Regulation. This means that those private uses of sexual materials will also be detectable under the new Regulation, since they fall under the definition of CSAM in Article 2.1 of the CSAR.[35] Accordingly, any such material, legal or not, possessed or shared through digital services will be subject to the CSA Regulation because of its content.

Consequently, sexual materials used for private purposes will be reviewed by the digital service provider and reported to law enforcement bodies, and their owners will lose access to such platforms if a detection order is issued.

The same mistake has been made with regard to grooming in the CSA Regulation. Article 6 of the earlier CSEA Directive prohibited the solicitation of children under the age of consent.[36] The new Regulation, however, seems to expand this prohibition to children above the age of sexual consent. To illustrate this shortcoming: the definition of “child” is given in Article 2.j of the CSA Regulation, which defines a child as a person under 17.[37] Article 7.7 then states that detection orders for solicitation will be issued if one of the parties involved is a child.[38] This means that even if one party to the communication is between 14 and 17 and the Member State has set the age of sexual consent below 17, the exchange involving that person will still be subject to detection, contrary to the Member State’s own laws.

By deviating from the exemptions provided by the CSEA Directive, the CSAR has “successfully” broadened its scope to cover material that may be perfectly lawful under a Member State’s laws. This omission, however, might be deliberate, as the AI-based technologies planned for detecting unknown CSAM and grooming would have insurmountable difficulty determining whether an exchange takes place between children above the age of consent under the laws of each Member State. The result will be drastic effects on the daily lives of a wide range of young people exercising their sexuality, through interference in their private communications.

VI. Analysis of the main obligations.

A.   Risk assessment and mitigation rules.

According to Article 4.1 of the CSAR, providers ought to take “reasonable measures” in relation to the risks they have identified of their platforms being used for online CSAM.[39] Meanwhile, Article 3.5 requires them to take into account any “remaining risk” as well, on pain of being served with detection and blocking orders.[40] These two obligations are deeply problematic and inconsistent: whatever measures a hosting service takes, there will always be some room for online CSAM to circulate on its platform. Thus, irrespective of those measures, online platforms will be overwhelmed with detection and blocking orders. Moreover, Article 3.2.d requires such platforms to take into consideration “the manner in which the provider designed and operates the service”.[41]

These provisions are especially burdensome for services offering end-to-end encryption (E2EE) to their users. The characteristic feature of E2EE is that only the sender and the recipient of a private message have access to its content. E2EE thus creates a handicap for service providers, as they cannot read these messages, and that inability qualifies as a “high-risk” situation. Accordingly, the only way for service providers to properly mitigate these risks may be to weaken the security of private messages.
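
A minimal sketch shows why the provider is locked out by design. It uses the PyNaCl library’s public-key “Box”; the key handling is simplified and purely illustrative, not a description of any particular messenger:

    # Minimal illustration of the E2EE property described above, using PyNaCl.
    # Key distribution is simplified for brevity; this is not a production design.
    from nacl.public import PrivateKey, Box

    alice_key = PrivateKey.generate()   # Alice's secret key never leaves her device
    bob_key = PrivateKey.generate()     # Bob's secret key never leaves his device

    sending_box = Box(alice_key, bob_key.public_key)
    ciphertext = sending_box.encrypt(b"see you at 8")

    # The provider relays only `ciphertext`; without either private key it has
    # no way to recover the plaintext. Only Bob (or Alice) can decrypt it:
    receiving_box = Box(bob_key, alice_key.public_key)
    assert receiving_box.decrypt(ciphertext) == b"see you at 8"

Because the provider never holds either private key, the only way for it to “mitigate the risk” of what travels through its pipes is to inspect content outside this encrypted channel, which is the crux of the problem discussed below.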

Even though providers are given discretion to take “reasonable measures” in relation to the identified risks, they are held liable if the measures taken prove insufficient to mitigate those risks. Although the Regulation does not say so outright, this pushes providers towards the most intrusive measure in order to escape liability. Article 4.2 offers providers only limited protection against liability for insufficient measures, by setting out certain criteria to be met.[42] However, the Regulation does not clarify how it is to be assessed whether the measures were proportionate at the moment of their implementation.

Though the Regulation has straightforward objectives, it lacks the due process needed to achieve those goals. Some attention is paid to protecting service providers against liability claims, but only through vague, open-to-interpretation and limited criteria, and the Regulation does not set out how the process would work in practice. Facing this uncertainty, a provider is clearly incentivized to take the most disruptive measure as a safety belt.

B.    Age Verification.

The Regulation names one concrete risk-mitigation measure: age verification. According to Article 4.3, hosting and online communication providers should conduct age verification on platforms that pose a risk of CSAM circulation.[43] Again, the Regulation defines the scope broadly, covering almost all services with a chat function.

Age verification can be carried out using biometric data, but also identity documents, and both come with risks. Such practices will make the work of certain groups, such as human rights lawyers, reporters and whistleblowers, harder. Moreover, an identity document requirement effectively deprives those who lack proper documents of access to a wide range of interpersonal communication services. This would also have a counter-productive effect on undocumented children who want to report, via online communication services, sexual abuse they have suffered.

C.    Mass surveillance.

The main concern regarding the risk assessment and mitigation rules, however, is the possibility of generalised scanning under Article 4. As a rule, scanning should be carried out only once a detection order has been issued. Given the above, however, what these rules translate into in reality is quite contested. Several EU officials, including Commissioner Johansson, have reassured the public that no scanning can take place without a detection order, and that there is therefore no risk of undermining E2EE as a whole.[44]

Nevertheless, most of the service providers within the scope of the CSAR are known for providing highly secure channels of communication for their users. In order to perform the risk assessments required under Articles 3 and 6, such platforms are effectively encouraged to scan and monitor all messages.

It is true that the text of the proposed Regulation does not require client-side scanning (CSS) in so many words. However, the Impact Assessment of the CSAR offers methods for assessing and mitigating the risks under Article 4; in particular, with reference to Article 4.1, it points to technical measures concerning the design, operation and functionality of the service. Reading Section 9 of the Impact Assessment, it is hard to see what this could imply other than technical measures amounting to client-side scanning, which would undermine E2EE. Even though the Commission contests this, these rules therefore appear designed to enable generalised content monitoring without detection orders.
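
To make concrete what client-side scanning means in an E2EE setting, consider the hypothetical sketch below. No real CSS system publishes its code, so both helper functions are invented stand-ins: a plain hash lookup in place of a perceptual-hash match, and a toy cipher in place of the E2EE layer.

    # Hypothetical sketch of client-side scanning in an E2EE app: content is
    # checked against indicator hashes ON THE DEVICE, before encryption.
    # Both helpers are invented stand-ins, not any real system's API.
    import hashlib
    from typing import Optional

    def hash_matches_indicators(data: bytes, indicator_db: set) -> bool:
        """Stand-in for a perceptual-hash lookup (cf. the earlier sketch)."""
        return hashlib.sha256(data).hexdigest() in indicator_db

    def encrypt_for_recipient(data: bytes) -> bytes:
        """Stand-in for the E2EE layer (cf. the PyNaCl sketch above)."""
        return bytes(b ^ 0x42 for b in data)  # toy cipher, illustration only

    def send_attachment(data: bytes, indicator_db: set) -> Optional[bytes]:
        if hash_matches_indicators(data, indicator_db):
            print("flagged: would be reported before encryption ever happens")
            return None
        return encrypt_for_recipient(data)  # encryption occurs only after the scan

The encryption code itself is untouched, yet every message is inspected before it is encrypted. This is precisely why critics argue that CSS hollows out the E2EE guarantee even where the cryptography formally remains in place.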

D.    Detection orders.

According to Article 7 of the CSAR, the Coordinating Authority in each Member State is entitled to issue detection orders obliging service providers and hosting organisations to detect online CSAM.[46] Article 7.4 defines the boundaries of such orders: they can only be issued when there is “a significant risk” of abuse and the positive outcome outweighs the potential negative repercussions.[47]

In practice, however, this rule means that Member States will be able to issue detection orders requiring service providers to monitor all their users’ interpersonal communications. In other words, because material can only be detected pursuant to detection orders, Member States will tend to issue as many orders as possible. This is because abusive material cannot be distinguished from lawful content without scrutinizing it. The detection orders will therefore amount to generalised scanning of private communications.

Furthermore, the weighing of the “significant risk” against the negative consequences is ambiguous. It is not clarified whether discarding E2EE counts as a negative consequence, or whether it too can be permitted where a “significant risk” is established. Article 7.8, for its part, states that effective and proportionate safeguards must be in place where necessary.[48] Again, this vague language offers no clue as to what is necessary and who decides it.

It is not clear why the EU lawmakers have switched from the reasonable-suspicion approach applied in the physical world to a significant-risk test for issuing detection orders. This also contradicts the advice of the UN High Commissioner for Human Rights that interference with private conversations should proceed on a case-by-case basis, where there is reasonable suspicion.[49] The switch to a risk-based approach implies that decisions can never truly be made case by case.

Detection orders will have even more detrimental effects on E2EE. Since these systems are designed to give the service provider no means of accessing private messages, complying with a detection order will require building in the technical ability to read them after all. We can then no longer speak of genuine E2EE, as every message can be subjected to generalised monitoring through detection orders.

VII. Effects on E2EE and Client-side Scanning.

The principal concern regarding the CSAR is whether the new rules will end up undermining E2EE by forcing providers to conduct client-side scanning, and whether that outcome is compatible with fundamental rights. Even if E2EE is at stake, the Commission insists that this can be justified, since encryption is, in its view, the main obstacle to fighting online CSAM.

However, the Commission seems to have forgotten the importance of E2EE for civil society. If people rely on online communication services to contact each other, it is because E2EE technologies provide a safe space to exchange information without fear of one’s private life being exposed.[50] Encryption is therefore an essential human rights instrument.[51] Moreover, the recent opinion of the UN High Commissioner for Human Rights drew attention to the importance of encryption for Ukrainians protecting their families during the Russian invasion in 2022.[52] If the CSAR passes, encryption technologies might be dismantled entirely, which would in turn destroy confidence in the privacy of our communications.

Hence, our analysis suggests that online service providers would lower the security of private communications by discarding E2EE in order to comply with detection orders. The Impact Assessment clearly indicates that providers should deploy client-side scanning, and Commissioner Johansson believes that CSS is the appropriate way to implement detection orders.[53] This has also been endorsed by the Commission’s High-Level Technical Expert Group, which offers three possible detection techniques for the E2EE environment, all of which rely on CSS.[54] All three of these “most promising” methods require scanning content before encryption or after decryption, yet the Group sees no potential harm to fundamental rights. By contrast, the UN High Commissioner for Human Rights has identified CSS in the E2EE environment as a real threat to fundamental rights.[55] Consequently, CSS and E2EE cannot possibly coexist.

VIII. Conclusion.

The Commission views the CSAR as an effective weapon in its long-standing fight against the circulation of child sexual abuse material through different channels of online communication. Nevertheless, as with any piece of legislation, it must be reviewed against the principles of EU law, such as necessity and proportionality. Given the increasing number of cases of child sexual abuse through online service providers, it is understandable that the Commission feels the necessity of an effective law to tackle this problem once and for all. However, the new law cannot come at any cost. Even if the Regulation passes the necessity test (which is very unlikely), it ought to be subjected to the proportionality test as well.

The Regulation, however, cannot meet the requirements of the proportionality test. As discussed in Chapter III, the AI-based technologies planned for deployment have a high rate of false alarms, which signals disproportionately many intrusions into people’s private lives.

E2EE is a principal guarantor of many fundamental rights and societal benefits. The Commission therefore has to weigh very carefully the positive outcomes the Regulation promises against the far more far-reaching repercussions its adoption would have for private interpersonal communications. As we argued in Chapter IV, however, the undermining of E2EE is almost inevitable given the risk mitigation rules and detection orders. Instead of achieving its stated aim, the Regulation thus risks causing harm out of all proportion to it.

BIBLIOGRAPHY

The European Parliament and the Council of EU Regulation 2021/1232 – Temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse [ 2021] OJ L 274/41

Ella Jakubowska, ‘A beginner’s guide to EU rules on scanning private communications: Part 1’ (EDRi, 15 December 2021)

<https://edri.org/our-work/a-beginners-guide-to-eu-rules-on-scanning-private-communications-part-1/ >

Charter of Fundamental Rights of the European Union (EU Charter) [2000] OJ C 364/1

Michelle Bachelet, ‘Artificial intelligence risks to privacy demand urgent action’ (OCHR Media Center, 15 September 2021) <https://www.ohchr.org/en/2021/09/artificial-intelligence-risks-privacy-demand-urgent-action-bachelet?LangID=E&NewsID=27469 >

European Parliament and the Council Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive) [2002] OJ L 201/37

Case C-119/12 Josef Probst v mr.nexnet GmbH [2012]

Council of Europe Directorate General of Human Rights and Rule of Law – DG I and Directorate General of Democracy – DG II, Respecting human rights and the rule of law when using automated technology to detect online child sexual exploitation and abuse (Independent Experts’ report, 2021)

European Parliament Research Service, Combating sexual abuse of children Directive 2011/93/EU European Implementation Assessment (Ex-post Impact Assessment Unit, PE 598.614, 2017)

Lutz Ackermann, Robert Bongen, Benjamin Güldenring and Daniel Moßbrucker, NDR, ‘Investigators do not allow images to be deleted’ (tagesschau, 12 February 2021) < https://www.tagesschau.de/investigativ/panorama/kinderpornografie-loeschung-101.html >

Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022]

Ella Jakubowska, “Chat control: 10 principles to defend children in the digital age” (EDRI, February 9, 2022) < https://edri.org/our-work/chat-control-10-principles-to-defend-children-in-the-digital-age/ >

‘Internal documents revealed the worst for private communications in the EU; how will the Commissioners respond?’ (EDRi, April 22, 2022) <https://edri.org/our-work/internal-documents-revealed-the-worst-for-private-communications-in-the-eu-how-will-the-commissioners-respond// >

Alexander Hanff, ‘Why I don’t support privacy invasive measures to tackle child abuse’ (Linkedin, November 11, 2020) <https://www.linkedin.com/pulse/why-i-dont-support-privacy-invasive-measures-tackle-child-hanff>

Felix Reda, ‘When filters fail: These cases show we can’t trust algorithms to clean up the internet’ (Felix Reda, September 28, 2017) < https://felixreda.eu/2017/09/when-filters-fail/ >

META Platforms Ireland Limited, Processing under EU Regulation 2021/1232 Report  (META Transparency Center, 2022)

Kashmir Hill ‘A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.’ (The New York Times, 21 August 2022) <https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html>

Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (CSEA Directive) [2011] OJ L 335/1

Matt Burgess, ‘The EU Wants Big Tech to Scan Your Private Chats for Child Abuse’ (WIRED, 5 May 2022) < https://www.wired.co.uk/article/europe-csam-scanning-law-chat-encryption >

European Commission, Inception Impact Assessment – Ares(2020)7284226 (Regulation of the European Parliament and of the Council on the detection, removal and reporting of child sexual abuse online, and establishing the EU centre to prevent and counter child sexual abuse)

UN Human Rights Office of the High Commissioner, ‘Spyware and surveillance: Threats to privacy and human rights growing, UN report warns’ (OHCHR Media Center, 16 September 2022), <https://www.ohchr.org/en/press-releases/2022/09/spyware-and-surveillance-threats-privacy-and-human-rights-growing-un-report >

Human Rights Watch < https://www.hrw.org/tag/encryption >

Fight for the Future, ‘60+ human rights organizations call on companies to Make DMs Safe’ <https://www.fightforthefuture.org/news/2022-10-13-make-dms-safe-orgs >

Ylva Johansson, ‘Children deserve protection and privacy’ (European Commission blog, 7 August 2022),<https://ec.europa.eu/commission/commissioners/2019-2024/johansson/blog/children-deserve-protection-and-privacy_en >


CITATION

[1] The European Parliament and the Council of EU Regulation 2021/1232 – Temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse [ 2021] OJ L 274/41

[2] Ella Jakubowska, ‘A beginner’s guide to EU rules on scanning private communications: Part 1’ (EDRi, 15 December 2021),

<https://edri.org/our-work/a-beginners-guide-to-eu-rules-on-scanning-private-communications-part-1/ > accessed 16 October 2022

[3] CHARTER OF FUNDAMENTAL RIGHTS OF THE EUROPEAN UNION (EU Charter) [2000] OJ C 364/1, art 7

[4] Michelle Bachelet, ‘Artificial intelligence risks to privacy demand urgent action’ (OCHR Media Center, 15 September 2021)
<https://www.ohchr.org/en/2021/09/artificial-intelligence-risks-privacy-demand-urgent-action-bachelet?LangID=E&NewsID=27469 > accessed 17 October 2021

[5] Ella Jakubowska, ‘A beginner’s guide to EU rules on scanning private communications: Part 1’ (EDRi, 15 December 2021),

<https://edri.org/our-work/a-beginners-guide-to-eu-rules-on-scanning-private-communications-part-1/ > accessed 16 October 2022

[6] European Parliament and the Council Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive) [2002] OJ L 201/37, art 15.1

[7] ePrivacy Directive [2002] OJ L 201/37, art 5.1

[8]  Case C-119/12 Josef Probst v mr.nexnet GmbH [2012], para. 23

[9] EU Charter [2000] OJ C 364/1, art 52.1

[10] ePrivacy Directive [2002] OJ L 201/37, art 15.1

[11] Council of Europe Directorate General of Human Rights and Rule of Law – DG I and Directorate General of Democracy – DG II, Respecting human rights and the rule of law when using automated technology to detect online child sexual exploitation and abuse (Independent Experts’ report, 2021) paras. 3.3

[12] Council of Europe Directorate General of Human Rights and Rule of Law – DG I and Directorate General of Democracy – DG II, Respecting human rights and the rule of law when using automated technology to detect online child sexual exploitation and abuse (Independent Experts’ report, 2021) para. 3.3.3

[13] European Parliament Research Service, Combating sexual abuse of children Directive 2011/93/EU European Implementation Assessment (Ex-post Impact Assessment Unit, PE 598.614, 2017) para III.3

[14] Lutz Ackermann, Robert Bongen, Benjamin Güldenring and Daniel Moßbrucker, NDR, ‘Investigators do not allow images to be deleted’ (tagesschau, 12 February 2021) < https://www.tagesschau.de/investigativ/panorama/kinderpornografie-loeschung-101.html > accessed 20 October 2022

[15] Lutz Ackermann, Robert Bongen, Benjamin Güldenring and Daniel Moßbrucker, NDR, ‘Investigators do not allow images to be deleted’ (tagesschau, 12 February 2021) < https://www.tagesschau.de/investigativ/panorama/kinderpornografie-loeschung-101.html > accessed 20 October 2022

[16] Lutz Ackermann, Robert Bongen, Benjamin Güldenring and Daniel Moßbrucker, NDR, ‘Investigators do not allow images to be deleted’ (tagesschau, 12 February 2021) < https://www.tagesschau.de/investigativ/panorama/kinderpornografie-loeschung-101.html > accessed 20 October 2022

[17] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 1.1, 1.2

[18] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 1.1

[19] Ella Jakubowska, “Chat control: 10 principles to defend children in the digital age” (EDRI, February 9, 2022)

< https://edri.org/our-work/chat-control-10-principles-to-defend-children-in-the-digital-age/ > accessed 19 November 2022

[20] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 2.m

[21] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 36.1

[22] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 2.o

[23] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 44.1.c

[24] European Commission, Inception Impact Assessment – Ares(2020)7284226 (Regulation of the European Parliament and of the Council on the detection, removal and reporting of child sexual abuse online, and establishing the EU centre to prevent and counter child sexual abuse).

[25] ‘Internal documents revealed the worst for private communications in the EU; how will the Commissioners respond?’ (EDRi, April 22, 2022)

<https://edri.org/our-work/internal-documents-revealed-the-worst-for-private-communications-in-the-eu-how-will-the-commissioners-respond// > accessed 19 November 2022

[26] Alexander Hanff, ‘Why I don’t support privacy-invasive measures to tackle child abuse’ (Linkedin, November 11, 2020)

<https://www.linkedin.com/pulse/why-i-dont-support-privacy-invasive-measures-tackle-child-hanff> accessed 19 November 2022

[27] Felix Reda, ‘When filters fail: These cases show we can’t trust algorithms to clean up the internet’ (Felix Reda, September 28, 2017)< https://felixreda.eu/2017/09/when-filters-fail/ > accessed 19 November 2022

[28] META Platforms Ireland Limited, Processing under EU Regulation 2021/1232 Report  (META Transparency Center, 2022)

[29] META Platforms Ireland Limited, Processing under EU Regulation 2021/1232 Report  (META Transparency Center, 2022), para 6

[30] META Platforms Ireland Limited, Processing under EU Regulation 2021/1232 Report  (META Transparency Center, 2022), para 5

[31] Kashmir Hill ‘A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.’ (The New York Times, 21 August 2022)

< https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html > accessed 19 November 2022

[32] Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (CSEA Directive) [2011] OJ L 335/1, art 3.4, 5.6

[33] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 2.c, 2.e

[34] Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (CSEA Directive) [2011] OJ L 335/1, art 8

[35] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 2.1

[36] Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (CSEA Directive) [2011] OJ L 335/1, art 6

[37] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 2.j

[38] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 7.7

[39] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 4.1

[40] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 3.5

[41] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 3.2.d

[42] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 4.2

[43] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 4.3

[44] Matt Burgess, ‘The EU Wants Big Tech to Scan Your Private Chats for Child Abuse’ (WIRED, 5 May 2022)

< https://www.wired.co.uk/article/europe-csam-scanning-law-chat-encryption > accessed 12 November 2022

[45] European Commission, Inception Impact Assessment – Ares(2020)7284226 (Regulation of the European Parliament and of the Council on the detection, removal and reporting of child sexual abuse online, and establishing the EU centre to prevent and counter child sexual abuse) art 4.1

[46] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 7

[47] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 7.4

[48] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse COM/2022/209 (CSAR) [2022], art 7.8

[49] UN Human Rights Office of the High Commissioner, ‘Spyware and surveillance: Threats to privacy and human rights growing, UN report warns’ (OHCHR Media Center, 16 September 2022),

<https://www.ohchr.org/en/press-releases/2022/09/spyware-and-surveillance-threats-privacy-and-human-rights-growing-un-report > accessed 17 November 2022

[50] Human Rights Watch < https://www.hrw.org/tag/encryption > accessed 20 November 2022

[51] Fight for the Future, ‘60+ human rights organizations call on companies to Make DMs Safe’ < https://www.fightforthefuture.org/news/2022-10-13-make-dms-safe-orgs > accessed 20 November 2022

[52] UN Human Rights Office of the High Commissioner, ‘Spyware and surveillance: Threats to privacy and human rights growing, UN report warns’ (OHCHR Media Center, 16 September 2022),

<https://www.ohchr.org/en/press-releases/2022/09/spyware-and-surveillance-threats-privacy-and-human-rights-growing-un-report > accessed 17 November 2022

[53] Ylva Johansson, ‘Children deserve protection and privacy’ (European Commission blog, 7 August 2022),

<https://ec.europa.eu/commission/commissioners/2019-2024/johansson/blog/children-deserve-protection-and-privacy_en > accessed 20 November 2022

[54] European Commission, ‘Fighting child sexual abuse: detection, removal and reporting of illegal content online’ <https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online_en >

[55] UN Human Rights Office of the High Commissioner, ‘Spyware and surveillance: Threats to privacy and human rights growing, UN report warns’ (OHCHR Media Center, 16 September 2022),

<https://www.ohchr.org/en/press-releases/2022/09/spyware-and-surveillance-threats-privacy-and-human-rights-growing-un-report > accessed 17 November 2022

 

The articles on the LAWELS platform are not, nor are they intended to be, legal advice. You should consult a lawyer for individual advice or assessment regarding your own situation. The article only reflects the views of the author.