The European Data Protection Board’s Draft Guidelines for Search Engines and the Future of the ‘Right to be Forgotten’ Online, Part 2 – David Erdos

This is the second part of a post dealing with the European Data Protection Board (EDPB)’s draft guidelines on the right to be forgotten.  Part (1) dealt with the scope of the guidance and of ex post rights vis-à-vis search engines.  This post will deal with (2) the substantive grounds for exercising these ex post rights, and (3) the substantive exemptions from these ex post rights.

2.  The Substantive Grounds for Exercising Ex Post Data Protection Rights vis-à-vis Search Engines:

The current draft of the Guidelines provides a very comprehensive analysis of the grounds on which an individual, concerned about search engine processing, may rely when exercising the right to erasure (GDPR, art. 17).  However, coverage of other potentially relevant data subject rights is very limited and there are also some problems with the detail of the analysis of the right to erasure itself.

Turning first to consider the grounds for exercising the right to erasure, the notion of personal data being “unlawfully processed” (GDPR, art. 17(1)(d)) should be construed so as to ensure effective and complete protection for the data subject and, in particular, so that as far as possible a clear remedy is provided for infringement of any relevant duty which the GDPR imposes (which, in relation to search engines, are at least all those set out in GDPR, chapter II).  As regards the four corners of the GDPR, this concept should not therefore be limited to Article 6 of the instrument, which merely establishes one mandatory element of lawfulness.  Indeed, it is notable that in Google Spain itself there was as much emphasis on the core data protection principles now set out in Article 5.  The data protection principles were also central to the way in which the much more recent case of GC et al. v CNIL assessed the processing of past criminal proceedings.

This reality is not currently well captured in the discussion found in the Guidelines, which could be read as limiting Article 5 merely to the “interpretation” of Article 6 (p. 6).  It would be particularly strange if a data subject were able to rely on a legal provision which was not even set out in the GDPR (as the Guidelines currently clearly state, also at p. 6) but not on a breach of duty found not within Article 6 but, say, in other parts of chapter II.  Of course, all these duties and indeed the exercise of data subject rights must be construed consistently with freedom of expression (including its sub-right of freedom of information) and, as detailed earlier in this discussion, in line with the derogations established under Articles 9(1)(g), 10 and 23(1)(i).  Indeed, it is notable that the Court in GC et al. v CNIL held that the sensitive data derogations set out in Articles 9(1)(g) and 10 should be construed to be self-executing in the absence of implementing law enacted at either national or Union level.  The same should be true of Article 23(1)(i), the overarching “rights and freedoms of others” derogation.  It should, however, be clear that even if the right to erasure itself is restricted in some way under Article 23(1)(i), it will remain necessary for the data subject to retain an ability to ensure that, where an infringement of the data protection principles of Article 5 or of the legal grounds for processing in Article 6 is demonstrated, processing is brought into compliance with these standards.  A purported derogation which makes this impossible would therefore not be compatible with the standards Article 23(1)(i) itself lays down.

Secondly, contrary to the suggestion in the draft Guidelines (p. 9), it seems very doubtful whether the ground relating to the direct provision of information society services to a child (GDPR, art. 17(1)(f)) is in fact engaged since any child claimant will not have had a direct interface with the search engine but only with another online service such as a social networking site.  Nevertheless, it is important to highlight that a child’s interests should still be granted particular weight in any deindexing claim (see, in particular, GDPR, recital 38 and article 6(1)(f)).

Thirdly, many of the other applicable legal grounds for erasure should in essence be conceived as elucidations of the overarching legality principle set down in Article 17(1)(d).  Thus, if processing by a search engine is “no longer necessary” (GDPR, art. 17(1)(a)) it will also violate Article 6(1)(e) and so its ongoing processing will not be authorised.  Similarly, if data have to be erased in compliance with a legal obligation (GDPR, art. 17(1)(e)) then ongoing processing will also clearly be unlawful.  These perhaps somewhat esoteric points aside, the analysis of these other provisions in the draft Guidelines is, however, helpful.

Fourthly, the draft Guidance is right to recognise that the right to erasure may be triggered by a bona fide exercise of the right to object (GDPR, art. 17(1)(c)).  In principle, however, data subject rights including the right to object can be restricted in order to protect the rights and freedoms of others under Article 23(1)(i).  Nevertheless, especially in the context of any potential self-enforcement of these restrictions as opposed to the enacting of specific legislation at Union or Member State level, it seems unlikely that a court would see any direct limitation of this right here as compliant with the required standards of respect for the essence of the right, necessity and proportionality in a democratic society.

Finally, the draft Guidelines are also correct to note that, separately from the right to erasure, the right to object can be used as an independent ground for deindexing.  Indeed, Article 21(1) of the GDPR specifically states that the successful exercise of the right to object must result in the controller “no longer process[ing] the personal data”.  Given that deindexing generally focuses on a particular processing operation (e.g. a name-based search or an image-based search), the exercise of a “right to object” to specific “processing” appears particularly apposite.  In addition, and as previously mentioned, although not explored in Court of Justice case law to date, it is also difficult to argue that the right to rectification of inaccurate or incomplete data (GDPR, art. 16) and the right to request a restriction of processing (GDPR, art. 18) should not have any application here.  They therefore deserve a mention in the final Guidance.  Nevertheless, the balance with freedom of expression (and information) is particularly delicate here and it may be that it will be necessary to limit these rights to some extent under Article 23(1)(i).  This reconciliation between ex post rights and freedom of expression is the particular focus of the final section of the Guidelines and will also be the last issue addressed in this blog.

3.  The Substantive Exceptions for Search Engines from Ex Post Data Protection Rights:

The Guidelines are absolutely right to focus explicitly on the need to reconcile these ex post rights with freedom of expression, including its sub-right, the freedom of information.  However, the draft currently places too much attention on the specific exceptions to the right to erasure set out in Article 17(3).  Rather than focusing on the specific rights of data subjects to secure compliance with the law, the logical starting point would be to delimit the substantive duties of controllers and to ensure that these are consistent with fundamental freedoms.  Secondly, as the commentary found in the draft Guidelines itself demonstrates, most of the specific restrictions listed in Article 17(3) are not particularly relevant.  Indeed, it is really only the exemption from the right to erasure where processing is “necessary for exercising the right to freedom of expression and information” (GDPR, art. 17(3)(a)) that might have any real bite here.  Thirdly, and again as the draft Guidelines themselves acknowledge at least in relation to the right to object, other subjective rights may be independently invoked in order to secure ex post action on the part of a search engine.  These other rights must also be reconciled with freedom of expression.

Turning to consider the fundamental duties placed on controllers, these obligations, including in particular the data protection principles (GDPR, art. 5) and the legal grounds for processing (GDPR, art. 6), must always be construed consistently with freedom of expression as far as this is possible.  In certain cases, the processing of specific data by search engines might be considered to be prima facie inconsistent with these provisions and yet stopping all processing would also disproportionately impact freedom of expression.  A clear example is contained in GC et al. v CNIL, where it was recognised that the ongoing dissemination of out-of-date information concerning an individual’s legal proceedings might prima facie infringe data protection principles, related especially to the relevance standard (at [74]-[75]).  However, the initial reaction to this should be to consider whether some lesser form of action might result in processing which (having regard to the importance of freedom of expression) can itself be seen as reflective of the underlying legal duty.  Indeed, it is in this context that the Court in GC et al. v CNIL mandated that a search engine must, after requisite notice, ensure that its overall profiling of the data subject is reflective of their current legal position including, in particular, through placing a link containing up-to-date information first in any returned list (at [78]).  Indeed, even more so than in the Directive, there is a strong presumption that both the data protection principles and the legal grounds for processing should not be entirely disapplied (outside of the journalistic and other special expressive purposes (GDPR, art. 85(2)) which are not applicable here).  Thus, Article 23(1) very carefully goes through the permitted derogations from chapters II and III of the GDPR – the core of data protection’s substance – and deliberately elects not to allow for any derogation from Article 6 or any exemption as such from Article 5 either.

Nevertheless, it must be recognised that certain data protection duties (and potentially also some of the data subject rights, such as the rights to rectification and to restriction) may set out peremptory obligations which are simply inconsistent with freedom of expression.  Again, the GC et al. v CNIL case explored examples in the default restrictions on the processing of special category and criminal-related data which are set out in Articles 9(1) and 10 of the GDPR.  In these contexts, and notwithstanding that the data subject can point to a prima facie violation, a derogation must be provided for.  Relevant derogatory standards are laid down in Article 9(1)(g) as regards the lifting of the prohibition on processing special category data, Article 10 as regards the lifting of restrictions on processing criminal-related data and otherwise in Article 23(1)(i), the “rights and freedoms of others” clause.  In GC et al. v CNIL the Court indicated that the derogations in Article 9(1)(g) and Article 10 should be treated as self-executing in the absence of implementing Union or Member State legislation.  However, since it was also noted that the exercise of any derogation must be “in compliance with the conditions laid down in those provisions”, it is clear that legislatures remain competent to (and indeed should) enact such rights-compliant specifying law.  Finally, in the absence of this, the direct use of the derogation should be limited to that which is “strictly necessary” (at [68]).  This construction makes sense of an admittedly complex and imperfect legal situation.  It should therefore also be extended and applied to derogations more generally using the general restriction standards laid down in Article 23(1)(i).

As previously emphasised, specific data subject rights should be interpreted consistently both with freedom of expression and with these core data protection duties.  It follows that, in relation to the right to object, the controller should be able to justify ongoing processing where there is a “compelling” (GDPR, art. 21(1)) case under freedom of expression for continuing to ensure access to the information through the relevant processing at issue (e.g. a name-based or image-based search).  Meanwhile, as regards the right to erasure and where a prima facie breach of duty has been established, the exemption for processing necessary for the exercise of the right to freedom of expression and information (GDPR, art. 17(3)(a)) should be interpreted in line with the derogatory scheme above.  In other words, where the prima facie breach relates to the data protection principles (GDPR, art. 5) or the legal grounds for processing (GDPR, art. 6), then all reasonable steps should be taken to ensure that, after construing these provisions with regard to freedom of expression, the processing as a whole complies with them.  In other cases, the derogatory tests set out elsewhere in the instrument should be applied (directly if necessary), namely, Article 9(1)(g) (where the prima facie breach relates to processing special category data without a clear and sui generis legal ground), Article 10 (where the prima facie breach relates to processing criminal-related data without a clear and sui generis legal ground) and Article 23(1)(i) (in all other cases).  The details of the mandatory safeguards built into these derogations differ somewhat, and Article 10, concerning the criminal-related data restriction, is particularly distinctive.
Nevertheless, alongside the contextual interpretation of Articles 5 and 6, they can establish a structured framework for ensuring that the essence of substantive data protection is maintained here, whilst also ensuring necessary and proportionate limits to this in order to safeguard the exercise of freedom of expression and information in new online services such as search engines.

David Erdos, Faculty of Law and Trinity Hall, University of Cambridge

from Inforrm’s Blog https://inforrm.org/2020/02/13/the-european-data-protection-boards-draft-guidelines-for-search-engines-and-the-future-of-the-right-to-be-forgotten-online-part-2-david-erdos/