Council of Europe’s New Guidelines on Facial Recognition
In a report dated January 28, 2021, the Council of Europe (CoE) adopted a new set of guidelines on facial recognition technologies. These guidelines are intended to serve as a reference not only for governments and legislators around the world, but also for developers and manufacturers of facial recognition technologies and for the entities that use them.
The publication of this report is part of a project to modernize Convention 108 (the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data). Opened for signature in 1981, this Convention is the result of the CoE's work and is the very first binding legal instrument in the field of data protection.
The Convention has been open for signature again since October 2018 (hence the decision to rename it Convention 108+) and is currently undergoing numerous amendments, with the express purpose of adapting it to the new realities of an increasingly connected world, of which facial recognition is, undeniably and sometimes to the dismay of some, increasingly a part.
We have worked with a number of clients planning to use, or developing, products that include facial recognition elements, so we understand the arguments on both sides, and we know that the use of such technologies must strike a fine balance between the advancement of our technological society and, of course, our privacy rights as private citizens. In the following article, we therefore offer a quick overview of these major guidelines as they affect private sector actors. But before anything else, it is important to pause on a potential stumbling block for some of our most informed readers: the overlap between Convention 108+ and the GDPR framework…
How does it relate to the GDPR?
Some of the guidelines set out in the CoE's report may seem familiar to those who are well acquainted with the GDPR regulatory framework, a familiarity that is not so surprising given that Convention 108 was once the reference point for Directive 95/46/EC (the forerunner of the GDPR). However, it should be recalled that the GDPR is intended to regulate data processing in general, not facial recognition specifically. The international and national legal orders are characterized by a glaring absence of legislation specific to facial recognition.
Thus, it is not at all certain that practices permitted under the GDPR will also be permitted for facial recognition in the years to come.
By way of example, the GDPR authorizes the processing of special categories of personal data where those data have been manifestly made public by the data subject; conversely, Convention 108+ firmly opposes the processing of publicly available digital images, such as those published on social media, by facial recognition technologies.
Even if you are very familiar with the GDPR framework, we encourage you to read the guidelines provided by the CoE, or the summary we have set out below: this is an opportunity to spot some of the issues specific to facial recognition, and to get a sense of how the law in this sector may evolve (since Convention 108+ seems to foreshadow future European regulation in this area).
General guidelines for all private sector actors
The CoE’s report first provides general guidance for all private sector actors, regardless of their role in the facial recognition value chain.
The first of these guidelines relates to consent: Article 5 of Convention 108+ requires that private entities "other than those authorized to perform tasks similar to those of public authorities" wishing to make use of facial recognition technologies obtain the "explicit, specific, free and informed consent of the persons concerned" by the data processing.
Readers who are familiar with the GDPR framework will not be surprised: it is the quality of the consent collected that prevails here. That is to say, consent will only be considered to have been freely given if the data subject is offered alternative solutions.
The GDPR framework uses the notion of "genuine freedom of choice".
Here, Convention 108+ is more specific, explaining that the alternative must not be too "time-consuming or complicated compared to facial recognition technology" (an effective alternative could be, for example, allowing the data subject to use a password instead).
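By way of illustration only, here is a minimal sketch, in Python, of what a consent-gated authentication flow with a non-biometric fallback might look like. All names, fields and functions are our own assumptions for the example; they are not taken from the Convention, the report or any particular product.

```python
# Illustrative sketch (hypothetical names): facial recognition is only invoked
# when explicit consent is on record, and a password path is always available
# so that refusing biometrics does not block the user.

import hashlib
from dataclasses import dataclass
from typing import Optional


@dataclass
class User:
    user_id: str
    face_recognition_consent: bool  # explicit, specific, free and informed consent recorded
    password_hash: str


def hash_password(raw: str) -> str:
    # Placeholder for a real password-hashing scheme (e.g. bcrypt or argon2).
    return hashlib.sha256(raw.encode()).hexdigest()


def verify_face(user: User) -> bool:
    # Placeholder for a vendor-specific biometric check; only reached with consent.
    raise NotImplementedError("biometric verification not implemented in this sketch")


def authenticate(user: User, password_attempt: Optional[str] = None, use_face: bool = False) -> bool:
    """Facial recognition is strictly opt-in; the password alternative always works."""
    if use_face and user.face_recognition_consent:
        return verify_face(user)
    if password_attempt is not None:
        return hash_password(password_attempt) == user.password_hash
    return False
```

The point of the sketch is simply that the non-biometric route is a first-class path through the code, not a buried or deliberately cumbersome option.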
The second major guideline is a necessary consequence of the first: Convention 108+ does not recognize the ability of private entities to deploy facial recognition technologies in "uncontrolled environments". These uncontrolled environments are public or quasi-public transit areas, such as shopping malls, hospitals or schools, where it is by definition difficult to obtain the explicit consent of the individuals concerned.
Most importantly, the report specifies that the mere fact of passing through one of these environments is not sufficient to characterize consent.
What about "developers" and manufacturers of facial recognition technologies?
In addition to these general guidelines for all private actors, the report contains additional guidelines specific to facial recognition developers and manufacturers.
Article 5 of Convention 108+ states that manufacturers are first required to ensure that their algorithms, and the data on which they are based, are of the highest possible quality. The report cites, for example, the need for the algorithms to generate accurate results regardless of the "different camera angles" of the video surveillance systems used to support these technologies. To guarantee the quality of the data used, the CoE recommends that machine learning be based on databases made up of "sufficiently diverse photos of men and women, of different skin colors and morphologies, of all ages and from different camera angles". The reliability of the tools and data used appears crucial in view of the potentially significant legal effects that these facial recognition technologies can have.
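To make this a little more concrete, here is a minimal sketch of the kind of diversity audit a developer might run over a training set before using it. The metadata schema, attribute names and the 5% threshold are purely illustrative assumptions on our part, not requirements set out in the report.

```python
# Illustrative sketch (hypothetical metadata): count how training photos are
# distributed across the attributes the CoE report mentions (gender, skin
# color, age, camera angle) and flag under-represented groups before training.

from collections import Counter

# Each record describes one training photo; field names are assumptions.
photos = [
    {"gender": "female", "skin_tone": "dark", "age_band": "18-30", "camera_angle": "frontal"},
    {"gender": "male", "skin_tone": "light", "age_band": "60+", "camera_angle": "profile"},
    # ... a real dataset would contain many more records
]


def diversity_report(records, attributes=("gender", "skin_tone", "age_band", "camera_angle"),
                     min_share=0.05):
    """Print the share of each attribute value and flag groups below min_share."""
    total = len(records)
    for attr in attributes:
        counts = Counter(r[attr] for r in records)
        print(f"\n{attr}:")
        for value, n in counts.most_common():
            share = n / total
            flag = "  <-- under-represented" if share < min_share else ""
            print(f"  {value}: {n} ({share:.1%}){flag}")


diversity_report(photos)
```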
In line with the previous point, the report also notes that, because the shape of a face inevitably changes over time, facial recognition is particularly vulnerable to a progressive deterioration in its reliability. Convention 108+ therefore requires that manufacturers and developers make available to those interested in, or already using, these technologies dashboards that record how these reliability percentages evolve over time.
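In practical terms, such a dashboard presupposes that accuracy is measured and logged at regular intervals. The sketch below shows one possible way of keeping that record and flagging drift; the class, its fields and the alert threshold are our own assumptions, not something prescribed by the Convention or the report.

```python
# Illustrative sketch (hypothetical design): store periodic accuracy
# measurements so a dashboard can plot reliability over time, and print an
# alert when accuracy drops below a chosen threshold.

from datetime import date


class ReliabilityLog:
    def __init__(self, alert_threshold: float = 0.95):
        self.alert_threshold = alert_threshold
        self.records: list[tuple[date, float]] = []

    def record(self, day: date, accuracy: float) -> None:
        """Store one evaluation result (e.g. accuracy on a fixed benchmark set)."""
        self.records.append((day, accuracy))
        if accuracy < self.alert_threshold:
            print(f"ALERT {day}: accuracy {accuracy:.1%} below threshold "
                  f"{self.alert_threshold:.1%}; the model may need retraining.")

    def history(self) -> list[tuple[date, float]]:
        """Return the time series a dashboard would plot."""
        return list(self.records)


log = ReliabilityLog()
log.record(date(2021, 1, 1), 0.97)
log.record(date(2021, 6, 1), 0.93)  # triggers an alert as reliability degrades
```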
Developers and manufacturers are also required to take reasonable steps to educate their target customers, for instance by issuing clear and easy-to-understand recommendations and guidance for audiences who will be subject to facial recognition technology.
Finally, the report recommends the implementation of privacy-by-design protocols, both in the design and architecture of facial recognition products, and in the organizational and internal practices of manufacturers and developers (with particular attention paid to the principles of purpose limitation, data minimization and storage time limitation). Beyond the specific subject of facial recognition, privacy-by-design is also part of the good practices that have become essential in innovative sectors of the economy.
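As a rough illustration of what privacy-by-design can mean at the level of product architecture, the sketch below bakes purpose limitation, data minimization and a storage time limit into a storage component. The retention period, purpose label and class design are assumptions chosen for the example; in practice they would be set by the organization's own impact assessment.

```python
# Illustrative sketch (hypothetical design): a storage layer with
# privacy-by-design defaults - a single declared purpose, storage of a derived
# template rather than the raw image, and automatic deletion after retention.

from datetime import datetime, timedelta

RETENTION = timedelta(days=30)          # storage time limitation (assumed value)
ALLOWED_PURPOSE = "access_control"      # purpose limitation: one declared purpose


class TemplateStore:
    def __init__(self):
        self._templates: dict[str, tuple[bytes, datetime]] = {}

    def enrol(self, user_id: str, face_template: bytes, purpose: str) -> None:
        """Store only the derived template (data minimization), never the raw photo."""
        if purpose != ALLOWED_PURPOSE:
            raise ValueError(f"purpose '{purpose}' is not covered by the declared purpose")
        self._templates[user_id] = (face_template, datetime.utcnow())

    def purge_expired(self) -> int:
        """Delete templates older than the retention period; return how many were removed."""
        now = datetime.utcnow()
        expired = [uid for uid, (_, stored) in self._templates.items()
                   if now - stored > RETENTION]
        for uid in expired:
            del self._templates[uid]
        return len(expired)
```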
What guidelines apply to entities using facial recognition technologies?
The CoE's report also contains guidelines for entities using facial recognition technologies - that is, in GDPR language, data controllers and processors.
In addition to being able to demonstrate that the use of these technologies is absolutely necessary and proportionate in the specific context of their use (a principle laid down in Article 11 of Convention 108+ and corresponding broadly to Article 9 of the GDPR), these entities are bound by a number of principles, including: transparency and fairness (for example, by clearly informing the individuals concerned of the place given to facial recognition in the product or service offered, including whether facial recognition is merely a feature or forms an integral part of the product or service itself); purpose limitation; data minimization; storage limitation; data accuracy; and, finally, security.
Here again, the Convention emphasizes the importance of privacy-by-design, as well as the particular obligation of entities using facial recognition to carry out data protection impact assessments.
While these various principles may seem redundant with the GDPR, they cannot be repeated often enough, as they are now an obligatory step for all entities processing data. In addition to these obligations, Convention 108+ also provides for an additional obligation that does not appear per se in the GDPR: "providing an ethical framework" for the use of facial recognition technologies. Given the particularly high risk inherent in these technologies, the report recommends that compliance with this ethical framework take the form of independent advisory ethics boards, composed of experts from a range of fields.
Facial recognition technologies raise a multitude of issues and questions that are unique to them.
For this reason, they are the subject of legislation and regulations currently being drafted at both the national and international levels. Because this sector is very fluid and can sometimes be difficult to navigate, do not hesitate to contact us to discuss further if you have questions about your business' use of facial recognition technologies!
Article by Evane Alexandre and Leila Saidi @ Gerrish Legal, March 2021