A fine for facial recognition
Recently, the Swedish supervisory authority responsible for compliance with the General Data Protection Regulation imposed a fine of approximately EUR 20,000 for the use of facial recognition technology to monitor students’ attendance. Importantly, the processing of personal data in the form of images of students was not carried out on a permanent basis, but was a short-term test to assess the usefulness of such a solution in school operations.
However, according to the Swedish authority, this activity breached the GDPR. Notably, the system processed the images in a way that involved the processing of biometric data, i.e. sensitive data under the GDPR framework. Because of the specific nature of the data and the way they were processed, the data controller should have carried out a data protection impact assessment before the processing began. The failure to carry out this assessment constituted a breach of the GDPR serious enough to justify imposing a fine on the data controller.
According to the GDPR, a data protection impact assessment should be carried out where a type of processing, in particular involving new technologies, is likely, by its nature, scope, context and purposes, to result in a high risk to the rights and freedoms of natural persons. The GDPR does not contain an exhaustive catalogue of situations in which the assessment is mandatory, but merely indicates categories of cases where it is required. These include situations where:
- There is a systematic, comprehensive assessment of personal factors relating to natural persons
- Automated processing, including profiling, takes place and forms the basis for decisions that produce legal effects concerning a natural person or similarly significantly affect a natural person
- There is large-scale processing of special categories of personal data, e.g. genetic data or biometric data for the purpose of uniquely identifying a natural person, or health data
- There is systematic, large-scale monitoring of publicly accessible areas.
Supervisory authorities of individual states prepare lists detailing situations in which an assessment of the processing effects is required. We wrote about the list prepared by the Polish authority here.
The above catalogue set out in the GDPR clearly suggests that any entity processing, for example, health data using algorithms should analyse whether a data protection impact assessment is needed before commencing its activities.
Moreover, if the assessment shows that the processing would result in a high risk to data protection in the absence of measures taken by the controller to mitigate that risk, the data controller is obliged to consult the supervisory authority before processing begins; the authority will then issue recommendations to the controller.
It follows that where a business model involves processing personal data in a manner, on a scale or of a type that makes a data protection impact assessment necessary, the controller must comply with certain GDPR obligations before starting its activities. Otherwise, it may expose itself to an inspection by the supervisory authority, which may result in an administrative decision, including the imposition of fines for non-compliance with the GDPR.
Katarzyna Szczudlik