new tech law blog

Monitoring fraud under the Artificial Intelligence Act

EU regulations banning certain AI practices go into effect on 2 February 2025. Some institutions may assume that the bans apply only to extreme practices they would never engage in. But the ban on using AI systems to assess the risk that someone has committed, or will commit, a crime shows that this is not the correct approach. A more in-depth analysis reveals that some market practices now considered standard, especially in financial services, may prove questionable once the bans enter into force. This is particularly true for monitoring of money-laundering risk and, more broadly, the risk of fraud.
The Digital Markets Act: A revolution, and not only for gatekeepers
The Digital Markets Act or DMA (Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector), which entered into force on 1 November 2022, creates many new obligations for businesses operating in the digital sector, particularly so-called “gatekeepers.”
The DMA will impact the functioning of the entire digital ecosystem—not only gatekeepers, but also other participants in digital markets, including business users and end users of core platform services, competing providers of core platform services, and providers of other digital services.
This is because the obligations and prohibitions imposed on gatekeepers will either directly or indirectly vest other groups with rights they can pursue before national courts.
Why did I sign an appeal to halt AI development?
Regardless of whether we see benefits or an existential threat in the latest AI technologies, the gravity of the challenges these technologies bring is undeniable. Over the past few decades, technological advances have far outpaced reflection on their possible consequences. This need not and should not be the case. That technologies are not solely a source of good is becoming apparent today, as we begin to perceive the destructive impact certain digital technologies have on our democracies, security, and mental health. In the face of recent technological advances such as artificial intelligence, we have an opportunity to avoid past mistakes and at least try to redirect the development of these technologies toward authentic benefits, while at the same time mitigating risks. In this context, I decided to sign an appeal to temporarily halt work on AI systems, and I encourage others to do so. Below I present the main rationale that guided me.
“Dark patterns” targeted by EU institutions
“Dark patterns” used by online platform providers have been controversial for some time, but recently there has been a growing buzz about them, in particular due to actions undertaken by EU and national data protection and consumer protection authorities. (For an overview of cases and decisions by EU and national authorities, see the European Commission’s “Behavioural study on unfair commercial practices in the digital environment: Dark patterns and manipulative personalisation, Final Report,” pp. 61–70.) Primarily, these measures are intended to combat deceptive practices in the digital environment, but also to educate consumers and draw their attention to the most common types of practices.
The harmfulness and prevalence of dark patterns have also been noticed by EU lawmakers, who expressly banned such practices by online platform providers in Art. 25(1) of the Digital Services Act (Regulation (EU) 2022/2065 on a Single Market for Digital Services—DSA). The DSA entered into force on 16 November 2022, but most of the obligations in the regulation will apply from 17 February 2024. Therefore, the use of dark patterns may violate not only data protection laws (especially the General Data Protection Regulation) and consumer protection laws, but also (from February 2024) the Digital Services Act.
“Bossware” under labour and data protection law
The proliferation of remote work, combined with the development of monitoring technologies, has led employers around the world to implement various, sometimes technologically advanced methods to check employees’ performance and commitment to their work. In this area, IT solutions and programs commonly called “bossware” are gaining popularity.
In practice, bossware can include a variety of solutions and technologies, such as:
- Keyloggers monitoring the employee’s use of the keyboard on a company computer
- Downloading and analysis of screenshots from the employee’s business device
- Tracking mouse movements
- Constant or periodic observation of employees using the camera (e.g. eye movement) or microphone on a business device
- Tracking the employee’s online activity
- Monitoring the use of business email, calendar and business messaging
- Analysis of the performance of applications and programs run by the employee
A distinctive feature of bossware solutions is their frequent use of automated analysis, without involvement of the employees' superiors, to flag employees whose productivity, commitment or manner of work deviates from the norm expected by the employer.
Polish employers are also reaching for bossware. Below we describe what they should take into account in light of Polish labour law and data protection law.
Standard contractual clauses need to be updated by 27 December 2022
Entities transferring personal data outside the European Economic Area on the basis of the old standard contractual clauses, which are no longer in force (where the transfer began before 27 September 2021), should conclude agreements based on the new clauses by 27 December 2022.
Under the General Data Protection Regulation, the transfer of personal data to “third countries” (outside the European Economic Area) is only permitted if the conditions set forth in the GDPR are met, i.e. generally when:
- The transfer is made to a country which the European Commission has determined ensures an adequate level of protection (i.e. it has issued an adequacy decision—decisions issued so far are available on the European Commission website)
- If there is no adequacy decision, then adequate safeguards are provided, including in the form of conclusion of an agreement based on standard contractual clauses between the entities involved in the transfer
- If there is no adequacy decision or adequate safeguards, then one of the special circumstances specified in the GDPR applies.