
Game chat crimes: Does the developer have a duty to report them?

Multiplayer games are an increasingly important market segment. Features allowing players to communicate during gameplay, such as voice communication and text chat, largely account for their popularity. But while player interaction is desired by players and game developers alike, it can also involve inappropriate and even unlawful behaviour. What is the provider of an online game to do in such a situation?

A chat in a multiplayer game can even be used to commit a crime. Therefore, providers of online games (in particular developers, but also publishers and distributors) need to be aware of the obligations this may entail. This is all the more important given that many gamers are young people who may be exposed to criminal activity by other users.

Player misconduct

One way of regulating behaviour in chat rooms and other instant messaging features is through appropriate policies or provisions in the game's terms and conditions. But such rules are not always in place, and in any event not all players will comply with them.

Improper player behaviours can vary widely, from inappropriate nicknames (e.g. containing swear words, or references to totalitarian systems or sexual activities) to vulgar, discriminatory or violent statements during the game.

Sometimes, player behaviour can take an extremely abusive form that constitutes a crime. There have even been cases of children being groomed in games by abusers intending to exploit them at a later stage.

Most frequently, information on inappropriate behaviour reaches the game provider in the form of a report submitted by other players or their parents.

Duty to respond to unlawful player behaviour

In principle, any entity providing a game for online play is a provider of services rendered electronically. Such a provider bears limited liability for content published by users, under the “notice and takedown” approach.

In other words, service providers can be held liable for actions of users of their service only if they have been made aware of the publication of unlawful content and have not promptly removed it. In principle, they have no duty to independently and actively monitor the content published by users for unlawfulness.

However, the situation is different when it comes to user behaviour that may constitute acts prohibited by criminal law. In such cases, one should consider whether the game provider is obliged to notify the appropriate law enforcement authorities of suspected criminal offences.

When a game is a space for committing crimes

In Poland, as a rule, natural and legal persons have only a social obligation to notify law enforcement authorities of a suspected crime. This means that in most cases a person (natural or legal) who has such a suspicion should report it to law enforcement authorities, but this is only an ethical obligation; failing to report does not entail negative legal consequences and in particular is not punishable.

However, the principle of a purely social obligation to report offences is subject to exceptions for about a dozen particularly serious offences identified by the parliament, including certain sexual offences against minors under the age of 15. For these crimes, considered by the legal system to be particularly serious and harmful, a higher standard of reporting is established for persons who become aware of such offences.

The higher standard means that a natural person (an offence under Criminal Code Art. 240 can only be committed by a natural person) who has credible knowledge of the preparation, attempt or commission of an offence included in the statutory list (Art. 240 §1) or a terrorist offence (under Art. 115 §20, an offence punishable by imprisonment with an upper limit of at least 5 years, committed with the aim of seriously intimidating many people, compelling public authorities to take or refrain from taking a specific action, or seriously disrupting public order or the economy, or a threat to commit such an offence) has a legal obligation to notify law enforcement authorities immediately, on pain of criminal liability punishable by imprisonment of up to three years.

There are exceptions to this rule. Failure to notify is not punishable if the person who failed to notify is the victim of the prohibited act, or failed to notify out of fear of criminal liability threatening that person or a family member. Moreover, a person does not commit the offence under Criminal Code Art. 240 §1 if he or she failed to notify while having sufficient grounds to believe that law enforcement authorities already knew about the criminal act being prepared, attempted or carried out. Nor is the offence committed by anyone who has prevented the commission of the prepared or attempted act.

What offences must a game provider report to law enforcement authorities in Poland?

Most of the offences included in the catalogue of prohibited acts that must be reported to law enforcement authorities under penalty of criminal liability cannot be committed in a chat room, and therefore information about such offences will rarely reach game providers.

The list includes crimes of a terrorist nature and certain crimes against humanity, against the Republic of Poland (coup d’état, espionage, an attempt on the life of the President of Poland), terrorist attack, crimes against life and health (murder, grievous bodily harm), causing a catastrophe, piracy, deprivation of liberty, hostage-taking, and certain sexual offences: aggravated rape (gang rape, rape of a minor under age 15, incestuous rape, rape with particular cruelty), exploitation of a helpless victim, and sexual offences against a person under 15 years of age, as described in Art. 200 of the Criminal Code.

However, some types of criminal acts could be committed in a game chat. For example, if a game is played or can be played by persons under 15 years of age, a prohibited act may be committed in the chat by presenting pornographic content to a minor under 15 or distributing such content in a manner allowing the minor to become acquainted with it (Criminal Code Art. 200 §3). Even if the in-game chat is purely text-based, without video or audio transmission, pornographic content could still be presented or disseminated there. Each case should be evaluated individually, analysing not only the content itself but also the context of the statements.

International problem, EU and Polish preventive action

These issues are not limited to Poland and its legal system. Similar regulations and obligations are in force in many countries around the world, some of them based on special laws for the protection of minors. Given the international nature of multiplayer games, it should be borne in mind that in other jurisdictions the obligations of game providers to monitor and respond to reports of possible crimes may be more stringent than under Polish law.

At the EU level, legislative work is being carried out on provisions imposing an obligation on providers of e-services to actively detect and report cases of online child sexual abuse. In Poland, the National Centre for Research and Development, the research institute NASK, the Warsaw University of Technology and Dyżurnet.pl are implementing a project called “System reacting to threats to children’s safety in cyberspace with special attention to child pornography” (acronym APAKT), under which a system and algorithm are being created to facilitate the detection and combating of illegal and sensitive content (more about this project from Dyżurnet and Enamor).

Developer’s responsibilities and best practices

It is crucial that the developer’s response to chat incidents be prompt but also appropriate. Just as it is punishable to fail to report certain criminal acts, it is also possible to be held criminally liable for making a false report or false accusation. When deciding to report a user to law enforcement, the game provider must also consider the issue of protection of personal interests, and the risk of possible civil claims for infringement, such as defamation.

Therefore, it is essential to be prepared in advance to respond to such incidents in an organised and proportionate manner. A good way to do this is to put in place an appropriate internal policy. On that basis, the company's employees will know how to assess and process reports of player misconduct received from other users, which reports to prioritise, and what verification actions to take. Such a policy should be drafted in consultation with a lawyer to ensure that it complies with all legal obligations and data protection rules.
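By way of illustration only, the sketch below shows how the triage logic of such a policy might be reflected in a support tool's report queue. It is a minimal sketch under assumed categories and actions; none of the names (AbuseReport, triage, the category labels) come from any real moderation tool or legal requirement, and an actual procedure should be designed with legal counsel.

```typescript
// Illustrative sketch only: a minimal triage structure for player-abuse reports.
// Categories, priorities and actions are hypothetical assumptions, not legal advice.

type ReportCategory =
  | "offensive_nickname"        // vulgar or extremist nicknames
  | "vulgar_or_discriminatory"  // abusive statements during play
  | "sexual_content_to_minor"   // possible Criminal Code Art. 200 §3 situation
  | "grooming_suspicion"        // possible sexual offence against a minor
  | "other";

interface AbuseReport {
  reportId: string;
  reporter: string;          // reporting player or parent
  reportedPlayer: string;
  category: ReportCategory;
  chatExcerpt: string;       // preserved evidence, handled under data protection rules
  receivedAt: Date;
}

interface TriageResult {
  priority: "standard" | "urgent";
  actions: string[];         // internal steps defined by the company's policy
}

// Assign a priority and the verification steps an internal policy might prescribe.
function triage(report: AbuseReport): TriageResult {
  const mayRequireNotification =
    report.category === "sexual_content_to_minor" ||
    report.category === "grooming_suspicion";

  if (mayRequireNotification) {
    return {
      priority: "urgent",
      actions: [
        "preserve chat logs and account data",
        "escalate to the designated compliance officer",
        "obtain legal assessment of a possible duty to notify law enforcement",
      ],
    };
  }
  return {
    priority: "standard",
    actions: [
      "review content and context",
      "apply moderation measures under the terms and conditions",
    ],
  };
}

// Example usage
const example: AbuseReport = {
  reportId: "r-001",
  reporter: "parent-via-support-form",
  reportedPlayer: "player-123",
  category: "grooming_suspicion",
  chatExcerpt: "[excerpt preserved by support team]",
  receivedAt: new Date(),
};
console.log(triage(example));
```

The key design point is simply that reports which may trigger a statutory notification duty are flagged and escalated separately from ordinary moderation matters.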

Maria Kozłowska, Jakub Barański