Recently, the European Union (EU) has stepped up the fight against the economic exploitation of video game players, including children, through initiatives that are both directly and indirectly relevant to the video game sector. First, the Consumer Protection Cooperation Network (CPC), coordinated by the European Commission (EC), launched an enforcement action against a video game provider and simultaneously released seven key principles on in-game virtual currencies, interpreting current EU consumer rules in the video game context. The EC then launched a call for feedback on its draft guidelines for online platforms on the protection of minors online under the Digital Services Act (DSA). In this blog post, we dive into these initiatives and analyse the legal value and impact of these policy moves towards safer online video games, looking specifically at virtual in-game currencies. Looking beyond, we consider how they fit into the context of the Digital Fairness Act proposal.
The video game scrutiny momentum
There is currently a clear momentum in which the EU is examining the video game sector more closely, with the aim of providing greater protection to players. Video games have become the largest entertainment sector globally, driven by rapid technological advancement, yet the sector is complex to regulate and has a strong tradition of self-regulation.
Indeed, the existing EU legal frameworks applicable to video games are scattered but comprehensive. However, because these legal instruments do not directly address gaming-specific issues, the sector needs greater recognition of its specificities in the interpretation of current legal frameworks, for reasons of legal certainty. In addition, regulating the industry within the EU poses further challenges due to the interplay of diverse national laws with EU legislation (regulations and directives), and due to the rapidly evolving nature of gaming, including the emergence of new gaming business models, gaming platforms and game streaming platforms, underlining the importance of platform regulation.
The EU, through the EC and the national consumer protection authorities (supported by the EC through the CPC Network), has recently stepped up its efforts towards greater online safety for gamers, echoing calls from research underlining the numerous risks faced by all types of players.
Academic warnings
Video game users are digital consumers whose rights need protection. However, the interpretation and application of consumer protection rules in the gaming context are not always straightforward. Online games often blend entertainment with complex monetisation strategies, such as in-game purchases, loot boxes, and time-limited offers, which can obscure the line between fair marketing and exploitative practices, especially when children are involved.
“Players increasingly face risks of economic exploitation due to wide deployment of manipulative and exploitative designs in contemporary video games. The widespread adoption of free-to-play (F2P) games has driven game providers to rely heavily on in-game purchases and targeted advertising, incentivizing the use of manipulative “dark patterns” and exploitative designs to maximise player engagement and spending. Some games — particularly those accessed through mobile phones — rely on the extensive collection of personal data to deploy highly personalized marketing strategies. These practices raise serious concerns about privacy violations but also the exploitation of players’ vulnerabilities. The resulting risks include heightened susceptibility to addiction, excessive spending and play-time, and long-term psychological, social, and physical harm.”
Enforcement Authorities' Attention
In light of the growing academic attention to risks stemming from gaming, enforcement authorities have started to dig deeper into some of the challenges observed. This includes the consumer protection risks stemming from in-game virtual currencies.
In December 2023, the Netherlands Authority for Consumers and Markets (ACM) fined Epic Games a total of €1.125 million for consumer protection violations in Fortnite related to “misleading and aggressive” monetisation and advertising practices. A year prior, the U.S. Federal Trade Commission (FTC) reached a landmark settlement of $520 million with Epic Games for infringement of Section 5 of the FTC Act, due to unfair billing practices, and for violations of children’s privacy under the Children’s Online Privacy Protection Act (COPPA).
In autumn 2024, TikTok came under investigation by the United Kingdom’s Financial Conduct Authority (FCA) over its Coins system. The system was suspected of being used for money laundering and terrorism financing, as it did not follow the checks and regulations applied to traditional cryptocurrency exchanges. The investigation was also motivated by data security and privacy risks linked to these transactions.
Star Stable Case & Consumer Protection Cooperation Network (CPC) Key Principles
In March 2025, the EU Consumer Protection Cooperation Network (CPC) launched an enforcement action against Star Stable Entertainment AB. The aim of the action is to ensure a safer, more transparent experience for players of Star Stable Online, a horse-themed massively multiplayer online role-playing game. The CPC Network found that Star Stable Entertainment AB engaged in several practices that violate EU consumer protection laws, particularly practices that could harm children, such as manipulative advertising, pressure tactics, and unclear information about in-game purchases.
To clarify the application of consumer protection laws in the area of online games, the CPC Network (the body composed of EU national authorities responsible for enforcing EU consumer protection legislation) issued seven key principles alongside the enforcement action. These principles are intended to help the gaming industry comply with the EU consumer protection rules related to in-game virtual currencies. While not legally binding, they reflect the interpretation of the EU consumer authorities and aim to provide guidance to the video game sector on how to comply with EU consumer rules. Failing to observe these principles may lead to enforcement action and penalties.
The Principles
Along with the enforcement action notice, the CPC Network issued seven key principles concerning in-game virtual currencies:
- Price indications should be clear and transparent.
- Practices obscuring the cost of in-game digital content and services should be avoided.
- Practices that force consumers to purchase unwanted in-game virtual currency should be avoided.
- Consumers should be provided with clear and comprehensible pre-contractual information.
- Consumers’ right of withdrawal should be respected.
- Contractual terms should be fair and written in plain and clear language.
- Game design and gameplay should be respectful of different consumer vulnerabilities.
The enforcement of these principles by national authorities, however, remains uncertain, in particular for the provision of pre-contractual information (principle 4) and the right of withdrawal (principle 5), which are based on provisions of the Consumer Rights Directive (CRD). The application of that Directive is indeed conditional on the existence of a contract between a trader and a consumer, while such existence is determined by national contract law. The legal qualification of the purchase of virtual items with in-app currencies is, therefore, a case-by-case assessment under national law, which could lead to fragmented enforcement.
The DSA guidelines on the protection of minors
The European Commission also addressed virtual currencies in its draft guidelines on Article 28 of the Digital Services Act (DSA). Article 28 imposes obligations on online platforms to protect minors on their services through a high level of privacy, safety and security. In addition, the provision prohibits advertising based on the profiling of minors.
While the guidelines have a broad scope and focus on various forms of addictive design, they also specifically address virtual currencies. To ensure the transparency of economic transactions in an age-appropriate way, the Commission considers that providers of online platforms should “avoid the use of intermediate virtual currencies, such as tokens or coins, that can be exchanged with real money and then used to buy other virtual items, which can have the effect of reducing transparency of economic transactions and may be misleading for minors”. From that wording, however, it is unclear whether the Commission calls for a complete ban on intermediate virtual currencies in platforms accessible to minors, or only in situations where the use of these currencies can mislead minors about the real price of virtual items. If providers of online platforms apply the CPC’s key principles (e.g. displaying the real cost of virtual items in real money), would that information suffice to protect minors?
The consultation closed on 15 June. The Gaming and Regulation Working Group submitted feedback, which is accessible here. The final guidelines are expected in summer 2025.
Ways forward – is a Digital Fairness Act a necessary legal move?
In 2022, the EC launched a Fitness Check of EU consumer law on digital fairness in order to assess whether the current EU consumer rules are fit to ensure a high level of consumer protection in the digital environment. As a follow-up to this Fitness Check, the European Commission is expected to release a proposal for a Digital Fairness Act (DFA). This new piece of legislation is meant to further tackle dark patterns in digital services, including in video games, and a call for consultation on the DFA should open soon. The DFA is expected to include legal provisions addressing harmful online practices, including manipulative and addictive design features, although the exact scope and content of the proposal remain to be seen. One can wonder why such an instrument is deemed necessary and what gaps it could fill. Let us zoom in on the relevant legal frameworks that help achieve digital fairness, and on their current limitations.
EU consumer rules
Alongside virtual currencies, a wide range of other practices can also lead players into excessive or unwanted spending or addictive behaviours. In its resolution on the addictive design of online services, the European Parliament warned about the negative impact that “addictive designs” can have on consumers, especially younger ones. Among these designs, the EP mentioned some that are particularly relevant in a gaming environment, such as content that is temporarily available (ephemeral content), various incentives for continued engagement (e.g., badges, rewards) or, conversely, penalties for disengagement, and, more generally, interaction-based recommender systems and notifications delivered during or outside of the consumer’s interaction with the digital product or service. In the Fitness Check of EU consumer law on digital fairness, the EC specifically pointed to addictive designs in gaming, in particular the sale of virtual items, including uncertainty-based rewards (e.g., loot boxes), and the use of intermediate in-app virtual currencies. The aim of this Fitness Check was to assess whether existing EU consumer law adequately addresses the new challenges posed by the fast pace of developments in digital markets and the increase in EU legislation in the digital sector. The Commission concluded that existing “EU consumer law cannot be considered as sufficiently clear or effective in addressing the multifaceted harms resulting from interface designs and functionalities that induce digital addiction, which impairs consumer decision-making and puts vulnerable consumers, in particular minors, at a heightened risk.”
Although the Unfair Commercial Practices Directive (UCPD) could address addictive designs under the Commission’s broad interpretation of a consumer’s ‘transaction decision’, which includes not only purchasing decisions but also decisions to continue using a service, this interpretation is not explicitly stated in the UCPD’s legal provisions. The UCPD only mentions the economic interests of consumers, and there have been varying national rulings on whether non-material harms from unfair commercial practices can be redressed. It is, therefore, unclear whether the UCPD provisions can apply to addictive designs which, although leading to excessive or unwanted spending of time, do not result in a purchase.
Intermediary Services Regulation
Beyond consumer law, the Digital Services Act (DSA) provides protection against addictive designs, in particular where they concern minors. Art. 28 requires providers of online platforms to ensure a high level of privacy, security and safety for minors on their platforms. In its draft guidelines on Art. 28, the European Commission highlights that minors should not be exposed to practices that can lead to excessive or unwanted spending or addictive behaviours, which entails preventing minors from accessing uncertainty-based rewards (e.g. loot boxes). There is growing evidence of the negative effects these features have on consumers, particularly young people, who are more vulnerable to their addictive nature. In parallel, Arts. 34 and 35 require very large online platforms (VLOPs) and very large online search engines (VLOSEs) to identify and mitigate systemic risks, including any actual or foreseeable negative effects on the rights of the child, on a high level of consumer protection, and any serious negative consequences for a person’s physical and mental well-being. VLOPs and VLOSEs shall, therefore, ensure that designs which can induce digital addiction are absent from their services. Nevertheless, these provisions only apply to services which qualify as “online platforms” under Art. 3(i) of the DSA, which leaves a wide range of video games out of scope.
AI regulation
Finally, the prohibitions outlined in Article 5 of the AI Act could apply to certain addictive designs if they involve using an AI system that is intentionally manipulative or deceptive, or that exploits vulnerabilities related to age, disability, or a specific social or economic situation. However, the use of such AI systems must aim to, or result in, significantly distorting the behaviour of a person or group, causing or being likely to cause significant harm. This requires proving the existence of significant harm, which can be challenging in practice.
Conclusion
The Court of Justice of the European Union (CJEU) has acknowledged that video games are complex cultural products, combining technology and creative content. This complexity is reflected in a fragmented regulatory landscape that, while not specific to video games, still applies to the sector. Greater recognition of gaming in the interpretation of current legal frameworks is necessary. The CPC Network’s interpretative document and the EC guidelines on the protection of minors contribute to bridging this gap. They constitute an important step forward towards more transparent and fair gaming ecosystems, and should serve as a wake-up call for the industry to take gamers’ protection seriously.
Let us not overlook that in-game currencies and dark patterns raise concerns not only for consumer protection but also under data protection and media law. Addressing these challenges will require robust cooperation and coordination between different regulatory authorities, whose mandates increasingly overlap in the context of video games. In addition, since some instruments, such as the UCPD, are directives, much depends on national implementation and enforcement by the Member States. There is a real need for consistency to avoid fragmentation in the protection of gamers.
In this context, the impact of the forthcoming DFA on the protection of gamers as consumers will have to be monitored once the proposal is published. The DFA should resolve the terminology issues by clearly delineating the identification criteria and distinctive elements of dark patterns and manipulative design. Conceptualising vulnerability with more legal certainty is also necessary: being vulnerable extends beyond being a minor, and the exact scope of the protection of vulnerable individuals remains to be clarified. Finally, giving a strong role to an existing authority such as the CPC, to ensure the correct integration and interplay of this new legislation with the existing EU legal framework, would be valuable.
As video games continue to evolve into complex, immersive digital environments, regulators must keep pace to ensure consumer rights are upheld and no critical blind spots are left behind.
Research Context
The research for this blog post has been funded by the following research projects focusing on ethical and legal considerations relevant to video games: PROGRESS and i-Game (European Union’s HORIZON research and innovation programme under grant agreement No 101132449).
Thanks to these research projects, Noémie Krack & Martin Sas are part of the Gaming & Regulation Working Group coordinated by New York University (NYU). The aim of the working group is to advance constructive regulation of the video games industry. The working group brings together regulators, representatives of the gaming industry, and civil society researchers on a weekly basis to discuss and achieve consensus on concrete regulatory measures needed to address harms in online gaming.
The Working Group just launched a monthly newsletter entitled the Gaming & Governance Brief. The newsletter will keep its subscribers informed on the group’s latest activities, provide expert insights into gaming regulation and governance, and highlight valuable resources for deeper understanding.
You can subscribe to the newsletter here. Missed an issue? Catch up on past editions.
Bios
Martin Sas is a doctoral researcher at the Centre for IT and IP law (CiTiP). Under the funding of the FWO PROGRESS project, he focuses his research on the protection of children’s privacy in video games, with the objective of building a rights-based privacy risk rating system for online games. Martin explores a wide range of topics, including data protection regulation and children’s rights, age-appropriate designs, behavioural designs and dark patterns, age verification systems, privacy-enhancing techniques, and risk management methodologies.
Noémie Krack is a PhD researcher at the Centre for IT and IP law (CiTiP) of KU Leuven – imec. Her research focuses on media law, artificial intelligence and the challenges that technology raises for fundamental rights. In her PhD, she explores the evolution of EU regulation on non-consensual sexually explicit and intimate deepfakes (NCSID). She works and has worked on several European Union interdisciplinary projects on technology and media, such as i-Game, AI4Media (Artificial Intelligence for the Society and the Media Industry) and MediaFutures. She is also an Editor-in-Chief of the Editorial Board of the AI Media Observatory, which publishes content on the impact of AI in the media sector from an interdisciplinary perspective.
Photo (licence free): Pexels