How recent laws impact game design, from in-game chat to notifications
2026 is already proving to be the year when the reality of video game regulation hits home for the industry.
Games studios are suddenly confronting a host of new and impending regulations, and it can feel like an impossible task to sift through the alphabetti spaghetti of laws, rules and guidelines (OSA, DMCCA, DFA, DSA, AI Act, CPC…) to know what this actually means for the day-to-day practicalities of developing and releasing games.
With that in mind, here are some of the key game features that are being (or are soon to be) impacted by all these fun new acronyms.
1. Dark Patterns
Dark patterns are at the heart of many of the laws currently being applied to game features. In short, they are design strategies which seek to manipulate players into taking actions they might not otherwise want to take.
Think of them like old school, rigged carnival games which are manipulated to make winning harder than it appears: the digital equivalent of weighted bottles, underinflated balloons or rings that are too small for the pegs. Dark patterns deceive, obfuscate and put emotional pressure on players to get them to give more of their time and (in many cases) money to the game.
For example, this could be to encourage impulse purchases, to make quitting or opting out unduly difficult, to create FOMO (“all of your friends are doing it, don’t miss out!”) or to create false urgency with countdown timers. They aren’t unique to games – you can see examples of them in everything from online adverts to political fundraising requests – but they are prohibited by several different laws, including the Digital Markets, Competition and Consumers Act 2024 (DMCCA) and the EU’s Digital Services Act (DSA). The upcoming Digital Fairness Act (DFA) is only sharpening attention on these practices.
2. UI and prompts
This escalating scrutiny means that studios should tread very carefully with any design features that look like dark patterns.
Activision Blizzard, for example, has come under scrutiny from the Italian Competition Authority for allegedly engaging in “misleading and aggressive practices” in Diablo Immortal and Call of Duty Mobile.
The authority cited possible “deceptive user-interface design aimed at inducing consumers to play more often, extend their gaming sessions and take up promoted offers”. It gave the examples of repeated prompts urging players to purchase time-limited items and rewards, and an opaque virtual currency system which allegedly made it difficult for players to understand the real value of in-game currency.
The authority is also looking into Activision Blizzard’s “aggressive” pre-set control features, which automatically default to settings offering lower protection to underage players (such as allowing in-game purchases, unlimited play time and interaction with other players).
What to do
- Avoid overly-emotive prompts, especially if they give a sense of urgency or social pressure.
- Keep any prompts in-game. Push notifications outside of gameplay, for example, should be carefully thought through and avoided if possible. All the more so if your game is played by children.
- Make any default settings clearly accessible when starting up the game, avoid defaulting to inappropriate settings and avoid encouraging players to select options which may not be in their best interests.
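To make that last point concrete, here is a minimal sketch in Python of what “protective by default” settings for an underage account might look like. The field names are purely illustrative; nothing here comes from any real platform SDK or from the rulings above.

```python
from dataclasses import dataclass

@dataclass
class MinorAccountSettings:
    """Hypothetical 'safe by default' profile for an underage player."""
    in_game_purchases: bool = False        # off unless a parent opts in
    chat_with_other_players: bool = False  # interaction off by default
    unlimited_play_time: bool = False      # session limits apply by default
    push_notifications: bool = False       # no out-of-game prompts

# A fresh minor account starts on the most protective settings;
# anything less protective must be an explicit, informed opt-in.
settings = MinorAccountSettings()
```

The point is the direction of the defaults: the protective option holds until someone actively changes it, rather than the other way around, which is what the Italian authority objected to.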
3. Loot boxes
Loot boxes are once again firmly back under the spotlight, if indeed they ever left it. There is a lot of overlap with the overarching dark pattern restrictions mentioned above, and some recent loot box advertising rulings deserve some particular attention.
Firstly, you should be aware that if there are any loot boxes in your game then you will need to make sure that the game listing itself clearly mentions this.
Kabam, Nexters and My.Games recently fell foul of this in the UK and the Netherlands with Marvel Contest of Champions, Hero Wars: Alliance RPG and Rush Royale: Tower Defense TD.
The listings for Marvel Contest of Champions and Hero Wars: Alliance RPG mentioned “Offers In-App Purchases” but did not specifically highlight loot boxes, which is considered important information for consumers to have before downloading a game. As a result, the Advertising Standards Authority (ASA) determined that the listings were misleading, and in the Netherlands a similar ruling was made against My.Games for the Rush Royale: Tower Defense TD App Store listing, which also failed to disclose the presence of loot boxes.
Secondly, the probability of winning prizes needs to be clear, as My.Games also found out from the Dutch advertising regulatory body, and as Cognosphere found out from the FTC last year with respect to Genshin Impact.
For example, a roulette style wheel which shows prizes in equal segments implies that there is an equal chance of winning each prize. If that is not the case, then this could be deemed misleading and each segment size would need to be adjusted to more accurately reflect the likelihood of a player winning the prize.
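As a rough sketch of what “adjusting segment sizes” means in practice, the wheel’s geometry can simply be derived from the real probabilities. The drop rates below are made up for illustration:

```python
def segment_angles(drop_rates: dict[str, float]) -> dict[str, float]:
    """Size each prize's wheel segment (in degrees) in proportion to
    its actual drop rate, so the visual odds match the real odds."""
    total = sum(drop_rates.values())
    return {prize: 360 * rate / total for prize, rate in drop_rates.items()}

# Made-up drop rates for illustration:
angles = segment_angles({"legendary": 0.01, "rare": 0.09, "common": 0.90})
# The legendary segment comes out at roughly 3.6 degrees, not the 120
# degrees that an equal-thirds wheel would imply.
```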
Lastly, watch out for any loot boxes which give prizes that have any real-world value, as this could risk tipping a loot box into regulated gambling territory.
At a high level, in the UK the chance-related element of loot boxes would need to be combined with real world money going in and going out in order for gambling regulation to be triggered (although there are nuances to this formula, so this over-simplification should be taken with a pinch of salt).
There are even countries like Belgium and the Netherlands taking things a step further and either declaring loot boxes as gambling or outright banning them (even if actual enforcement of this can be rather sporadic).
What to do
- Clearly disclose the presence of any loot boxes in any game ads and listings. Make this disclosure easy to find and visible at the point of any purchase or download decision (such as in a separate line, bolded or in capital letters).
- Do not imply equal chances of winning prizes with no further explanation if probabilities differ. Provide clear signposting for probability information (avoid relying solely on icons like “?”), and make odds accessible within one click, with plain language explaining these odds.
- In its simplest form, studios should avoid the “money in + game of chance + money out” formula.
- Watch out for territories such as Belgium and the Netherlands if you are considering loot boxes for your game.
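On the probability-disclosure point, a simple internal check like the following sketch can both validate a loot table and generate the plain-language odds lines for a one-click disclosure screen. The item names and rates are illustrative:

```python
def odds_disclosure(loot_table: dict[str, float]) -> list[str]:
    """Validate a loot table and produce plain-language odds lines,
    suitable for a one-click probability disclosure screen."""
    total = sum(loot_table.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"probabilities sum to {total}, not 1")
    return [f"{item}: {p:.1%} chance" for item, p in loot_table.items()]

# Made-up loot table for illustration:
lines = odds_disclosure({"legendary": 0.01, "rare": 0.09, "common": 0.90})
# e.g. "legendary: 1.0% chance"
```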
4. Virtual currencies
Once again dark patterns are at the core of the increasing regulatory scrutiny of virtual currencies.
The eyes of regulators are alert to “drip pricing” (having an attractive headline price upfront, but with incremental additional costs waiting in the wings), difficult cancellation processes and misleading urgency.
When thinking about virtual currencies, remember that at its core a virtual currency purchase is a consumer transaction. This means that the golden rule of virtual currencies is transparency.
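To illustrate the transparency point, here is a small sketch that converts a virtual-currency price back into real money at the bundle’s exchange rate, so the real-world cost can be shown alongside the gem price. The bundle numbers are made up:

```python
def real_money_price_pence(item_cost_gems: int, bundle_gems: int,
                           bundle_price_pence: int) -> int:
    """Real-money cost of an item priced in gems, at the bundle's rate.
    Works in integer pence to avoid floating-point rounding surprises."""
    return item_cost_gems * bundle_price_pence // bundle_gems

# Made-up numbers: a 500-gem skin, where 1,000 gems cost 799p (£7.99)
cost = real_money_price_pence(500, 1000, 799)  # 399p, i.e. about £3.99
```

Showing this figure at the point of purchase is one way to avoid bundles obscuring the real-world value of what is being bought.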
We know this is something that EU regulators are especially keen on, following the release of the CPC Principles. Even though these principles are non-binding, they give a clear indication of where the European Commission’s attention is focussed.
The other key risk when it comes to in-game currencies is that fateful combination of real-world value plus an element of chance. This could bring mechanics into gambling-adjacent territory, which game studios will want to be careful to avoid.
What to do
- If in-game currency can be purchased with real-world money, there should be clear disclosure of real-world pricing, including where bundles obscure the actual real-world value of what is being bought.
- There also need to be clear terms in user-friendly language and internal processes regarding refunds, expiry and unused currency.
- Watch out for virtual currencies that can be purchased and spent rapidly, especially if minors can access the game as there are heightened standards around transparency and parental consent. Learn from Activision Blizzard and avoid default settings allowing minors to make in-app purchases.
- Remember, if there is real world money going in and out of the game, watch out for gameplay with an element of chance. You should be looking to close that loop and/or at least remove the element of chance.
Although, in the current regulatory climate, even if studios do remove one of these elements they may still be left to grapple with further regulatory headaches. So if in doubt: speak to a lawyer.
As a recent judgment concerning gold pieces in RuneScape highlighted, even if there is no real money going in or out and the game is designed to be a “closed loop”, there may still be third party grey markets which could result in your in-game currency having real-world value in the eyes of the law.
5. Player content and communications
Moving on from dark patterns to other forms of online safety: user-to-user communications. Think in-game chat, or any feature which allows players to interact in some form with other players, which could also include a waiting lobby and user-generated content.
If your game has online features, moderation is everything. Gone are the days when studios could wait for players to report bad content and then react. Now it is on the studios to take an active role in moderating harmful user content.
The reporting of bad behaviour should still be allowed and having a clear, easy-to-access reporting tool for players to flag inappropriate content is still essential. Studios should then also have clear internal processes to swiftly deal with any flagged content.
What to do
- Private communication between adults and minors should be prohibited by default.
- If you really need to allow adults and minors to interact, tread carefully, put up all the guardrails you possibly can and talk to a lawyer (I know I would say that but, really, you should).
- Include a way for players to easily turn off in-game chat.
- Studios now also have an active duty to assess the potential risks that their features might pose to minors or other vulnerable players (this is where the Online Safety Act’s risk assessments come into play). So do a risk assessment and then put in place robust mechanisms to mitigate and manage those risks, often including age verification and age gating.
Age verification and gating is a way of mitigating the risk of minors accessing inappropriate content, sure, but this still does not absolve studios of ongoing moderation duties to keep user content legal. Anyone who has been involved in the mighty task of content moderation knows that this is no small ask, most likely requiring AI-assisted moderation tools.
Which, to add a further layer of fun, brings us neatly to the topic of AI.
6. Labelling AI-generated content
While there is a host of ever-shifting AI compliance requirements and best practices behind the scenes (such as clearing your work, checking the T&Cs of any third party tool you are using, mitigations to overcome potential ownership issues, and so on) we will save those delights for another day and another article.
For now, there is an important and impending new labelling requirement for AI-generated assets, in the form of the EU AI Act’s final suite of provisions which come into force in August this year. Although the European Commission is looking to potentially delay this until 2027, so watch this space.
Any video, image or audio content that has been created with deepfake technology will need to be labelled as such. Deepfake-style models are already used in video game development as part of performance capture, voice cloning and localisation, so the question then is how to label it in a way that is not glaringly disruptive to the game.
The AI Act does qualify that if this technology is used as part of a clearly artistic, creative, satirical or fictional product (as would be the case for games) then there is some wiggle room to allow for this labelling to be made in an “appropriate manner that does not hamper the display or enjoyment of the work.”
There is not yet any consensus on the form of these labels. The European Commission is working on releasing a Code of Practice by… June. That is, two months before the provisions come into force. So fingers crossed these labelling requirements are indeed pushed to 2027, as otherwise developers will not have much time to adopt the Code of Practice’s recommendations.
But given the above “creative wiggle room rule” (not an official term), we can expect at least that a big flashing sign during cutscenes and gameplay saying “Watch out, this was created with deepfake technology!” is probably not what the European Commission has in mind. Our hope is that an appropriate disclaimer at a suitable point in the game would be sufficient for most games, but we will have to wait for further EC guidance to know for sure.
What to do
- Keep a careful note of any end user-facing content in your game which was created through deepfake technology and, if your game is looking to release any time soon, be ready to add a disclaimer or implement some form of subtle labelling.
- Watch out for further EC guidance in June.
Anna Poulter-Jones is a games lawyer at Sheridans. She advises games studios of all sizes on a variety of commercial, regulatory and intellectual property matters. She is also a Board Trustee of the National Videogame Museum and a member of the Appeals Panel for the Games Rating Authority.
