The Malta Gaming Authority (MGA) has confirmed it is drafting what it expects to be the first dedicated artificial intelligence (AI) governance framework for gaming operators. The initiative comes as the forthcoming EU AI Act begins to reshape expectations around how advanced technologies are designed, deployed and supervised across regulated industries, including gambling.
For the gambling sector, where AI is increasingly embedded in everything from player support and fraud detection to marketing and risk management, the timing is deliberate. Operators are already under pressure to demonstrate compliance with evolving gambling regulations and responsible gambling standards, and the arrival of a comprehensive EU-wide AI regime adds another layer of complexity.
While the MGA’s framework will be voluntary, the regulator has positioned it as a practical tool rather than a conceptual policy exercise. It is intended to give licensees clarity at a point when AI systems are moving rapidly from experimentation into core operational use, while helping bridge the gap between today’s AI deployments and tomorrow’s EU regulatory obligations.
Introducing the initiative, the MGA explained that the framework is designed to provide a clear reference point for what responsible AI looks like in a gaming context. “As AI becomes increasingly embedded in operational, compliance and player-facing tools, it is essential that there is no ambiguity for operators about what good practice means in real terms,” MGA CEO Charles Mizzi says.
Why Malta is driving AI best practice
The MGA’s move carries significance because it oversees a jurisdiction that hosts a large concentration of Europe’s leading online operators, suppliers and platform providers. Malta’s licence remains a critical gateway to regulated markets, recognised by banks, investors and counterpart regulators alike.
That influence means regulatory direction set in Malta often informs broader industry behaviour. Decisions taken by the MGA can shape how operators interpret gambling regulations, structure compliance programmes and embed responsible gambling safeguards across multiple jurisdictions.
Crucially, the framework is being developed collaboratively rather than imposed in isolation. Input gathered from licensees through workshops, surveys and case studies – alongside cooperation with the Malta Digital Innovation Authority – is shaping guidance intended to reflect real-world operational constraints.
AI’s growing role across the industry has also introduced new categories of risk that traditional gambling regulations were not designed to address. “AI presents new risks – from biased outcomes to opaque decision-making – that could affect both players and the integrity of gaming,” Mizzi explains. “By taking the lead, the MGA can help ensure AI is used responsibly, reinforcing trust while protecting the sector.”
Balancing opportunity and player protection
Few technologies have generated as much optimism within the gambling industry as AI. Machine learning models can improve fraud detection, reduce false positives in anti-money laundering (AML) monitoring and help identify early signs of gambling-related harm. At the same time, poorly governed systems risk amplifying bias, enabling intrusive profiling or encouraging harmful play patterns.
For regulators, player protection remains the defining concern. Automated personalisation, behavioural analytics and real-time decisioning raise fundamental questions around fairness, transparency and accountability – all core pillars of responsible gambling.
The MGA’s position is that innovation is welcome, but only where outcomes are clearly aligned with player welfare and financial integrity. “We regulate for outcomes, not headlines,” Mizzi says. “AI is acceptable where it makes players safer and strengthens oversight, but it becomes unacceptable the moment it nudges vulnerability or obscures accountability.”
This thinking underpins the structure of the proposed AI Governance Framework. It is built around principles such as transparency, fairness, data protection, system robustness and clearly defined human oversight, all of which are intended to apply across a wide range of AI-driven use cases.
Human involvement remains central, particularly in high-impact decisions affecting players. “Where AI informs interventions or player protection measures, documented human review is essential to prevent unintended harm and maintain accountability,” Mizzi says.
Aligning with the EU AI Act
A defining feature of the MGA’s approach is its early alignment with the EU AI Act, which introduces a risk-based framework for regulating artificial intelligence across the bloc. Although the legislation applies horizontally across sectors, its implications for gambling are substantial.
Systems used for behavioural profiling, fraud detection, financial monitoring or player risk assessment are all likely to attract heightened scrutiny, particularly where they influence consumer outcomes or financial decision-making.
From the regulator’s perspective, early alignment offers operators a way to prepare rather than react. “From the outset, the framework has been mapped to the EU AI Act’s risk-based structure and core principles,” Mizzi explains. “That gives operators clarity and helps avoid the need to retrofit systems later.”
The MGA expects the most acute compliance challenges to emerge over the next 12–24 months as the Act moves from legislation into practical enforcement. Requirements around documentation, bias testing, model monitoring and traceability will be particularly demanding for operators using complex or third-party AI systems.
Third-party governance is another pressure point. While many operators rely on external vendors for AI capabilities, accountability under both gambling regulations and the EU AI Act remains with the licensee, driving the need for stronger contractual controls, transparency provisions and audit rights.
“Where AI systems fall into higher-risk categories, operators will need strong data governance, ongoing monitoring and genuine human oversight,” Mizzi says.

The value of voluntary action
In this context, the MGA’s decision to introduce a voluntary AI Governance Framework is a strategic one. Rather than waiting for prescriptive rules, the regulator is encouraging early engagement and shared ownership of emerging standards.
For operators, participation offers more than reputational benefit. Early alignment with the framework is expected to reduce disruption as EU requirements take effect, while also providing a platform to influence how responsible AI is defined in practice.
“Voluntary, for us, means creating space to lead,” Mizzi shares. “Operators who engage early are helping shape standards rather than reacting to fixed requirements later.”
In a market where scrutiny of AI-driven decision-making is intensifying, transparency is becoming inseparable from trust. Demonstrating how AI systems operate, how risks are managed and how player interests are safeguarded can support stronger relationships with regulators and customers alike. “Those who adopt the framework meaningfully set the tone for responsible innovation and build trust with players, partners and regulators,” Mizzi says.
Using AI to strengthen supervision
While the voluntary code is written for licensees, the MGA is also applying AI internally, developing an implementation roadmap for 2026–2027 that focuses on supervisory functions such as AML, player support and financial compliance. The aim is to improve efficiency and consistency while reinforcing safeguards.
In AML and financial crime supervision, AI tools are already showing value by analysing large transaction datasets and identifying anomalies more effectively than manual processes. “These systems allow us to focus resources on genuinely higher-risk activity rather than noise,” Mizzi explains.
In responsible gambling oversight, the regulator is exploring AI to assess how licensees’ policies align with regulatory requirements, while automation in financial compliance can reduce manual error and accelerate reporting cycles.
Taken together, these initiatives reflect a deliberate balance between innovation and safeguarding, with internal experience feeding back into external guidance on gambling regulations and responsible gambling expectations.
Mapping AI use across the industry
Alongside the framework, the MGA plans to launch an industry-wide initiative to better understand how AI is currently used – and how it is likely to be used in the future – across licensed operations. The objective is to inform proportionate, future-proof regulation rather than impose one-size-fits-all rules.
“This is about building a shared picture of how AI is transforming gaming today and where it is heading,” Mizzi says. “That insight allows regulatory expectations to evolve in step with real-world practice.”
AI literacy sessions will complement this work, helping operators navigate the EU AI Act and address common gaps around governance, bias and explainability. “Without a shared understanding, even well-intentioned operators can make mistakes,” Mizzi shares. “Education turns compliance into collaboration.”
Shaping the trajectory
As AI adoption accelerates, the stakes for the gambling industry continue to rise. Used responsibly, the technology can strengthen player protection, enhance compliance and support responsible gambling objectives. Used poorly, it risks eroding trust and inviting regulatory intervention.
By moving early, the MGA is attempting to shape that trajectory rather than react to it. “Our aim is for Malta to be a jurisdiction that leads responsibly,” Mizzi says. “This approach allows innovation to develop safely while ensuring we can monitor, guide and safeguard the sector as AI evolves.”
For operators, the message is clear: the next 12–24 months will be critical. Engaging early with the MGA’s AI Governance Framework may prove decisive in navigating a rapidly tightening regulatory landscape – and in demonstrating that innovation and responsibility can advance together.

Malta Gaming Authority (MGA) CEO Charles Mizzi
With the EU AI Act moving steadily towards implementation, the Malta Gaming Authority is preparing to publish the gaming industry’s first dedicated AI governance framework. In this extended interview, MGA CEO Charles Mizzi outlines why the regulator is taking the lead, the compliance challenges operators face in the next year, and how AI can be deployed responsibly without undermining player trust.