Malta Judge Cites AI Chatbot in Rent Dispute Ruling, Raising Questions About Fair Trials

Published February 21, 2026

Malta Judiciary Uses AI in Judgment: What a Footnote Reveals

A recently surfaced footnote in a Malta civil court judgment reveals that Judge Giovanni Grixti has directly referenced an artificial intelligence chatbot in a formal legal ruling—marking the first documented use of AI in a published Maltese judicial decision. In a rent dispute involving a Balluta apartment, the judge cited Gemini, Google's AI assistant, to determine the minimum wage in 1987, a figure that is critical under Malta's unique rent control system.

Why This Matters for Malta Residents

Malta has one of Europe's strictest rent control regimes. Under the Protected Tenancies Act, apartments leased before 1995 are subject to rent caps tied to historical wage data. If you live in a protected tenancy, common in areas like Sliema, St. Julian's, and Msida, the maximum rent your landlord can charge is calculated using 1987 minimum wage figures. A miscalculation could mean the difference between a €400/month ceiling and a €600/month ceiling, a gap that accumulates significantly over time.
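To see why a miscalculated ceiling matters, a quick back-of-the-envelope calculation shows how the gap accumulates. The €400 and €600 figures below are the illustrative ceilings from the example above, not actual statutory amounts:

```python
# Illustrative only: cumulative difference between two hypothetical
# rent ceilings under a protected tenancy.
low_ceiling = 400    # EUR per month (hypothetical)
high_ceiling = 600   # EUR per month (hypothetical)
years = 10

monthly_gap = high_ceiling - low_ceiling
total_gap = monthly_gap * 12 * years

print(f"Monthly gap: EUR {monthly_gap}")   # EUR 200
print(f"Over {years} years: EUR {total_gap}")  # EUR 24000
```

Over a decade, a €200 monthly error becomes a €24,000 difference, which is why the source of the underlying wage figure is not a trivial detail.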

The judge's decision to consult an AI tool rather than official archival records raises an urgent question: How accurate is machine-generated historical data in binding legal decisions?

What the Judgment Reveals

Judge Grixti's decision centered on a rent dispute governed by pre-1995 lease legislation. Rather than consulting archival records or the National Statistics Office, the judge turned to Gemini to retrieve the 1987 minimum wage figure—a detail that determines whether the landlord can lawfully increase rent under the Protected Tenancies Act.

The footnote disclosing the AI consultation is tucked into a multi-page judgment, but it represents a significant departure from traditional judicial method. Malta's courts have historically relied on paper archives, official government records, and legal databases for factual verification. Turning to an externally hosted large language model for a binding calculation introduces a new variable: the reliability of machine-generated data in adversarial proceedings.

The Broader Risk: Unverified AI in Court Decisions

For Malta's legal community and residents facing court cases, this ruling opens important questions. If a judge can cite an AI chatbot without corroborating the output against official government records, opposing counsel may feel entitled to do the same, at their peril. Legal professionals in multiple jurisdictions, including the United States and the United Kingdom, have faced sanctions and contempt proceedings for submitting AI-generated case citations that turned out to be fabricated, a phenomenon known as "hallucinations."

Malta's Legal Profession Act does not yet address AI use in court filings, leaving advocates without clear guidelines on disclosure or verification. The Chamber of Advocates has not yet issued formal guidance on the matter.

The practical risk for tenants and landlords is inconsistent rent calculations. If one judge uses Gemini, another consults ChatGPT, and a third relies on official statistics, each may arrive at a different 1987 wage figure. This variability could trigger appeals and undermine legal certainty—the principle that identical cases should yield identical outcomes.

Malta's AI Justice Initiative

The Grixti ruling is not an isolated incident. The Ministry for Justice, in partnership with the University of Malta's Faculty of Law, has launched initiatives to test AI-powered legal research tools. The Small Claims Tribunal, which handles disputes under €5,000, is piloting an AI system that scans written submissions, identifies relevant precedents, and flags contradictions—tasks that currently consume significant time per case.

Malta's government has indicated plans to develop Practice Directions for the judiciary that would mandate:

Disclosure of AI use in all judgments where the tool influenced factual findings or legal conclusions

Prohibition on confidential data input: Judges may not enter party names, case numbers, or unpublished evidence into public AI systems

Guidelines ensuring human verification of all AI-generated data before inclusion in rulings

The EU AI Act: New Rules Coming

Beginning August 2026, the EU AI Act will classify judicial AI systems as "high-risk," subjecting them to mandatory transparency disclosures, data governance audits, and human oversight requirements. As an EU regulation, the Act applies directly, and Malta has already begun adapting its domestic framework to its requirements.

Under the new regime, any AI tool used to "influence the outcome of judicial decisions" must:

Maintain an audit trail of every query and output

Display a clear warning when results are based on probabilistic inference rather than verified data

Allow parties to request human-only review if they believe an AI output affected their case
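The Act mandates logging but does not prescribe a concrete record format. As a sketch of what an audit-trail entry covering the three requirements above might minimally capture, the schema and field names below are illustrative assumptions, not part of the legislation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIAuditEntry:
    """Hypothetical audit-trail record for a judicial AI query.

    Field names are illustrative; the EU AI Act requires logging
    and human oversight but does not define this schema.
    """
    tool: str                 # e.g. "Gemini"
    query: str                # the prompt submitted to the tool
    output: str               # the tool's response, verbatim
    probabilistic: bool       # True if the output is inferred, not verified data
    human_verified: bool      # whether a human checked the output
    verification_source: str  # official record used for corroboration, if any
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Example: the kind of entry the Grixti footnote would have generated
# had such a regime been in force (contents hypothetical).
entry = AIAuditEntry(
    tool="Gemini",
    query="What was Malta's minimum wage in 1987?",
    output="(model response)",
    probabilistic=True,
    human_verified=False,
    verification_source="",
)
print(entry.tool, entry.probabilistic, entry.human_verified)
```

A record like this would let an appellate court or a party reconstruct exactly what was asked, what came back, and whether anyone checked it, which is precisely the information the Grixti footnote omits.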

What This Means for Your Court Case

If you are involved in a Malta court case—particularly a rent dispute under the Protected Tenancies Act—consider asking your lawyer these practical questions:

Is AI being used in my case? You have a right to know if an AI tool influences factual findings or calculations that affect the outcome.

Was the AI output verified? Ask whether the judge or your opponent's counsel corroborated any AI-generated information against official sources.

Can I request human-only review? While not yet formally codified in Malta's Code of Organization and Civil Procedure, fair trial rights may entitle you to this protection.

For tenants in protected tenancies, transparency about wage calculation methods is especially critical—a miscalculation based on unverified AI data could unfairly lock you into years of incorrect rent payments.

The Verification Gap

Critically, Judge Grixti's ruling does not disclose the exact prompt submitted to Gemini, nor does it specify whether the AI's answer was cross-referenced against official records. The judgment states only that the tool was "consulted for historical wage data"—a phrase that leaves significant room for interpretation.

Legal scholars argue that any AI-assisted factual finding should be accompanied by evidence showing corroborating sources. This standard is beginning to emerge in other jurisdictions as courts grapple with the same challenge: Can judges trust algorithmic outputs without independent verification?

A Turning Point for Malta's Courts

The Balluta apartment dispute, seemingly routine on its face, has become a test case for a legal question affecting all EU courts: Can judges outsource factual research to AI systems while preserving the right to a fair trial? For Malta residents—especially those navigating rent disputes, tenant rights, or any case involving historical data or calculations—the answer matters profoundly.

For now, the judgment stands. But the precedent is set, and both residents and their legal advisors must understand the implications: AI is now present in Malta's judicial system, and transparency about when and how it is used remains urgently incomplete.

The Malta Post is an independent news source.