No longer a futuristic concept, AI is a present reality and here to stay. From AI-ready laptops to full AI-powered solutions, the sector is evolving fast. Let's explore some trends and legal implications of the use of AI by Luxembourg financial institutions in their daily operations.
Key trends in AI for financial institutions
AI is constantly moving beyond novelty: "As soon as it works, no one calls it AI anymore" (John McCarthy's quip, often called the AI effect). Generative AI, powered by public large language models, has accelerated the shift. Search and summary tools, chatbots, virtual assistants and AML tools are embedded across business operations and internal processes, from product design, portfolio management and back office through risk management and compliance to client services. These are no longer innovations. Some of the most significant AI trends, also identified by technology companies such as IBM (see What is artificial intelligence in finance, IBM) and Google (see AI trends 2025, Financial Services, Google), include:
- Advanced generative AI and large reasoning models (LRMs): performing complex analytics and scenario simulations, e.g. for portfolio optimization and forecasting, or for support functions such as transaction monitoring, stress tests, payment validation screening and fraud detection.
- Agentic AI: coordinated autonomous AI agents that manage complex workflows end to end – reasoning, planning, executing and remembering – e.g. for cash flow forecasting, and compliance monitoring and reporting.
- Multimodal AI: systems that learn from multiple data types (e.g. text, images and audio), delivering more accurate and customizable outputs.
- Assistive search: ensuring key employees receive data tailored to their specific roles.
- AI-powered operating systems: seamless integration of AI into core infrastructure and devices.
On why AI has become essential in the Luxembourg financial sector, Gary Cywie, Partner at Elvinger Hoss Prussen, says the following: "AI is no longer a futuristic concept. It is sparking changes in all processes, internally and externally. From regulatory compliance like for anti-money laundering with assisted transaction monitoring and fraud detection, to portfolio management with complex analytics and simulation."
With these trends, the focus is shifting from a fragmented approach – isolated AI tools scattered across business lines – to comprehensive, global AI strategies that orchestrate systems, agents and processes for end-to-end business transformation. In practice, this means businesses would focus on:
- Designing integrated AI environments tailored to their business models, interconnecting ICT systems and applications with existing AI tools and agents to manage entire workflows.
- Implementing robust controls from the deployment phase onwards, to ensure responsible AI use.
- Developing custom AI solutions internally or with external developers to ensure alignment with their unique models, needs and processes (according to the Second thematic review on the use of Artificial Intelligence in the Luxembourg financial sector survey conducted by the CSSF and the BCL, 60% of the reported AI use cases are developed internally, with and without external support).
As also discussed at the Elvinger Hoss 2025 Annual Fintech Conference which took place on 16 September 2025, financial institutions should set out a digital strategy and define their risk appetite in relation to the use of AI.
Gary Cywie adds: "AI powers the smart, personalised interfaces that clients now expect. That is why AI is essential for Luxembourg's financial institutions: to work more efficiently and to stay competitive."
Regulatory aspects: what you really need to know
While AI is transforming the way we work, the legal and regulatory framework in Luxembourg remains technology-neutral. Whether a process is performed with a traditional ICT tool or an AI solution, the same principles apply.
Anaïs Sohler, Partner at Elvinger Hoss Prussen, says the following on the main regulatory challenges associated with the use of AI: "Financial institutions must not only apply the same compliance principles regardless of the AI tool used but must also adapt their processes to the technology. For example, they must tailor their due diligence questionnaires to request from their provider the model documentation or information about the data used to train the model."
However, there are some important points to consider:
- No uniform terminology: the language around AI is still evolving, which can create confusion. Financial institutions should focus on understanding the function and impact of the tools they use and how they are developed, rather than commercial labels, noting that many traditional ICT tools today contain an AI component by default.
- "Simple AI" is no longer innovative: Financial institutions should consider defining a comprehensive AI strategy for multi-layered AI integration in their business models and aim at consistent improvement of operational efficiencies, rather than focusing on short-term simplification of daily tasks.
- Paradigm shift, but stable rules: adopting AI is not just about using a new tool – it will change how work is organised. Yet, from a compliance perspective, the core rules and principles (such as IT system compliance) remain unchanged and must be complied with. New interpretation, application and technology challenges may arise, but the core financial sector regulation does not as such depend on the specific type of ICT solution used.
- Understanding AI risks: the Luxembourg regulatory framework often applies a risk-based approach. In this context, even if the main categories of ICT risks remain the same whether AI is used or not (e.g. confidentiality risks, security risk, concentration risk), it is expected that financial institutions understand and manage the AI-specific risks involved, as well as the potential amplification of existing risks. This requires appropriate understanding of the AI technology used, even when relying on (group) ICT third-party service providers.
Regarding how financial institutions must manage the risks associated with AI, Anaïs Sohler concludes: "It is key that financial institutions understand and control AI-specific risks, even when simply buying off-the-shelf tools. This means maintaining a clear overview of the technologies used and integrating AI into a comprehensive risk management framework. For example, managing ICT security risks should include ensuring that it is not possible to inject prompts or manipulate AI output."
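To make the prompt-injection point more concrete, the toy Python sketch below illustrates the general idea of screening free-text input before it reaches a model and validating the model's output against an expected format before any downstream system acts on it. It is purely illustrative: the function names, the patterns and the "APPROVE"/"REJECT" output format are invented for this example and do not reflect any specific institution's controls.

```python
import re

# Hypothetical illustration of two basic guardrails around an AI tool:
# (1) screen free-text input for common prompt-injection markers before
#     it is forwarded to the model;
# (2) validate that the model's output matches a narrow expected format
#     before any downstream system acts on it.

INJECTION_PATTERNS = [
    re.compile(r"ignore (all|previous|prior) instructions", re.I),
    re.compile(r"reveal (your|the) system prompt", re.I),
]

def screen_input(text: str) -> bool:
    """Return True if the input shows no known injection marker."""
    return not any(p.search(text) for p in INJECTION_PATTERNS)

def validate_output(output: str) -> bool:
    """Accept only outputs in the expected format ('APPROVE' or 'REJECT')."""
    return output.strip() in {"APPROVE", "REJECT"}
```

Real-world controls are considerably more involved (pattern lists alone are easy to evade), but the principle carries over: constrain what goes into the model and verify what comes out before acting on it.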
For further inspiration, see some key questions to consider when adopting AI tools in the financial sector at https://elvingerhoss.lu/insights/publications/ai-strategy-considerations-financial-institutions
The article was originally published on the Paperjam website.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.