Regnology

Regnology Chatbot: answering complex regulatory questions

Logo-regnology

As a leading global provider of software for regulatory reporting and management, Regnology is helping both regulators and companies to ensure their regulatory processes run as efficiently as possible.

Find out more
Regnology_International happy professional business team at group office meeting. By insta_photos_2_envato

Catch up quickly

Regnology collaborated with ML6 to develop a production-grade chatbot powered by Retrieval-Augmented Generation (RAG). Built on Google Cloud, the solution ensures scalability, guardrails against misuse, and automated data pipelines for up-to-date responses. The chatbot empowers both support teams and customers with faster, more reliable answers to complex regulatory questions, improving service quality and scalability.

Continue reading

About this client

Regnology is a leading global provider of innovative software solutions for regulatory reporting, tax reporting, and supervisory management. Serving both regulators and financial institutions, Regnology helps organizations navigate increasingly complex compliance requirements while ensuring transparency, accuracy, and efficiency. With a presence across multiple countries and decades of expertise, the company supports a broad customer base in adapting to evolving regulations and scaling their operations securely and effectively.

Impact

The Regnology Chatbot helps answer complex questions in the fast-moving and time-sensitive regulatory world. By answering customer questions faster and with high quality, this solution ultimately accelerates Regnology’s ability to scale their operations and serve more customers.

Advisory_ML6_team

Challenge

The regulatory environment is complex; and this complexity is increasing as Regnology continues to scale globally, integrating new and evolving policies. Helping clients to navigate these frequent updates is a challenge for Regnology’s support team, who need to consult the ever-growing documentation of the platform to answer questions. As regulatory questions are often time-sensitive, this adds to the challenge.

By integrating ML6's Retrieval-Augmented Generation (RAG) expertise, we have significantly enhanced the accuracy and relevance of our responses, ensuring our clients receive precise and up-to-date information. This advanced approach has enabled us to swiftly and seamlessly provide our clients with AI-augmented access to our knowledge base, thereby enhancing their ability to navigate the complex regulatory landscape efficiently
regnology steffen
Steffen DangmannDirector Cloud & AI Engineering, Regnology

Solution

Therefore, Regnology decided to invest in an AI-powered chatbot to support both their internal support team and their customers. The Regnology Chatbot aims at providing high quality responses to these complex regulatory questions, faster.

The solution leverages the Retrieval Augmented Generation (RAG) approach to ensure responses are grounded in the large corpus of Regnology documentation. To build a production-grade application, the project focused on:

  • Scalability

    the Regnology Chatbot needs to be able to serve many internal and external users concurrently, with limited latency.

  • Guardrails

    as an external facing product, it is crucial to catch potential misuse of the application. To mitigate this risk, the team trained custom input guardrail models that are able to detect irrelevant or malicious questions.

  • Data

    Regnology has a fastly evolving, large set of documentation. To ensure trustworthy and up-to-date responses, the team set up automated data pipelines to store the documentation in the search database.

  • Evaluation and monitoring

    to monitor the application’s accuracy, the team installed automated benchmarking based on a ground truth dataset. This allows Regnology to launch iterative improvements in a monitored way. Next to this, other KPIs such as user adoption and feedback rates are continuously monitored

Ml6 Avatar
While best practices exist for customizing LLMs to unique challenges, guiding them to tackle regulatory questions is no simple task. We had to ensure high precision while navigating the complexities of searching in big amounts of unstructured data, handling regulatory content, and custom domain-specific programming languages.
ML6Team

Results

As a result of our focus on building a production-ready application, the Regnology Chatbot has been successfully deployed and has demonstrated its capability to handle a substantial volume of concurrent questions.

Next steps include:

  • Ongoing Quality Monitoring

    Continuously track and improve the chatbot’s accuracy and performance.

  • Product Expansion

    Extend chatbot support to additional Regnology products.

  • Premium Feature Development

    Add advanced capabilities, such as explaining regulatory source code directly.

  • Customer Context

    Enable the chatbot to leverage customer-specific knowledge and data for tailored responses.

This project is the result of a close collaboration between Regnology, Google Cloud & ML6. We worked in a co-creation setting and in a cross-functional team. This brought many benefits, as each of the parties brought crucial experience and knowledge on the Regnology product suite, best practices for building production-ready RAG applications and of course the latest Gen AI capabilities of Google Cloud Platform to support.

Throughout the course of the collaboration, Regnology has further grown their in-house AI team and their capabilities, setting them up for success in their broader AI strategy. As their partner, we are beyond happy to see and support this growth.

Share this on:
Sophie Decock
Sophie DecockClient Director, ML6

Inspired?
Let’s connect and make it happen!

Ready to elevate your AI game? Schedule a meeting with us today and let’s craft a winning strategy together!

Sophie Decock
Sophie DecockClient Director, ML6
Cupcake ipsum dolor sit amet apple pie.

Frequently Asked Questions