
How to Use MCP to Integrate External Tools in Your Mendix Chatbot

Key Takeaways

  • MCP isn’t just for exposing tools; it’s for consuming them too. The new MCP Client lets your Mendix app access tools from other apps, empowering the LLM to decide how and when to use them in conversations.
  • No need to reinvent the wheel. You can reuse existing tools and prompts—locally or from open-source MCP servers like GitHub, Slack, or Google Drive—and plug them into your chat experience without custom integrations.
  • Low code makes AI orchestration feel effortless. Thanks to modules like GenAI Commons, Conversational UI, and MCP Client, wiring up external tools to your chatbot is just a few microflows away.

In my last blog, I showed how you can expose your Mendix microflows as tools that Large Language Models (LLMs) like Claude can discover and invoke via the Model Context Protocol (MCP). That opened the door for AI assistants to tap directly into your Mendix logic. Now, we’re flipping things around.

With the new MCP Client module, your Mendix app can act as a consumer of external MCP servers, meaning your app can discover and invoke prompts and tools running elsewhere. This makes it easy to plug powerful external logic or AI services into your Mendix apps without creating any custom integrations.

What Is the MCP Client Module?

The MCP Client module lets your Mendix app connect to any MCP-compliant server, whether that’s another Mendix app, a high-code-based tool service, or an agent hosted in the cloud.

That means:

  • Reuse logic across apps — have one Mendix app provide tools, and another consume them
  • Connect to third-party AI services—no REST or SDK wrangling required
  • Chain tools and prompts into full GenAI workflows—all from low code

In short, Mendix can now act as both an MCP server and an MCP client, making it a strong platform for building AI-enhanced applications. It’s like plugging your Mendix app into a shared toolbox of AI-powered capabilities.

Here’s how that works at a high level for our example: a Mendix app acts as an MCP Client, connected to an external MCP Server (which could be another Mendix app). When the user interacts with the chatbot UI, the client dynamically discovers tools from the server and includes them in the request to the LLM. If the model chooses to invoke one, the client passes the call to the server, gets the result, and continues the conversation.

Please note that while the example tools handle data retrieval, MCP tools can just as easily trigger actions, update records, or perform any other operation your app or external system supports.

Everything runs over standard MCP HTTPS calls, and no custom integrations are required.
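The loop described above can be sketched in a few lines of plain Python. The class and method names below are hypothetical stand-ins, not the actual MCP Client module internals; they only illustrate the order of operations (discover tools, let the model decide, proxy the call, continue the conversation):

```python
# Illustrative sketch of one chat turn; all names here are hypothetical
# stand-ins, not APIs from the Mendix MCP Client or GenAI modules.

class FakeMCPServer:
    """Stands in for an external MCP server reachable over HTTPS."""

    def list_tools(self):
        # In reality: a JSON-RPC "tools/list" call to the server.
        return [{"name": "count_open_tickets", "description": "Counts open tickets."}]

    def call_tool(self, name, arguments):
        # In reality: a JSON-RPC "tools/call" call, proxied by the client.
        return {"open": 3}


class FakeLLM:
    """Stands in for the model behind your GenAI connector."""

    def chat(self, messages, tools):
        # The model sees the tool list and decides to invoke one.
        return {"tool_call": ("count_open_tickets", {"ticketType": "Bug"})}

    def continue_with_result(self, result):
        # The model turns the tool result into a user-facing answer.
        return {"text": f"There are {result['open']} open bug tickets."}


def chat_turn(user_message, mcp_server, llm):
    tools = mcp_server.list_tools()                   # discover tools
    response = llm.chat([user_message], tools=tools)  # model may pick one
    if "tool_call" in response:                       # proxy the call
        name, args = response["tool_call"]
        result = mcp_server.call_tool(name, args)
        response = llm.continue_with_result(result)
    return response["text"]


print(chat_turn("How many bug tickets are open?", FakeMCPServer(), FakeLLM()))
# Prints: There are 3 open bug tickets.
```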

Note: The latest version of the GenAI Showcase App shows a step-by-step guide on how to connect an MCP Client in your Mendix app to an external server to consume tools and then use them in your Mendix chatbot.

Connect to an MCP Server from your Mendix app

Prerequisites

You can choose whether to start from scratch or extend an existing application. If you start from scratch, the Blank GenAI App, which already includes all necessary GenAI modules, is the best starting point. If you are extending an existing app, make sure to install the MCP Client, GenAI Commons, Conversational UI, and your favorite GenAI connector module from the Marketplace. You also need access to an MCP server (running in a Mendix app or externally); the MCP Server example in the GenAI Showcase App works well for this.

Create your chatbot

If you don’t have a chatbot in your application yet, I will guide you step by step through creating one. Even if you have already implemented one, it is worth following these steps to make the necessary changes.

Step one

There are a couple of example microflows available in the MCP Client module that help you get started quickly and need little customization. Go to the module and copy the ChatContext_MCPClient_ActionMicroflow microflow as well as all three microflows from the Map to GenAI Commons folder to your module.

Step two

The copied microflows are excluded by default, so you’ll need to include them first. This will trigger some errors, which you can fix by reconnecting the copied microflows: just follow the names and select the respective microflow copy. Don’t worry, these errors are expected and easy to resolve.

Step three

After all errors are gone, add a data view to your page with a microflow as its data source. Create a microflow named DS_ChatContext_Create:

  • Inside that microflow, first retrieve one DeployedModel from the database (or write a custom retrieve to select the right LLM).
  • Then add the New Chat action from the toolbox. Select the retrieved model and, for the action microflow, choose the previously included action microflow.
  • Finally, return the ChatContext object at the end of the microflow.

Step four

Add a button inside the dataview to open page ConversationalUI_FullScreenChat (or a different ConversationalUI page) from the ConversationalUI module.

Ensure that your user has the module role User from ConversationalUI assigned. Your chat is now ready to be used.

But wait: how does Mendix know which MCP server to use?

How does this technically work?

Registering tools with the request

Let me first explain what you just blindly copied to your own module:

  • The ChatContext_MCPClient_ActionMicroflow makes sure that a message typed by a user is correctly sent to the right model, with all its important configurations and context (such as the conversation history).
  • Inside that microflow, the Request_AddMCPTools sub-microflow is called. It connects to your MCP Server, discovers all exposed tools, and adds them as GenAICommons.Tools to the request that is sent to the LLM.
  • For each message that is sent to the model, the tools are known to it and it can choose to call them.
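On the wire, discovery is a JSON-RPC 2.0 `tools/list` call, and each discovered tool is translated into a tool definition the LLM understands. A minimal sketch, assuming an illustrative tool name; the MCP payload shapes follow the MCP specification, and the target format shown is the common OpenAI-style function definition (your GenAI connector may use a different shape internally):

```python
import json

# JSON-RPC 2.0 request the client sends to discover tools.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

# Example entry a server might return in result.tools (name is illustrative).
mcp_tool = {
    "name": "count_open_tickets",
    "description": "Counts open tickets of a given type.",
    "inputSchema": {
        "type": "object",
        "properties": {"ticketType": {"type": "string"}},
        "required": ["ticketType"],
    },
}


def to_llm_tool(tool):
    """Map an MCP tool definition to an OpenAI-style function tool."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool["inputSchema"],  # MCP schemas are JSON Schema
        },
    }


print(json.dumps(to_llm_tool(mcp_tool), indent=2))
```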

Tool invocation and proxying

How can your Mendix app call a tool in another app? It starts with the MCP tools being registered with a request. This request points to a specific microflow called Tool_OrchestrateToolCall. The microflow knows the tool’s name and the arguments to send because this information is included in the request from the model. Using the MCP Client’s Call Tool action, the microflow forwards the request to the MCP Server, acting as a middleman. The server’s response is then sent to the LLM, which can process the user’s request—either by replying directly or using other tools if needed.
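The forwarded call itself is a JSON-RPC `tools/call` request, and the server answers with a list of content blocks that get handed back to the LLM. A sketch of both sides, with an illustrative tool name and argument (the payload shapes follow the MCP specification):

```python
def make_tool_call(name, arguments, rpc_id=2):
    """Build the JSON-RPC 2.0 tools/call request that gets forwarded."""
    return {
        "jsonrpc": "2.0",
        "id": rpc_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }


# What the model asked for (illustrative tool and argument).
request = make_tool_call("count_open_tickets", {"ticketType": "Bug"})

# What an MCP server typically returns: a list of content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "3 open bug tickets"}]},
}


def extract_text(result):
    """Concatenate the text blocks to hand back to the LLM."""
    return "\n".join(
        block["text"] for block in result["content"] if block["type"] == "text"
    )


print(extract_text(response["result"]))  # 3 open bug tickets
```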

As you see, no custom integrations or changes are needed to make it work for every MCP server. Just plug and play to enrich your chat experience with powerful tools that you don’t need to manage in the same application.

Establish the MCP connection

Now let’s wire it up. Your app needs to know where the MCP server lives.

In the microflows Tool_OrchestrateToolCall and Request_AddMCPTools you need to change the MCPClientConfiguration endpoint to point to your MCP Server. If you need to authenticate by passing custom HTTP Headers, you can use the action Config: Add Http Header afterwards. It’s a good idea to wrap the creation of the MCP Client in a reusable sub-microflow, especially now that it is used in two places.
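Outside of Mendix, that configuration boils down to an endpoint URL plus optional HTTP headers attached to every MCP request. A sketch using only the Python standard library; the endpoint URL and bearer token below are placeholders, and the request is only built, not sent:

```python
import json
import urllib.request

MCP_ENDPOINT = "https://example.com/mcp"       # placeholder endpoint
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "Bearer <your-api-key>",  # placeholder credential
}


def build_mcp_request(method, params):
    """Prepare (but do not send) an authenticated MCP JSON-RPC request."""
    body = json.dumps(
        {"jsonrpc": "2.0", "id": 1, "method": method, "params": params}
    ).encode("utf-8")
    return urllib.request.Request(
        MCP_ENDPOINT, data=body, headers=HEADERS, method="POST"
    )


req = build_mcp_request("tools/list", {})
print(req.get_header("Authorization"))  # Bearer <your-api-key>
```

Wrapping this in one function mirrors the advice above: build the configuration in a single reusable place so both the discovery and the tool-call path use identical endpoint and headers.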

Don’t have an MCP server at hand? This might be the moment to revisit my previous blog post about exposing logic from your Mendix app via MCP. You can build your own MCP server or reuse the example in the GenAI Showcase App. When running the showcase app locally, the endpoint looks like this: http://localhost:8080/mcp-ticketsystem

Rerun your application and ask a question that can be answered through the available tools. If you’re using the showcase MCP server, a question like “How many tickets are open?” will likely trigger two tool calls that check how many bug and feature tickets are open; the model can then use this information to calculate the total number of open tickets.

What’s next?

Now that your Mendix apps can both expose and consume tools via MCP, you’re ready to build more advanced agentic workflows—chaining applications together, offloading tasks to AI, or integrating with internal microservices across teams. There’s really no excuse anymore: you can create a seamless AI-powered landscape across your enterprise, using low-code.

Even better, there’s a growing ecosystem of open-source MCP servers available for third-party services like GitHub, Slack, or Google Drive. You can self-host these and let your Mendix app plug right in. So—what external service will your Mendix app connect to first?
