Decision Intelligence Platform

Decision Services as MCP Servers

The Model Context Protocol (MCP), an open standard, has become a widely adopted mechanism for exposing decision models as AI-friendly services across many decision intelligence platforms. OpenRules supports MCP-based deployment, allowing decision models to be published as MCP Servers. We provide several examples illustrating how both rules-based and optimization-based decision models can be converted into MCP Servers capable of communicating with LLMs.

The “LoanMCP” and “PatientTherapyMCP” decision model examples are included in the standard OpenRules installation under “openrules.samples/AI” and can serve as useful references.

Unlike the default deployment of OpenRules decision models as AI Agents, which requires no changes, converting a model to an MCP Server requires a few configuration steps:

  1. Add “mcp=http” to the “project.properties” file.
  2. Update “pom.xml” with the “openrules-mcp” dependency and a special plugin.
  3. Package the project by running the standard “package.bat”.
  4. Run the service-specific “runServer.bat”.
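For step 2, the dependency entry in “pom.xml” might look like the following sketch. Note that the exact groupId, version, and plugin coordinates are assumptions here; they must match the ones shipped with your OpenRules installation (the sample projects under “openrules.samples/AI” contain working copies):

```xml
<!-- Hedged sketch: coordinates are illustrative, copy the real ones
     from the LoanMCP or PatientTherapyMCP sample pom.xml -->
<dependency>
    <groupId>com.openrules</groupId>        <!-- assumed groupId -->
    <artifactId>openrules-mcp</artifactId>
    <version><!-- your OpenRules version --></version>
</dependency>
```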

You will also need to register your running MCP Server’s name and URL in the LLM’s settings, then initiate the dialogue by passing a service-specific prompt guide. This guide can be created manually or derived in part from the automatically generated “description.md” file.
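Once the server is registered, the LLM communicates with it using JSON-RPC 2.0 messages as defined by the Model Context Protocol specification (methods such as “tools/list” and “tools/call”). The sketch below shows what such a request looks like; the tool name “evaluateLoan”, its arguments, and the endpoint URL are hypothetical placeholders, not part of the actual LoanMCP service contract:

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request per the MCP specification."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool and arguments, for illustration only:
payload = make_tool_call("evaluateLoan",
                         {"loanAmount": 250000, "creditScore": 720})
body = json.dumps(payload)

# An HTTP client would POST `body` to the running MCP Server, e.g.:
# urllib.request.urlopen("http://localhost:8080/mcp", data=body.encode())
```

In practice the LLM client builds and sends these messages itself; this sketch only illustrates the wire format you would see when debugging a running MCP Server.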
