Integrating LangChain4j with mcp.run
This tutorial guides you through connecting your LangChain4j application with mcp.run's tool ecosystem. You'll learn how to create a chat interface that can interact with external tools and APIs through mcp.run.
LangChain4j provides flexible AI model integration. While we will walk through using OpenAI in this guide, you can easily adapt these instructions for other models supported by LangChain4j's chat model interface.
You can find the complete source code for both the OpenAI and Ollama implementations in the examples directory of the mcpx4j repository. The Java code is largely the same in both; only the model configuration and dependencies differ.
Prerequisites
Before starting, ensure you have the following:
- A JDK installed on your system. We recommend using SDKMAN! for the installation
- A GitHub Account for mcp.run authentication
Additionally, if you want to use OpenAI as the LLM:
- An OpenAI Account with API access
- An OpenAI API Key
Setting up mcp.run
You'll need an mcp.run account and session ID. Here's how to get started:
- Run this command in your terminal:
npx --yes -p @dylibso/mcpx@latest gen-session
- Your browser will open to complete authentication through GitHub
- After authenticating, return to your terminal and save the session ID it prints
Keep your mcp.run session ID and OpenAI API key secure. Never commit these credentials to code repositories or expose them publicly.
Required Tools
This tutorial requires two mcp.run servlets: fetch and eval-js.
Install both servlets by:
- Visiting each servlet's page on mcp.run
- Clicking the Install button
- Verifying they appear in your install profile
Project Setup
Create a new LangChain4j project. We'll write this example in Java with Maven, but you can use your favorite JVM language and build tool.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.dylibso.mcpx4j.examples</groupId>
    <artifactId>langchain4j-openai</artifactId>
    <version>999-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <mcpx4j.version>0.1.0</mcpx4j.version>
        <langchain4j.version>1.0.0-alpha1</langchain4j.version>
        <smallrye-config.version>3.10.1</smallrye-config.version>
        <slf4j-simple.version>2.0.13</slf4j-simple.version>
    </properties>

    <!-- MCPX4J is currently available on JitPack -->
    <repositories>
        <repository>
            <id>jitpack.io</id>
            <url>https://jitpack.io</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>com.github.dylibso.mcpx4j</groupId>
            <artifactId>core</artifactId>
            <version>${mcpx4j.version}</version>
        </dependency>
        <!-- optional for flexible configuration -->
        <dependency>
            <groupId>io.smallrye.config</groupId>
            <artifactId>smallrye-config</artifactId>
            <version>${smallrye-config.version}</version>
        </dependency>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- in this example we will use OpenAI -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-open-ai</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>${slf4j-simple.version}</version>
        </dependency>
    </dependencies>
</project>
Define the Configuration Parameters
Define src/main/resources/application.properties as follows:
mcpx.api-key=${MCP_RUN_SESSION_ID}
mcpx.base-url=https://www.mcp.run
mcpx.profile-id=default
openai.api-key=${OPENAI_API_KEY}
We use SmallRye Config to parse the file and interpolate environment variables automatically. At startup it looks for OPENAI_API_KEY and MCP_RUN_SESSION_ID in your environment. You can either rely on those environment variables or replace the ${OPENAI_API_KEY} and ${MCP_RUN_SESSION_ID} placeholders directly with your actual values.
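If you want to verify that the configuration resolves before building the full application, the following optional sketch mirrors the configuration-loading code used in the main class later in this guide (the ConfigCheck class name is just for illustration):

package com.dylibso.mcpx4j.examples;

import io.smallrye.config.SmallRyeConfig;
import org.eclipse.microprofile.config.ConfigProvider;

public class ConfigCheck {
    public static void main(String[] args) {
        // SmallRye Config reads application.properties and resolves ${...}
        // placeholders against environment variables.
        SmallRyeConfig config = ConfigProvider.getConfig().unwrap(SmallRyeConfig.class);
        System.out.println("mcp.run base URL: " + config.getValue("mcpx.base-url", String.class));
        System.out.println("profile: " + config.getValue("mcpx.profile-id", String.class));

        // Avoid printing secrets; just confirm they resolved to non-empty values.
        boolean hasSessionKey = !config.getValue("mcpx.api-key", String.class).isBlank();
        System.out.println("mcp.run session key present: " + hasSessionKey);
    }
}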
Creating the mcp.run ToolExecutor
Tools in LangChain4j are exposed through the ToolExecutor interface. We can adapt an McpxTool as follows:
package com.dylibso.mcpx4j.examples;

import com.dylibso.mcpx4j.core.McpxTool;
import dev.langchain4j.agent.tool.ToolExecutionRequest;
import dev.langchain4j.service.tool.ToolExecutor;

import java.util.logging.Logger;

public class McpxToolExecutor implements ToolExecutor {
    private final McpxTool tool;

    public McpxToolExecutor(McpxTool tool) {
        this.tool = tool;
    }

    @Override
    public String execute(ToolExecutionRequest toolExecutionRequest, Object memoryId) {
        Logger.getLogger(tool.name())
                .info("invoking Wasm MCP.RUN function, " + toolExecutionRequest);

        // Splice the function invocation into the JSON representation
        // the MCPX tool expects.
        String adapted = """
                {
                  "method": "tools/call",
                  "params": {
                    "name": "%s",
                    "arguments": %s
                  }
                }""".formatted(
                toolExecutionRequest.name(),
                toolExecutionRequest.arguments());

        return tool.call(adapted);
    }
}
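As a quick sanity check, here is a hypothetical standalone invocation of the adapter. It assumes an Mcpx client that has already been built and refreshed as shown in the next section, uses the eval-js servlet installed earlier, and the "code" argument key is an illustrative guess rather than the servlet's documented schema:

// Hypothetical smoke test for the adapter; `mcpx` is an Mcpx client that has
// already called refreshInstallations() (see the main class in the next section).
static String smokeTest(Mcpx mcpx) {
    McpxTool evalJs = mcpx.servlets().stream()
            .flatMap(servlet -> servlet.tools().values().stream())
            .filter(tool -> "eval-js".equals(tool.name()))
            .findFirst()
            .orElseThrow();

    ToolExecutor executor = new McpxToolExecutor(evalJs);
    return executor.execute(
            ToolExecutionRequest.builder()
                    .name(evalJs.name())
                    .arguments("{\"code\": \"1 + 1\"}") // argument name is illustrative only
                    .build(),
            null);
}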
Create a Main Entry Point
Create an Application class to wire everything together:
package com.dylibso.mcpx4j.examples;

import com.dylibso.mcpx4j.core.Mcpx;
import com.dylibso.mcpx4j.core.McpxServlet;
import com.dylibso.mcpx4j.core.McpxTool;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.chat.request.json.JsonObjectSchema;
import dev.langchain4j.model.chat.request.json.JsonSchemaElement;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.tool.ToolExecutor;
import io.smallrye.config.SmallRyeConfig;
import org.eclipse.microprofile.config.ConfigProvider;

import java.io.IOException;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

public class LangChain4jOpenAIMcpx4jMain {

    public static void main(String[] args) throws IOException {
        // We use SmallRye Config to read a property file
        // that contains environment variable placeholders.
        // You can use your favorite configuration library.
        SmallRyeConfig config =
                ConfigProvider.getConfig().unwrap(SmallRyeConfig.class);
        String apiKey = config.getValue("mcpx.api-key", String.class);
        String baseUrl = config.getValue("mcpx.base-url", String.class);
        String profileId = config.getValue("mcpx.profile-id", String.class);
        String openAiApiKey = config.getValue("openai.api-key", String.class);

        // Instantiate a new Mcpx client with the configuration values.
        var mcpx = Mcpx.forApiKey(apiKey)
                .withBaseUrl(baseUrl)
                .withProfile(profileId)
                .build();

        // Refresh the installed servlet definitions from mcp.run.
        // This loads the configuration once; you can schedule this
        // invocation periodically to keep it up to date.
        mcpx.refreshInstallations();

        // Instantiate each servlet and expose it as a
        // `ToolSpecification`, `ToolExecutor` pair.
        var servlets = mcpx.servlets();
        Map<ToolSpecification, ToolExecutor> tools =
                toolsFromMcpxServlets(servlets);

        ChatLanguageModel chatLanguageModel =
                OpenAiChatModel.builder()
                        .apiKey(openAiApiKey)
                        .modelName("gpt-4o-mini")
                        .build();

        var services = AiServices.builder(Chat.class)
                .chatLanguageModel(chatLanguageModel)
                .tools(tools)
                .chatMemory(MessageWindowChatMemory.withMaxMessages(200))
                .build();

        System.out.println("Chat started. Type 'exit' to quit.");
        while (true) {
            String input = System.console().readLine("YOU: ");
            if (input.equals("exit")) {
                System.out.println("Goodbye!");
                break;
            }
            if (input.isBlank()) {
                continue;
            }
            System.out.println("ASSISTANT: " + services.send(input));
        }
    }

    private static Map<ToolSpecification, ToolExecutor> toolsFromMcpxServlets(
            Collection<McpxServlet> servlets) throws JsonProcessingException {
        ObjectMapper mapper = new ObjectMapper();
        Map<ToolSpecification, ToolExecutor> tools = new HashMap<>();
        for (McpxServlet servlet : servlets) {
            for (McpxTool tool : servlet.tools().values()) {
                // Convert the servlet's JSON Schema into a LangChain4j ToolSpecification.
                JsonNode schema = mapper.readTree(tool.inputSchema());
                JsonSchemaElement jsonSchemaElement =
                        ToolSpecificationHelper.jsonNodeToJsonSchemaElement(schema);
                ToolSpecification spec = ToolSpecification.builder()
                        .name(tool.name())
                        .description(tool.description())
                        .parameters((JsonObjectSchema) jsonSchemaElement)
                        .build();
                tools.put(spec, new McpxToolExecutor(tool));
            }
        }
        return tools;
    }

    interface Chat {
        // Define the system prompt for the AI's behavior.
        // Note: this can be whatever you want, but it's recommended to give the LLM
        // as much context as you can here while remaining generic for your use case.
        @SystemMessage("""
                You are a helpful AI assistant with access to various external tools and APIs. Your goal is to complete tasks thoroughly and autonomously by making full use of these tools. Here are your core operating principles:
                1. Take initiative - Don't wait for user permission to use tools. If a tool would help complete the task, use it immediately.
                2. Chain multiple tools together - Many tasks require multiple tool calls in sequence. Plan out and execute the full chain of calls needed to achieve the goal.
                3. Handle errors gracefully - If a tool call fails, try alternative approaches or tools rather than asking the user what to do.
                4. Make reasonable assumptions - When tool calls require parameters, use your best judgment to provide appropriate values rather than asking the user.
                5. Show your work - After completing tool calls, explain what you did and show relevant results, but focus on the final outcome the user wanted.
                6. Be thorough - Use tools repeatedly as needed until you're confident you've fully completed the task. Don't stop at partial solutions.
                Your responses should focus on results rather than asking questions. Only ask the user for clarification if the task itself is unclear or impossible with the tools available.
                """)
        String send(String msg);
    }
}
Finally, create the ToolSpecificationHelper utility. We omit its full source code here for brevity; you can find it in the mcpx4j repository.
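The helper's job is to translate each tool's JSON Schema (parsed into a Jackson JsonNode) into LangChain4j's JsonSchemaElement hierarchy. As a rough sketch of the idea, a simplified version covering only the common schema types (the real helper in the repository is more complete) could look like this:

package com.dylibso.mcpx4j.examples;

import com.fasterxml.jackson.databind.JsonNode;
import dev.langchain4j.model.chat.request.json.*;

import java.util.ArrayList;
import java.util.List;

// Simplified sketch only: see the mcpx4j repository for the full implementation.
public class ToolSpecificationHelper {

    static JsonSchemaElement jsonNodeToJsonSchemaElement(JsonNode node) {
        String type = node.has("type") ? node.get("type").asText() : "object";
        switch (type) {
            case "object": {
                var builder = JsonObjectSchema.builder();
                JsonNode properties = node.get("properties");
                if (properties != null) {
                    // Recursively convert each property of the object schema.
                    properties.fields().forEachRemaining(entry ->
                            builder.addProperty(entry.getKey(),
                                    jsonNodeToJsonSchemaElement(entry.getValue())));
                }
                JsonNode required = node.get("required");
                if (required != null && required.isArray()) {
                    List<String> names = new ArrayList<>();
                    required.forEach(n -> names.add(n.asText()));
                    builder.required(names);
                }
                return builder.build();
            }
            case "string": {
                var builder = JsonStringSchema.builder();
                if (node.has("description")) {
                    builder.description(node.get("description").asText());
                }
                return builder.build();
            }
            case "integer":
                return JsonIntegerSchema.builder().build();
            case "number":
                return JsonNumberSchema.builder().build();
            case "boolean":
                return JsonBooleanSchema.builder().build();
            case "array":
                return JsonArraySchema.builder()
                        .items(jsonNodeToJsonSchemaElement(node.get("items")))
                        .build();
            default:
                // Fall back to a plain string schema for unknown types.
                return JsonStringSchema.builder().build();
        }
    }
}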
Running the Application
- If you decided to use environment variables in the previous step, set them now:
export OPENAI_API_KEY="your-openai-key-here"
export MCP_RUN_SESSION_ID="your-mcp-session-here"
- Start the application:
mvn package exec:java -Dexec.mainClass="com.dylibso.mcpx4j.examples.LangChain4jOpenAIMcpx4jMain"
Testing the Integration
Try this example prompt to test the tool chaining capability:
I want to know what would happen if i put the string "Hello, mcp.run!" into this hash function https://gist.githubusercontent.com/MohamedTaha98/ccdf734f13299efb73ff0b12f7ce429f/raw/ab9593d5195a1643388cfc99d03a4fd96a094a5c/djb2%2520hash%2520function.c
The assistant will automatically:
- Use fetch to download the C code
- Translate the C code to JavaScript
- Execute the translated code using eval-js
- Return the hash value: -2106085175
You should see output similar to this:
ASSISTANT: The result of hashing the string "Hello, mcp.run!" using the provided djb2 hash function is `-2106085175`.
This demonstrates how the assistant can chain multiple tools together without explicit instructions, showcasing the power of the integration.
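If you'd like to verify that value without the LLM in the loop, here is a small, optional Java port of djb2 using 32-bit int arithmetic. The exact C source in the gist may differ slightly, but this variant reproduces the signed 32-bit result shown above:

public class Djb2Check {
    // djb2: hash = hash * 33 + c, starting from 5381, with 32-bit wraparound.
    static int djb2(String input) {
        int hash = 5381;
        for (char c : input.toCharArray()) {
            hash = hash * 33 + c; // int overflow wraps, like an unsigned 32-bit value cast to int
        }
        return hash;
    }

    public static void main(String[] args) {
        System.out.println(djb2("Hello, mcp.run!")); // prints -2106085175
    }
}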
Support
If you get stuck and need some help, please reach out! Visit our support page to learn how best to get in touch.