LangChain4j is the leading framework-agnostic Java library for LLM integration — supporting 20+ model providers, 30+ vector stores, RAG, tool calling (including MCP), and agents, without requiring Spring or any specific web framework.
Why It's in Trial
LangChain4j is the right choice when:
- You're not on Spring Boot (Quarkus, Micronaut, plain Java, Jakarta EE)
- You want more explicit control over how AI components are wired together
- You're building agents with complex tool execution loops
- You want to mix and match providers and stores without Spring's auto-configuration
It launched in early 2023 and is now at version 1.x with a stable API. A Fall 2025 comparison of Java AI frameworks describes the field as a "two-horse race" between Spring AI and LangChain4j: both are production-ready, with LangChain4j winning on flexibility and agent maturity.
Key Capabilities
AI Services — declarative interface pattern:
```java
interface CustomerSupport {

    @SystemMessage("You are a helpful customer support agent for ACME Corp.")
    String chat(String userMessage);
}

CustomerSupport support = AiServices.builder(CustomerSupport.class)
        .chatModel(openAiChatModel)   // renamed from chatLanguageModel in 1.0
        .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
        .tools(new OrderLookupTool())
        .build();
```
The AiServices pattern is LangChain4j's standout feature — define an interface, get an LLM-backed implementation. This is cleaner than Spring AI's ChatClient fluent API for service-oriented code.
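Calling the resulting proxy is then a plain method call (a sketch continuing the example above; the order number is illustrative):

```java
// LangChain4j assembles the system message, the last 10 memory messages,
// and the user message; if the model invokes OrderLookupTool, the tool
// runs and its result is fed back before the final answer is returned.
String reply = support.chat("What is the status of my order #1042?");
```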
Tool calling: Define @Tool-annotated methods on any POJO. LangChain4j handles the function-call loop automatically — the model can call your tools, receive results, and continue reasoning.
```java
class CalendarTool {

    private final CalendarService calendarService; // supplied via constructor

    CalendarTool(CalendarService calendarService) {
        this.calendarService = calendarService;
    }

    @Tool("Get the user's upcoming calendar events")
    List<Event> getCalendarEvents(String userId, int daysAhead) {
        return calendarService.getEvents(userId, daysAhead);
    }
}
```
MCP support: LangChain4j 1.x supports MCP tool calling: any MCP server can be used as a tool source, which opens up the existing ecosystem of MCP servers (Stripe, Figma, Vercel, Postgres, and many more) without custom integration.
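Wiring an MCP server in is a few builder calls. This is a sketch using the `langchain4j-mcp` module; the stdio filesystem server and its arguments are illustrative, not part of the original text:

```java
import dev.langchain4j.mcp.McpToolProvider;
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.McpTransport;
import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.tool.ToolProvider;
import java.util.List;

// Launch a local MCP server over stdio (illustrative server and path).
McpTransport transport = new StdioMcpTransport.Builder()
        .command(List.of("npx", "-y", "@modelcontextprotocol/server-filesystem", "/tmp"))
        .build();

McpClient mcpClient = new DefaultMcpClient.Builder()
        .transport(transport)
        .build();

// Every tool the MCP server advertises becomes callable by the model.
ToolProvider toolProvider = McpToolProvider.builder()
        .mcpClients(List.of(mcpClient))
        .build();

CustomerSupport support = AiServices.builder(CustomerSupport.class)
        .chatModel(openAiChatModel)
        .toolProvider(toolProvider)
        .build();
```

The same `ToolProvider` can aggregate several MCP clients, so local `@Tool` methods and remote MCP tools coexist in one agent.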
Broad provider support: 20+ LLM providers, 30+ embedding/vector stores. Switching providers means swapping one ChatModel implementation (the interface was renamed from ChatLanguageModel in 1.0); all higher-level code stays the same.
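Concretely, downstream code holds only a reference to the chat model interface (`ChatModel` in 1.x), so a provider switch touches nothing but the construction site. A sketch, with an illustrative model name and the API key read from the environment:

```java
import dev.langchain4j.model.anthropic.AnthropicChatModel;
import dev.langchain4j.model.chat.ChatModel;

// Swapping in Anthropic for the OpenAiChatModel used earlier: only this
// construction site changes; the AiServices wiring and all higher-level
// code keep working against the ChatModel interface.
ChatModel model = AnthropicChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-latest")   // illustrative
        .build();
```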
RAG: Full ingestion pipeline (document loading, splitting, embedding) and retrieval with metadata filtering, re-ranking, and query expansion.
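A minimal end-to-end sketch of that pipeline, assuming an in-memory store, a local ONNX embedding model, and an illustrative document path (a production setup would substitute one of the supported vector stores):

```java
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.rag.content.retriever.ContentRetriever;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;

EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel(); // local model
EmbeddingStore<TextSegment> store = new InMemoryEmbeddingStore<>();

// Ingestion: load documents, split into segments, embed, store.
EmbeddingStoreIngestor ingestor = EmbeddingStoreIngestor.builder()
        .documentSplitter(DocumentSplitters.recursive(300, 30))
        .embeddingModel(embeddingModel)
        .embeddingStore(store)
        .build();
ingestor.ingest(FileSystemDocumentLoader.loadDocuments("/path/to/docs"));

// Retrieval: registered on an AI service via contentRetriever(...), so
// relevant segments are injected into the prompt automatically.
ContentRetriever retriever = EmbeddingStoreContentRetriever.builder()
        .embeddingStore(store)
        .embeddingModel(embeddingModel)
        .maxResults(5)
        .minScore(0.6)
        .build();
```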
Agent Observability: AgentListener and AgentMonitor (added in 1.10.0) provide observability hooks for monitoring agent execution in production.
Agentic A2A modules (v1.3.0): langchain4j-agentic and langchain4j-agentic-a2a are now first-class modules — not experimental. The agentic module provides workflow patterns (sequential, loop, parallel, conditional) and a supervisor pattern for dynamic agent routing via AgenticScope. The A2A module adds @A2AClientAgent for calling remote A2A servers. A companion langchain4j-agentic-mcp module wraps MCP tools as non-AI agents. Both Red Hat and Microsoft back the project, with Microsoft reporting hundreds of customers in production.
Spring AI vs LangChain4j: When to Use Which
| Scenario | Use |
|---|---|
| Existing Spring Boot app | Spring AI |
| Quarkus or Micronaut app | LangChain4j (via native extensions) |
| Plain Java or Jakarta EE | LangChain4j |
| Need complex multi-tool agents | LangChain4j |
| Prefer annotation-driven DI style | Spring AI |
| Prefer explicit, testable interfaces | LangChain4j |
Getting Started
For a plain Java project, add the core library plus a provider module (Spring Boot users would pull in `langchain4j-open-ai-spring-boot-starter` instead):

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>1.0.0</version>
</dependency>
```
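With the dependency in place, a first call is a few lines (a sketch; the API key is read from the environment and the model name is illustrative):

```java
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

ChatModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")   // illustrative
        .build();

String answer = model.chat("Say hello in one sentence.");
```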
Key Characteristics
| Property | Value |
|---|---|
| Status | 1.x stable |
| Requires | Java 11+ |
| Framework | None required (Spring, Quarkus, Micronaut, plain Java) |
| MCP support | Yes (tool calling) |