Technology Radar

Quarkus LangChain4j

Tags: langchain4j, graalvm
Trial

Quarkus LangChain4j is the first-class LangChain4j integration for Quarkus — declarative AI services, native image support, and Quarkus-native developer experience, including live reload and Dev Services for local model testing.

Why It's in Trial

If you're a Quarkus shop, this is your Spring AI equivalent. It wraps LangChain4j's core capabilities in Quarkus-idiomatic packaging:

  • Quarkus CDI beans for AI services
  • @RegisterAiService annotation for declarative AI service definition
  • Dev Services that automatically spin up Ollama containers for local development
  • Full GraalVM native image compilation — your AI-powered service can compile to a native binary with <50ms startup

The Declarative AI Service Pattern

The @RegisterAiService annotation is the key abstraction. You declare what you need and Quarkus wires it up:

@RegisterAiService(tools = DatabaseQueryTool.class)
@ApplicationScoped
public interface DataAnalyst {

    @SystemMessage("""
        You are a data analyst. Use the database query tool to answer questions.
        Always explain your reasoning before presenting results.
        """)
    String analyzeData(String question);
}
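The tool class referenced in `tools = DatabaseQueryTool.class` is an ordinary CDI bean whose methods are exposed to the model via LangChain4j's `@Tool` annotation. A minimal sketch — `DatabaseQueryTool` and its query logic are hypothetical; only the annotations are the real LangChain4j and Jakarta APIs:

```java
import dev.langchain4j.agent.tool.Tool;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class DatabaseQueryTool {

    // The model decides when to invoke this, guided by the description string.
    @Tool("Runs a read-only SQL query and returns the result as text")
    public String runQuery(String sql) {
        // Hypothetical: delegate to your data source; validate or
        // whitelist the SQL before executing anything model-generated.
        return "rows: ...";
    }
}
```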

From there, inject it anywhere via CDI:

@Inject DataAnalyst analyst;

String result = analyst.analyzeData("Which products had the highest return rate last quarter?");
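In practice the injection point usually lives in another bean, such as a JAX-RS resource. A sketch of that wiring — the `/analyze` path and `AnalysisResource` name are illustrative, not from the extension:

```java
import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.QueryParam;

@Path("/analyze")
public class AnalysisResource {

    @Inject
    DataAnalyst analyst;  // the AI service interface declared above

    // GET /analyze?q=... forwards the question to the model-backed service.
    @GET
    public String analyze(@QueryParam("q") String question) {
        return analyst.analyzeData(question);
    }
}
```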

Dev Services: Zero-Config Local Development

Quarkus Dev Services automatically starts an Ollama container when no model configuration is detected — you get a running local LLM for development with no manual setup:

# For production, set a real model endpoint
# quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}

# For dev, nothing needed — Quarkus starts Ollama for you
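If you want to pin which local model the Ollama Dev Service uses rather than accept the default, that is a one-line property. A sketch — the key follows the extension's `quarkus.langchain4j.ollama.*` namespace, but check the extension docs for the exact property name in your version:

```properties
# Dev mode: pin the model pulled by the Ollama Dev Service
# (key name may vary by extension version)
quarkus.langchain4j.ollama.chat-model.model-id=llama3
```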

This is a significant developer experience win over Spring AI, where equivalent local-model wiring typically requires manual Testcontainers setup.

Native Image

Quarkus + LangChain4j compiles to a GraalVM native binary:

./mvnw package -Pnative
./target/my-ai-service-runner  # Starts in ~20ms

For AI services that must start fast (lambdas, edge deployments), this is a material advantage.
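If GraalVM isn't installed locally, Quarkus can also run the native build inside a builder container via its standard `quarkus.native.container-build` flag:

```shell
# Build the native binary inside a container — no local GraalVM needed
./mvnw package -Pnative -Dquarkus.native.container-build=true
```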

Supported Models and Stores

Inherits all of LangChain4j's provider and vector store support, with Quarkus-specific configuration and CDI wiring. OpenAI, Azure OpenAI, Anthropic, Ollama, Hugging Face — all via @Inject-able beans.
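Switching providers is a dependency change plus configuration. For example, the OpenAI provider ships as the `quarkus-langchain4j-openai` Quarkiverse extension; the version below is illustrative, so use the latest release:

```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>0.x</version> <!-- illustrative; use the latest release -->
</dependency>
```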

Key Characteristics

Property        Value
--------------  ------------------------------
Status          Active (Quarkiverse extension)
Requires        Quarkus 3.x, Java 17+
Native image    Yes (GraalVM)
Dev Services    Automatic Ollama container
Based on        LangChain4j core