The Azure AI SDK for Java (azure-ai-openai, azure-ai-inference) provides native Java access to Azure OpenAI and Azure AI Model Inference — the right choice for teams that must route LLM traffic through Azure for compliance, managed identity, or enterprise billing reasons.
## Why Assess
Both azure-ai-openai and azure-ai-inference remain in beta (1.0.0-beta.16 as of early 2026). The SDK works, but the beta status means breaking changes remain possible. For Spring Boot teams, the Spring AI Azure OpenAI starter is a cleaner path and reaches GA with Spring AI. Assess the raw SDK when you have tight Azure-specific requirements that Spring AI doesn't cover.
## When to Use This Instead of Spring AI
| Scenario | Use Azure SDK directly | Use Spring AI Azure starter |
|---|---|---|
| Non-Spring backend (Quarkus, Micronaut, plain Java) | ✓ | — |
| Need exact Azure SDK version control | ✓ | — |
| Azure AI Foundry custom deployments with fine-grained config | ✓ | — |
| Spring Boot, want auto-configuration | — | ✓ |
| Multi-provider portability (Azure + Anthropic + Ollama) | — | ✓ |
## Setup

Raw Azure SDK:

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-openai</artifactId>
    <version>1.0.0-beta.16</version>
</dependency>
```
Spring AI starter (preferred for Spring Boot):

```xml
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-azure-openai</artifactId>
</dependency>
```

```properties
spring.ai.azure.openai.endpoint=${AZURE_OPENAI_ENDPOINT}
spring.ai.azure.openai.api-key=${AZURE_OPENAI_API_KEY}
spring.ai.azure.openai.chat.options.deployment-name=gpt-4o
```
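With the starter on the classpath, Spring Boot auto-configures the Azure OpenAI chat model, so application code only touches the portable `ChatClient` API. A minimal sketch (the controller class and `/chat` endpoint are illustrative, not from the source):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Illustrative sketch: ChatClient.Builder is injected by Spring AI's
// auto-configuration, already wired to the Azure OpenAI deployment
// named in application.properties.
@RestController
class ChatController {

    private final ChatClient chatClient;

    ChatController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @GetMapping("/chat")
    String chat(@RequestParam String question) {
        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}
```

Because the code depends only on `ChatClient`, swapping Azure for another provider is a dependency and properties change, not a code change.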
## Passwordless Auth with DefaultAzureCredential
The killer feature for Azure-deployed apps: DefaultAzureCredential eliminates hardcoded API keys by automatically using the right credential for each environment:
```java
OpenAIClient client = new OpenAIClientBuilder()
    .endpoint("https://my-resource.openai.azure.com/")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```
- Local dev: uses your `az login` credentials automatically
- Azure App Service / AKS / Container Apps: uses the app's managed identity — no secrets, no rotation, no leaks
This pattern works identically for Azure PostgreSQL + pgvector, Azure AI Search, and Azure Cosmos DB — the entire stack goes passwordless with one credential type.
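Once the client is built, chat requests address an Azure *deployment* rather than a raw model id. A hedged sketch of a completion call (the deployment name `gpt-4o` and the prompt text are assumptions; `client` is the instance built above):

```java
import com.azure.ai.openai.models.ChatCompletions;
import com.azure.ai.openai.models.ChatCompletionsOptions;
import com.azure.ai.openai.models.ChatRequestMessage;
import com.azure.ai.openai.models.ChatRequestSystemMessage;
import com.azure.ai.openai.models.ChatRequestUserMessage;

import java.util.List;

// Sketch: assumes `client` was built with DefaultAzureCredential as shown above.
List<ChatRequestMessage> messages = List.of(
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("Why use managed identity over API keys?"));

ChatCompletions completions = client.getChatCompletions(
        "gpt-4o",                              // Azure deployment name, not a model id
        new ChatCompletionsOptions(messages));

System.out.println(completions.getChoices().get(0).getMessage().getContent());
```

Note the first argument is the name you gave the deployment in the Azure portal; two deployments of the same model can carry different names, quotas, and content filters.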
## Azure AI Model Inference
The newer azure-ai-inference SDK (1.0.0-beta.4) provides access to models beyond OpenAI via Azure AI Model Inference: Meta Llama, Mistral AI, DeepSeek, and others hosted in Azure AI Foundry. Spring AI supports these through the same starter with a different configuration profile.
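In Spring AI terms, a Foundry-hosted non-OpenAI model is addressed the same way as an OpenAI one: by its deployment name. A hedged configuration sketch (the environment variables and the `llama-3-70b` deployment name are assumptions, not from the source):

```properties
spring.ai.azure.openai.endpoint=${AZURE_AI_ENDPOINT}
spring.ai.azure.openai.api-key=${AZURE_AI_API_KEY}
# Deployment name of a Llama/Mistral/DeepSeek model in Azure AI Foundry (illustrative)
spring.ai.azure.openai.chat.options.deployment-name=llama-3-70b
```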
## Foundry Local
For local development, Microsoft's Foundry Local tool runs Azure-compatible models on your laptop. Spring AI can connect to it without code changes — swap the endpoint, keep the code.
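The endpoint swap might look like the following properties override (the port and model name are assumptions; Foundry Local prints its actual endpoint on startup):

```properties
# Point the same Spring AI starter at Foundry Local instead of Azure.
# Port and model name are illustrative; check `foundry service status`.
spring.ai.azure.openai.endpoint=http://localhost:5273
spring.ai.azure.openai.chat.options.deployment-name=phi-3.5-mini
```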
## Key Characteristics
| Property | Value |
|---|---|
| azure-ai-openai status | 1.0.0-beta.16 (preview) |
| azure-ai-inference status | 1.0.0-beta.4 (preview) |
| Auth | API key or DefaultAzureCredential (managed identity) |
| Models | Azure OpenAI + Azure AI Model Inference (Llama, Mistral, DeepSeek) |
| Spring AI support | spring-ai-starter-model-azure-openai |