Technology Radar

LM Studio

inference
This item was not updated in the last three versions of the Radar. If it appeared in one of the more recent editions, there is a good chance it is still relevant. If the item dates back further, its relevance may have diminished and our current assessment could differ. Unfortunately, we don't have the capacity to consistently revisit items from past Radar editions.
Trial

LM Studio is a desktop application for discovering, downloading, and running large language models locally — think of it as a visual companion to Ollama, optimised for exploration rather than automation.

Buy vs Build

LM Studio sits on the build side (it runs models on your own hardware), but its polished GUI gives it the feel of a buy product. It is free to use, with no subscription required.

Why It's in Trial

LM Studio fills a distinct niche: it makes local model experimentation accessible to engineers who would rather not work in a terminal. It is also a natural tool for evaluating a model before committing to deploying it with Ollama in an automated pipeline.

Key features:

  • Visual model browser: Search Hugging Face directly from the app, filter by size/capability, read model cards
  • Built-in chat: Test any model conversationally with system prompt editing, parameter controls (temperature, context size)
  • Local API server: Start an OpenAI-compatible server at localhost:1234 — point your existing code at it
  • Multi-model management: Download, manage, and switch between models with a GUI
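Because the local server speaks the OpenAI chat-completions format, existing client code needs little more than a base-URL change. A minimal sketch using only the standard library, assuming LM Studio's server is running on its default port 1234 with a model loaded (the model name below is illustrative):

```python
import json
import urllib.request

# LM Studio's OpenAI-compatible server defaults to port 1234.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def send_chat_request(payload: dict) -> dict:
    """POST the payload to the local server (requires LM Studio running)."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("local-model", "Explain GGUF quantisation briefly.")
# reply = send_chat_request(payload)  # uncomment with the server running
```

The same payload works against any OpenAI-compatible endpoint, which is what makes moving between local and hosted models low-friction.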

Typical Usage Pattern

Many developers use both Ollama and LM Studio:

  1. Use LM Studio to explore and compare models visually
  2. Use Ollama for the models you've chosen in your actual development workflow and CI
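Since both tools expose OpenAI-compatible endpoints, graduating from LM Studio exploration to Ollama automation can amount to swapping a base URL and model tag. A sketch of that idea, assuming the default ports (1234 for LM Studio, 11434 for Ollama) and illustrative model names:

```python
# The same client code can target either runtime; only the base URL and
# model tag change. Ports shown are each tool's documented default.
ENDPOINTS = {
    "explore": {   # LM Studio, for interactive comparison
        "base_url": "http://localhost:1234/v1",
        "model": "local-model",   # illustrative tag
    },
    "automate": {  # Ollama, for scripted/CI use
        "base_url": "http://localhost:11434/v1",
        "model": "llama3.2",      # illustrative tag
    },
}

def client_config(stage: str) -> dict:
    """Return the connection settings for a given workflow stage."""
    return ENDPOINTS[stage]
```

Keeping the endpoint choice in configuration rather than code is what lets the "explore locally, automate later" workflow stay painless.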

Hardware Support

  • macOS: Excellent (Apple Silicon with Metal GPU acceleration)
  • Windows: Strong (NVIDIA CUDA + AMD Vulkan)
  • Linux: Beta (improving rapidly)

Compared to Ollama

  Aspect               LM Studio                      Ollama
  Interface            GUI desktop app                CLI + REST API
  Best for             Model discovery, exploration   Development pipelines, automation
  Concurrent requests  Single-threaded                Queued (multi-client)
  Headless servers     No                             Yes
  CI/CD integration    No                             Yes

Key Characteristics

  Property      Value
  Platforms     macOS, Windows, Linux (beta)
  API format    OpenAI-compatible
  Model source  Hugging Face
  License       Free to use, proprietary
  Provider      LM Studio
  Website       lmstudio.ai
