How to Connect LM Studio to OpenClaw?
LM Studio is a desktop application for running local AI models with a graphical interface. It is an alternative to Ollama that is friendlier to beginners: models can be downloaded with one click, and it exposes an OpenAI-compatible API endpoint.
Download LM Studio from lmstudio.ai (available for macOS, Windows, Linux). Browse the model library, download a model (GGUF format), and load it. LM Studio starts a local server that OpenClaw can connect to.
By default, LM Studio's server runs on localhost:1234. Configure OpenClaw to use this as a custom OpenAI-compatible endpoint.
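Before pointing OpenClaw at the endpoint, it can help to confirm the server is reachable. A minimal check, assuming the LM Studio local server is running on the default port with a model loaded:

```shell
# List the models LM Studio is currently serving via its
# OpenAI-compatible API (assumes the local server is started).
curl http://localhost:1234/v1/models
```

If the server is up, this returns a JSON list of loaded models; a connection error means the server has not been started in LM Studio yet.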
LM Studio advantages over Ollama: visual model management, real-time performance metrics, easy parameter tuning (temperature, top-p), and a built-in chat for testing before connecting to OpenClaw.
Disadvantages: slightly less automation-friendly (GUI-focused), no daemon mode on Linux, and some advanced features require the paid version.
For most users who prefer a GUI, LM Studio is the easier path to local models. For headless servers and automation, Ollama is recommended.
```shell
# Configure OpenClaw for LM Studio
openclaw config set provider openai-compatible
openclaw config set baseUrl http://localhost:1234/v1
openclaw config set model local-model
openclaw config set apiKey lm-studio
```
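After configuring, you can sanity-check the endpoint directly with a chat completion request. This is a sketch assuming the default port; the model name is effectively a placeholder, since LM Studio routes requests to whichever model is currently loaded:

```shell
# Send a minimal chat completion to the LM Studio server.
# "local-model" is a placeholder name; LM Studio serves the loaded model.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local-model",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

A JSON response with a `choices` array confirms the full path works before involving OpenClaw.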