How to Use Mistral with OpenClaw?
Mistral AI offers capable models with strong performance at competitive prices. Their models are popular in the European market and known for good multilingual support.
Get your API key from console.mistral.ai. Mistral offers several models: Mistral Large (flagship, comparable to GPT-4o), Mistral Small (balanced), and Mistral Nemo (open-weight, available via Ollama).
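Before wiring the key into OpenClaw, you can sanity-check it directly against Mistral's chat completions endpoint. This is a minimal sketch using Mistral's OpenAI-style API; the model alias follows the naming above:

```shell
# Verify the key works before configuring OpenClaw
export MISTRAL_API_KEY=your-mistral-key

curl https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral-small-latest",
    "messages": [{"role": "user", "content": "Say hello in French"}]
  }'
```

A 200 response with a JSON completion confirms the key is valid; a 401 means the key is wrong or not yet activated.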
For cloud usage, configure OpenClaw with the Mistral provider and your API key. Pricing is competitive: Mistral Large costs $2 per 1M input tokens and $6 per 1M output tokens; Mistral Small costs $0.2 and $0.6 per 1M tokens, respectively.
Mistral Nemo can also run locally via Ollama. Pull it with ollama pull mistral-nemo and configure OpenClaw to use Ollama as the provider. It's a 12B model that fits in 16 GB RAM.
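Before pointing OpenClaw at the local model, it's worth a quick smoke test that Ollama can serve it:

```shell
# Download the open-weight model (quantized build)
ollama pull mistral-nemo

# Confirm it responds locally before configuring OpenClaw
ollama run mistral-nemo "Summarize what a config file is in one sentence."
```

If the prompt returns a response, the Ollama server is up and the model is ready for OpenClaw to use.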
Mistral models excel at: code generation, European language support, structured output, and fast inference. They support function calling for MCP skill integration.
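Function calling works through the standard tools parameter on the chat completions endpoint. The sketch below passes a hypothetical get_weather tool for illustration; the tool name and schema are made up, but the request shape follows Mistral's API:

```shell
# Hedged example: get_weather is a hypothetical tool, not a real Mistral feature
curl https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral-large-latest",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
```

Instead of a text answer, the model should return a tool call naming get_weather with a city argument, which is the mechanism MCP skill integration builds on.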
Alternatively, access Mistral models through OpenRouter to avoid managing another API key.
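Configuration via OpenRouter follows the same pattern as the direct setup. The provider name openrouter and the model slug mistralai/mistral-large are assumptions here; check OpenClaw's supported provider list and OpenRouter's model catalog for the exact identifiers:

```shell
# Assumed provider name and model slug; verify against your OpenClaw version
openclaw config set provider openrouter
openclaw config set model mistralai/mistral-large
openclaw config set apiKey your-openrouter-key
```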
```shell
# Cloud API
openclaw config set provider mistral
openclaw config set model mistral-large-latest
openclaw config set apiKey your-mistral-key

# Or local via Ollama
ollama pull mistral-nemo
openclaw config set provider ollama
openclaw config set model mistral-nemo
```