Prerequisites
Before setting up Mistral models, you need:
- A Mistral account at console.mistral.ai
- An API key from the Mistral platform
- Admin access to your Langdock workspace
Setup Steps
- Go to the model settings and click on Add Model
- Configure the Display Settings:
- Provider: Mistral
- Model name: The model you want to add (e.g., Mistral Large, Codestral)
- Hosting provider: Mistral (or Azure if using Azure-hosted Mistral)
- Region: Select the appropriate region
- Image analysis: Disabled for text-only models, enabled for multimodal variants
- Configure the Model Configuration:
- SDK: Select Mistral
- Base URL: Leave empty to use the default (https://api.mistral.ai/v1) or specify a custom endpoint
- Model ID: Use the official model identifier (see Model IDs below)
- API key: Paste your Mistral API key
- Context Size: Set according to the model (check Mistral’s documentation for current values)
- Click Save and test the model by sending a prompt before making it visible to all users
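If you want to confirm the API key and model ID outside of Langdock before saving, you can send the same test prompt directly to Mistral's chat completions endpoint. The sketch below is a minimal, standard-library-only example; it assumes the default base URL from the steps above and the documented /chat/completions path.

```python
import json
import urllib.request

MISTRAL_BASE_URL = "https://api.mistral.ai/v1"  # default endpoint from the setup above

def build_test_request(api_key: str, model_id: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request for a quick key/model sanity check."""
    body = json.dumps({
        "model": model_id,  # must match the official identifier exactly (case-sensitive)
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{MISTRAL_BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Usage (requires a valid key; performs a live API call):
#   req = build_test_request("YOUR_API_KEY", "mistral-large-latest", "Say hello.")
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

A plain HTTP 200 with a message in the response is enough to confirm the key, model ID, and endpoint before you make the model visible to all users.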
Model IDs
Mistral uses the -latest suffix to always point to the current version of each model:
| Model ID | Type |
|---|---|
| mistral-large-latest | Flagship model — complex reasoning, multilingual, instruction following |
| codestral-latest | Code-specialized — code generation, completion, and technical tasks |
| mistral-small-latest | Fast and cost-effective — good for everyday tasks |
Check Mistral’s model documentation for the full list of available models, their specifications, and context sizes.
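One way to see exactly which model IDs your API key can use is Mistral's models listing endpoint (GET /v1/models). The helper below only builds the request; the base URL parameter mirrors the optional custom endpoint from the setup steps.

```python
import urllib.request

def build_list_models_request(api_key: str,
                              base_url: str = "https://api.mistral.ai/v1") -> urllib.request.Request:
    """Build a GET /models request; the response lists the model IDs available to your key."""
    return urllib.request.Request(
        url=f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# Usage (live call):
#   import json
#   with urllib.request.urlopen(build_list_models_request("YOUR_API_KEY")) as resp:
#       print([m["id"] for m in json.load(resp)["data"]])
```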
Using Mistral from Azure
If you’re using Mistral models hosted on Azure (via Azure AI Models-as-a-Service), you still need to select “Mistral” as the SDK in Langdock. The SDK selection refers to the API format, not the hosting provider.
Configuration Notes
- Mistral models support tool calling natively
- The default API endpoint https://api.mistral.ai/v1 is used automatically when no custom Base URL is provided
- Mistral models are known for strong multilingual capabilities, particularly in European languages
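Since tool calling is supported natively, a chat-completion request can include a tools array in Mistral's function-calling format. The sketch below builds such a payload; get_weather is a hypothetical tool used only for illustration, not part of any real API.

```python
import json

def build_tool_call_payload(model_id: str, user_message: str) -> str:
    """Build a chat-completion payload that offers the model one example tool."""
    # Hypothetical tool definition (JSON Schema parameters), for illustration only.
    weather_tool = {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
    return json.dumps({
        "model": model_id,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [weather_tool],
        "tool_choice": "auto",  # let the model decide whether to call the tool
    })
```

If the model decides to use the tool, the response contains a tool call with the arguments to execute on your side, rather than a plain text answer.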
Troubleshooting
Model not responding:
- Verify your API key is valid and has not expired
- Check that you have sufficient credits in your Mistral account
- Ensure the model ID matches exactly (case-sensitive)
Azure connection issues:
- Double-check that you’re using “Mistral” as the SDK, not “Azure OpenAI”
- Verify your Azure endpoint URL is correct and accessible
- Ensure your Azure API key has the necessary permissions
Slow responses:
- Larger models may take longer for complex reasoning tasks
- Consider using a smaller model for faster responses on simpler tasks