Mistral models can be configured directly in Langdock using the Mistral API.

Prerequisites

Before setting up Mistral models, you need:
  1. A Mistral account at console.mistral.ai
  2. An API key from the Mistral platform
  3. Admin access to your Langdock workspace

Setup Steps

  1. Go to the model settings and click Add Model
  2. Configure the Display Settings:
    • Provider: Mistral
    • Model name: The model you want to add (e.g., Mistral Large, Codestral)
    • Hosting provider: Mistral (or Azure if using Azure-hosted Mistral)
    • Region: Select the appropriate region
    • Image analysis: Disabled for text-only models, enabled for multimodal variants
  3. Fill in the Model Configuration:
    • SDK: Select Mistral
    • Base URL: Leave empty to use the default (https://api.mistral.ai/v1) or specify a custom endpoint
    • Model ID: Use the official model identifier (see Model IDs below)
    • API key: Paste your Mistral API key
    • Context Size: Set according to the model (check Mistral’s documentation for current values)
  4. Click Save and test the model by sending a prompt before making it visible to all users
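Before wiring the key into Langdock, you can sanity-check it against Mistral's chat completions endpoint directly. This is a minimal sketch using only the Python standard library; the model ID and prompt are placeholders, and it assumes the key is stored in a `MISTRAL_API_KEY` environment variable:

```python
import json
import os
import urllib.request

API_KEY = os.environ.get("MISTRAL_API_KEY", "")
BASE_URL = "https://api.mistral.ai/v1"  # the default endpoint from the setup steps

def build_chat_request(model_id: str, prompt: str) -> dict:
    """Build the JSON body for a minimal chat completion request."""
    return {
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("mistral-small-latest", "Say hello in one word.")

# Uncomment to actually send the request once MISTRAL_API_KEY is set:
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={
#         "Authorization": f"Bearer {API_KEY}",
#         "Content-Type": "application/json",
#     },
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this call succeeds outside Langdock but the model still fails inside it, the problem is in the Langdock configuration rather than the key.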

Model IDs

Mistral uses the -latest suffix to always point to the current version of each model:
Model ID             | Type
mistral-large-latest | Flagship model — complex reasoning, multilingual, instruction following
codestral-latest     | Code-specialized — code generation, completion, and technical tasks
mistral-small-latest | Fast and cost-effective — good for everyday tasks
Check Mistral’s model documentation for the full list of available models, their specifications, and context sizes.
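You can also list the model IDs your key has access to programmatically via the `/v1/models` endpoint. A sketch using the Python standard library, assuming the key is in a `MISTRAL_API_KEY` environment variable:

```python
import json
import os
import urllib.request

API_KEY = os.environ.get("MISTRAL_API_KEY", "")

# GET /v1/models returns the models available to this API key.
req = urllib.request.Request(
    "https://api.mistral.ai/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
print(req.full_url)  # → https://api.mistral.ai/v1/models

# Uncomment once the key is set:
# with urllib.request.urlopen(req) as resp:
#     models = json.load(resp)
#     print([m["id"] for m in models["data"]])
```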

Using Mistral from Azure

If you’re using Mistral models hosted on Azure (via Azure AI Models-as-a-Service), you still need to select “Mistral” as the SDK in Langdock. The SDK selection refers to the API format, not the hosting provider.
When configuring Azure-hosted Mistral models:
  • Set the Hosting provider to Azure in Display Settings
  • Set the SDK to “Mistral” in Model Configuration
  • Use your Azure endpoint as the Base URL
  • Use your Azure API key
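Put side by side, the two setups differ only in endpoint and key; the SDK stays the same. An illustrative sketch (the field names mirror the Langdock form, and the endpoint/key values are placeholders, not real values):

```python
# Direct Mistral hosting: Base URL left empty falls back to the default endpoint.
mistral_direct = {
    "sdk": "Mistral",
    "base_url": None,  # empty → https://api.mistral.ai/v1
    "model_id": "mistral-large-latest",
    "api_key": "<your Mistral API key>",
}

# Azure-hosted Mistral: same SDK (the SDK reflects the API format, not the host),
# but the Base URL and key come from the Azure portal.
azure_hosted = {
    "sdk": "Mistral",
    "base_url": "<your Azure endpoint>",
    "model_id": "mistral-large-latest",
    "api_key": "<your Azure API key>",
}
```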

Configuration Notes

  • Mistral models support tool calling natively
  • The default API endpoint https://api.mistral.ai/v1 is used automatically when no custom Base URL is provided
  • Mistral models are known for strong multilingual capabilities, particularly in European languages
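Since tool calling is supported natively, a Mistral chat request can carry function definitions alongside the messages. A sketch of what such a request body looks like — the `get_weather` tool and its parameters are hypothetical, for illustration only:

```python
# An OpenAI-style function tool definition as accepted by Mistral's chat API.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

payload = {
    "model": "mistral-large-latest",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}
```

When the model decides to use a tool, the response contains a tool call with the function name and JSON arguments for your application to execute.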

Troubleshooting

Model not responding:
  • Verify your API key is valid and has not expired
  • Check that you have sufficient credits in your Mistral account
  • Ensure the model ID matches exactly (case-sensitive)
Authentication errors with Azure:
  • Double-check that you’re using “Mistral” as the SDK, not “Azure OpenAI”
  • Verify your Azure endpoint URL is correct and accessible
  • Ensure your Azure API key has the necessary permissions
Slow responses:
  • Larger models may take longer for complex reasoning tasks
  • Consider using a smaller model for faster responses on simpler tasks
If you run into any issues, contact support@langdock.com.