
Langdock requires four types of models to cover all platform features: completion, image generation, embedding, and Deep Research models. Add them in Workspace Settings -> Models. For setup instructions per model type, see Adding your own models.

1. Set up your models in Langdock

Completion models

These are the models your users select in chat. Add the models you want to make available from the providers you have keys for. We support models hosted by Microsoft Azure, AWS Bedrock, Google Vertex AI, Google AI Studio, OpenAI, Anthropic, Mistral, DeepSeek, Perplexity, Black Forest Labs, Replicate, and any OpenAI-compatible endpoint. For quotas, plan for 200k–500k TPM (tokens per minute) per model, which covers around 200 users; your most-used model may need 500k–1M TPM.
We recommend setting up multiple deployments in different regions. If a model fails in one region, Langdock automatically retries in another.
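As a rough sizing sketch, the per-user numbers below are back-of-the-envelope assumptions derived from the 200-user guideline above, not Langdock-published figures:

```python
def tpm_quota_range(users: int) -> tuple[int, int]:
    """Rough TPM quota estimate, scaling the guideline that
    200k-500k TPM covers around 200 users (i.e. roughly
    1k-2.5k TPM per user, an assumed linear extrapolation)."""
    low_per_user, high_per_user = 1_000, 2_500
    return users * low_per_user, users * high_per_user

low, high = tpm_quota_range(200)
print(f"200 users: {low:,}-{high:,} TPM")  # 200 users: 200,000-500,000 TPM
```

For your most-used model, size toward the top of the returned range or beyond, per the guideline above.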

Image generation models

Add at least one image generation model so users can generate images from chat. See the Image tab in the adding models guide for setup instructions.

Embedding model

Embedding models power document search and Knowledge Folders. The platform requires a model with 1536 dimensions. Any OpenAI or Azure-compatible embedding model with 1536 dimensions will work.
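Before wiring an embedding model into Langdock, you can sanity-check its output dimension. This sketch assumes the OpenAI-compatible `/v1/embeddings` response shape and uses a placeholder response in place of a real API call:

```python
REQUIRED_DIM = 1536  # Langdock's required embedding dimension

def embedding_dimension(response: dict) -> int:
    """Read the vector length from an OpenAI-compatible
    /v1/embeddings response body."""
    return len(response["data"][0]["embedding"])

# Placeholder standing in for a real API response.
fake_response = {"data": [{"embedding": [0.0] * 1536}]}
dim = embedding_dimension(fake_response)
assert dim == REQUIRED_DIM, f"model returns {dim} dims, need {REQUIRED_DIM}"
print(f"OK: {dim} dimensions")
```

Run the same check against your provider's actual response before adding the model; a model with any other dimension will not work with document search.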

Deep Research models

Deep Research requires three dedicated models. Go to Settings > Products > Deep Research and assign one model to each role:
  • Reasoning Model: plans the research, evaluates results, and generates the final report. Recommended: o3
  • Fast Reasoning Model: generates per-task summaries during the research loop. Recommended: o4 Mini
  • Backbone Model: handles loop decisions and polishes the final report. Recommended: GPT-5 Mini
You can set a usage limit in the Deep Research settings (default: 15 queries per user per 30 days).
Deep Research uses the reasoning and verbosity settings configured on each model. If these are wrong or missing, Deep Research will produce poor results or fail entirely.
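The role assignment happens in the Langdock UI, not in code; the sketch below is purely illustrative, using the recommended model names from the table above as placeholder values, and shows the kind of completeness check to do before moving on:

```python
# Illustrative only: roles are assigned in Settings > Products > Deep
# Research. The model names are the doc's recommendations, not fixed values.
DEEP_RESEARCH_ROLES = {
    "reasoning_model": "o3",
    "fast_reasoning_model": "o4-mini",
    "backbone_model": "gpt-5-mini",
}

def missing_roles(config: dict) -> list[str]:
    """Return any of the three required Deep Research roles left unassigned."""
    required = ("reasoning_model", "fast_reasoning_model", "backbone_model")
    return [role for role in required if not config.get(role)]

assert missing_roles(DEEP_RESEARCH_ROLES) == []
```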
Checklist before continuing:
  • 1+ completion models from the providers you want to offer
  • 1+ image generation models
  • 1 embedding model (1536 dimensions)
  • 3 Deep Research models configured with the correct settings above

2. Reach out to the Langdock team

Once your models are set up, contact the Langdock team, and we will activate BYOK for your workspace on our side.

3. Test your models

Once BYOK is active, verify each model type works:
  • Completion models: send a prompt to each model in the interface (e.g. “write a story about dogs”).
  • Image model: ask any model to generate an image.
  • Embedding model: create a folder in the Library, upload a file, then @mention the folder in a chat and ask a question about the file. You should get an answer based on the file content.
  • Deep Research: run a Deep Research query and check that the report is detailed and well-structured. If it’s shallow, recheck the model configuration settings above.
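The completion-model smoke test can also be scripted against any OpenAI-compatible endpoint. This sketch only builds and validates the request body (the model name is a placeholder); sending it to your endpoint with your API key is left to you:

```python
import json

def smoke_test_request(model: str,
                       prompt: str = "write a story about dogs") -> dict:
    """Build a chat-completions payload for a one-prompt smoke test."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,  # keep the smoke test cheap
    }

payload = smoke_test_request("gpt-4o")  # placeholder model name
body = json.dumps(payload)  # this string is what you would POST
assert json.loads(body)["messages"][0]["content"] == "write a story about dogs"
```

Repeat the call once per completion model you added, swapping in each model name.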
Contact support@langdock.com if anything isn’t working.