

Every prompt sent to a model incurs a cost from the model provider. There are two ways to cover that cost in Langdock.

Option 1: AI Models Included

Langdock covers the model costs as part of your per-seat price. You get access to all available models with no setup required. This is the simplest option. Everything is managed by Langdock.

Option 2: Bring Your Own Keys (BYOK)

You connect Langdock to your own API keys from a model provider (e.g. Microsoft Azure, Anthropic, Google). You pay the model provider directly, and pay Langdock only for the platform license. This gives you full control over which models are available, what quotas apply, and what you spend. For exact pricing, see the pricing page.

How BYOK works

When BYOK is enabled, your workspace uses only the models you have configured yourself. This includes completion models, embedding models, and image generation models.
You must configure at least one of each model type.
  • Completion model: powers chat and agents
  • Embedding model: enables document search and Folders
  • Image generation model: allows users to generate images
Missing any of these will leave parts of the platform non-functional.

Usage Export

Because model costs flow through your own API keys, BYOK workspaces get detailed cost and token data in the Usage Export. This gives you visibility into what your team is spending across users and models, so you can track costs, identify high-usage models, and optimize over time. For CSV downloads from workspace settings, see the Usage Exports page; for programmatic access, see the Usage Export API.
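As a rough illustration of how an export like this can be analyzed, here is a minimal sketch that aggregates cost per model from a CSV download. The column names (`user`, `model`, `tokens`, `cost_usd`) are assumptions for the example, not the documented export schema; adjust them to match the columns in your actual file.

```python
import csv
import io
from collections import defaultdict

# Hypothetical excerpt of a Usage Export CSV. The real export's
# column names may differ -- check your downloaded file.
SAMPLE_CSV = """user,model,tokens,cost_usd
alice@example.com,gpt-4o,1200,0.024
bob@example.com,claude-3-5-sonnet,800,0.012
alice@example.com,gpt-4o,300,0.006
"""

def cost_per_model(csv_text: str) -> dict:
    """Sum the cost column per model across all rows."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["model"]] += float(row["cost_usd"])
    # Round to avoid floating-point noise in the summed values.
    return {model: round(cost, 6) for model, cost in totals.items()}

print(cost_per_model(SAMPLE_CSV))
```

The same aggregation works for per-user spend by grouping on the `user` column instead, which is useful for spotting high-usage teams before the provider invoice arrives.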

Setting up BYOK

You can select BYOK when first upgrading from a trial to a paid plan. If you want to switch after your initial subscription, contact the Langdock team. See the Getting started guide for the full setup steps.