Whenever you submit a prompt, it is sent to a model, which generates an answer and returns it to you. Generating this answer incurs costs for the underlying AI model.
Options for LLM costs
There are two options for paying these costs to the model provider (e.g., Microsoft Azure for GPT models):
Option 1: AI Models Included
- You use Langdock’s API keys — no setup needed on your side
- All model usage is billed through Langdock as part of your per-seat price
- All users in the workspace get access to all models available through Langdock
- Higher per-seat price, but simple and predictable billing
Option 2: Bring Your Own Keys (BYOK)
- You bring your own API keys from the model provider (for example, Microsoft Azure)
- Lower per-seat price — you only pay Langdock for the platform license
- All model usage costs are settled directly between you and the model provider
Option 1 is the “all-inclusive” version of Langdock: you don’t have to set up or manage keys on your side (obtaining keys for the models, requesting quota, keeping models updated, etc.). Option 2 gives you full control over model costs and access to models not available through Langdock’s included pool.
For exact pricing, see the pricing page or the pricing docs.
How BYOK works
When BYOK is enabled, your workspace only uses models you have configured yourself — this includes completion models, embedding models, image generation models, and backbone models (used for summarization, title generation, etc.).
You must configure your own embedding and backbone models when using BYOK. Without a backbone model, features like conversation titles and summarization won’t work correctly.
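In practice, bringing your own keys means collecting credentials from the model provider and entering them in Langdock’s model configuration. Purely as an illustration, the values you would typically gather from an Azure OpenAI resource look like the following (the variable names below are illustrative assumptions, not Langdock settings):

```shell
# Hypothetical example: credentials collected from the Azure portal
# before entering them in Langdock's model settings (names are illustrative).
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"  # your resource endpoint
export AZURE_OPENAI_API_KEY="<your-api-key>"                         # key from the Azure portal
export AZURE_OPENAI_DEPLOYMENT="<your-deployment-name>"              # the deployment you created
```

You would gather an equivalent set of values for each model you configure, including the embedding and backbone models mentioned above.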
Setting up BYOK
You can select BYOK when first upgrading from a trial to a paid plan: during checkout, a checkbox lets you include or exclude AI models. If you want to switch to BYOK after your initial subscription, contact the Langdock team to have it activated for your workspace.
Here is a step-by-step guide for setting up BYOK.