Whenever you submit a prompt, the model generates an answer and sends it back to you. Generating this answer incurs costs for the underlying AI model.

Options for LLM costs

There are two options for paying these costs to the model provider (e.g., Microsoft Azure for GPT models):

Option 1: Flat fee for LLM costs

  • You use Langdock’s own API keys (for example, from Microsoft)
  • All usage is billed through Langdock
  • Langdock offers all users in the workspace access to all models at a flat fee
  • This flat fee currently costs €5 per user per month

Option 2: Bring your own keys (BYOK)

  • You bring your own API keys from the model provider (for example, Microsoft)
  • You only pay Langdock the licensing fee for the platform
  • All model/usage-related costs are directly between you and the model provider

Option 1 is the “all-inclusive” version of Langdock: you don’t have to set up and manage keys on your side (getting keys for the models, requesting quota, keeping models updated, etc.). Option 2 tends to be a bit cheaper overall. Note: We price the use of our API keys low because we don’t want to incentivize ourselves to make money through LLM API arbitrage; we want to stay focused on building a great application layer on top of LLMs.

Setting up BYOK

To use BYOK and avoid the flat fee for LLM costs, BYOK needs to be manually activated for your workspace by the Langdock team. Until it is activated, your workspace still uses Langdock’s models in the background (for embeddings and image generation). Here is a guide on how to set up BYOK.