Whenever a user submits a prompt, the model generates an answer and sends it back to the user. This answer generation incurs costs for the underlying AI model.

There are two options for paying these costs to the model provider (e.g. Microsoft Azure for GPT models):

Option 1: Flat fee for LLM costs

  • The customer uses Langdock’s API keys (obtained from providers such as Microsoft).
  • All usage is billed through Langdock.
  • Langdock offers all users in the workspace access to all models at a flat fee.
  • This flat fee is currently €5 per user per month.

Option 2: Bring your own keys (BYOK)

  • The customer brings their own API keys from the model provider (for example, Microsoft).
  • Langdock only charges the licensing fee for the platform.
  • All model usage costs are settled directly between the customer and the model provider.

Option 1 is the “all-inclusive” version of Langdock, where the customer does not have to set up and manage keys on their side (obtaining keys for the models, requesting quota, keeping models updated, …). Option 2 tends to be a bit cheaper overall.
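To make the comparison concrete, here is a small sketch of the monthly cost under each option. Only the €5 flat fee comes from this page; the team size, per-user token volume, and provider price per 1,000 tokens are hypothetical placeholders, and real BYOK costs depend entirely on your provider contract and actual usage.

```python
# Illustrative cost comparison. Only the 5 EUR flat fee is from the pricing
# page; all other numbers are hypothetical assumptions for this example.

FLAT_FEE_PER_USER = 5.0  # EUR per user per month (Option 1)


def option1_cost(users: int) -> float:
    """Option 1: LLM usage is covered by Langdock's flat fee."""
    return users * FLAT_FEE_PER_USER


def option2_cost(users: int, tokens_per_user: int, price_per_1k: float) -> float:
    """Option 2 (BYOK): the customer pays the provider for actual usage.

    tokens_per_user and price_per_1k are placeholder values, not real
    provider pricing.
    """
    return users * (tokens_per_user / 1000) * price_per_1k


users = 200
print(option1_cost(users))                 # 200 users at the flat fee -> 1000.0 EUR
print(option2_cost(users, 150_000, 0.02))  # assumed usage and price -> 600.0 EUR
```

Under these assumed numbers BYOK comes out cheaper, matching the note above that Option 2 tends to be a bit cheaper overall; with heavier usage or higher provider prices, the flat fee can be the better deal.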

For transparency: we offer the use of our API keys at a low price because we do not want to incentivize ourselves to make money from LLM API arbitrage, and because we want to stay focused on building a great application layer on top of LLMs.

To enable BYOK and fully use your own keys, please reach out to the Langdock team. Until BYOK is enabled, Langdock continues to use its own keys in the background (for embeddings and image generation); the switch to your keys has to be enabled manually.