Anthropic Messages
Creates a model response for a given chat conversation, expressed as a structured list of input messages. The endpoint follows the Anthropic API specification, and requests are served by the Anthropic models on AWS Bedrock.
To use the API, you need an API key. To request access, please contact us at support@langdock.com.
All parameters from the Anthropic "Create a message" endpoint are supported according to the Anthropic specification, with the following exception:

model
: The supported models depend on the region. Currently, the following models are supported:
EU: claude-3-5-sonnet-20240620, claude-3-sonnet-20240229, claude-3-haiku-20240307
US: claude-3-5-sonnet-20240620, claude-3-sonnet-20240229, claude-3-haiku-20240307, claude-3-opus-20240229
Rate limits
The rate limit for the Messages endpoint is 500 RPM (requests per minute) and 60,000 TPM (tokens per minute). Rate limits are defined at the workspace level, not at the API-key level. Each model has its own rate limit. If you exceed your rate limit, you will receive a 429 Too Many Requests response.
Please note that the rate limits are subject to change; refer to this documentation for the most up-to-date information. If you need a higher rate limit, please contact us at support@langdock.com.
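When a request is rejected with 429 Too Many Requests, a simple client-side retry with exponential backoff is usually enough. The sketch below is a hypothetical helper (plain Python, no SDK) illustrating the pattern; `send_request` stands in for any function that performs the HTTP call and returns a response with a status code.

```python
import random
import time


def with_backoff(send_request, max_retries=5, base_delay=1.0):
    """Call send_request, retrying on HTTP 429 with exponential backoff.

    send_request is assumed to return a dict with a "status" key;
    adapt the status check to whatever HTTP client you actually use.
    """
    for attempt in range(max_retries):
        response = send_request()
        if response.get("status") != 429:
            return response
        # Exponential backoff with jitter: base, 2*base, 4*base, ...
        time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)
    raise RuntimeError("Rate limit: retries exhausted")
```

Jitter (the random component) spreads out retries from concurrent clients so they do not all hit the endpoint again at the same moment.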
Using Anthropic-compatible libraries
As the request and response formats are the same as the Anthropic API's, you can use popular libraries such as the Anthropic Python library or the Vercel AI SDK with the Langdock API.
Example using the Anthropic Python library
Example using the Vercel AI SDK in Node.js
Headers
Authorization
: API key as Bearer token. Format: "Bearer YOUR_API_KEY"
Path Parameters
The region of the API to use. Available options: eu, us
Body
All body parameters follow the Anthropic "Create a message" specification; see the model restriction above.
Response
The response is of type object.