Google Completion Endpoint (v1beta)
This endpoint exposes Google Gemini models hosted in Google Vertex AI. It mirrors the structure of the official Vertex generateContent API. To use it, you need to:
1. Get available models – Call GET /{region}/v1beta/models/ to retrieve the list of Gemini models.
2. Pick a model & action – Choose a model ID and decide between generateContent or streamGenerateContent.
3. Send your request – POST to /{region}/v1beta/models/{model}:{action} with your prompt in contents.
4. Handle the response – Parse the JSON response for normal calls or consume the SSE events for streaming.
Key features:
• Region selection in the request path (eu or us)
• Optional Server-Sent Event (SSE) streaming with the same event labels used by the Google Python SDK (message_start, message_delta, message_stop)
• A models discovery endpoint
Base URL
Authentication

Send one of the supported authentication headers with your Langdock API key. All headers are treated identically. Missing or invalid keys return 401 Unauthorized. An example Authorization header appears in the request sketch in section 1 below.

1. List available models
GET /{region}/v1beta/models

region must be eu or us.
Successful response
List of objects with the following shape:
- name – Fully-qualified model name (e.g. models/gemini-2.5-flash).
- displayName – Human-readable name shown in the Langdock UI.
- supportedGenerationMethods – Always ["generateContent", "streamGenerateContent"].
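The sketch below (Python with the requests library) shows step 1 end to end. The base URL, the google/ path prefix, and the Authorization: Bearer header form are assumptions inferred from the example path and the authentication notes above; substitute the values for your workspace.

```python
import os
import requests

# Assumed base URL; the "/google" prefix matches the example path shown in
# section 2 below. Replace with the base URL of your Langdock workspace.
BASE_URL = "https://api.langdock.com/google"
REGION = "eu"  # must be "eu" or "us"

# Assumed header form; any of the accepted auth headers works identically.
headers = {"Authorization": f"Bearer {os.environ['LANGDOCK_API_KEY']}"}

resp = requests.get(f"{BASE_URL}/{REGION}/v1beta/models", headers=headers)
resp.raise_for_status()

data = resp.json()
# Assumption: the list may be wrapped in a "models" key as in the official
# Google API; fall back to a bare list otherwise.
models = data.get("models", data) if isinstance(data, dict) else data

for model in models:
    print(model["name"], "-", model["displayName"])
    print("  methods:", ", ".join(model["supportedGenerationMethods"]))
```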
2. Generate content
POST /{region}/v1beta/models/{model}:{action}
• model – The model ID as returned by the models endpoint (without the models/ prefix).
• action – generateContent or streamGenerateContent, depending on whether you want a streamed response.
Example path: google/eu/v1beta/models/gemini-2.5-flash:streamGenerateContent
Request body

The request body follows the official GenerateContentRequest structure.
Required fields

- contents (Content[], required) – Conversation history. Each object has a role (string) and a parts array containing objects with text (string).
- model (string, required) – The model to use for generation (e.g., “gemini-2.5-pro”, “gemini-2.5-flash”).
Optional fields

generationConfig (object, optional) – Configuration for text generation. Supported fields:
- temperature (number): Controls randomness (0.0-2.0)
- topP (number): Nucleus sampling parameter (0.0-1.0)
- topK (number): Top-k sampling parameter
- candidateCount (number): Number of response candidates to generate
- maxOutputTokens (number): Maximum number of tokens to generate
- stopSequences (string[]): Sequences that will stop generation
- responseMimeType (string): MIME type of the response
- responseSchema (object): Schema for structured output
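For illustration, a generationConfig object using the fields above might look like this (all values are arbitrary examples):

```python
generation_config = {
    "temperature": 0.7,        # randomness, 0.0-2.0
    "topP": 0.95,              # nucleus sampling, 0.0-1.0
    "topK": 40,                # top-k sampling
    "candidateCount": 1,       # number of response candidates
    "maxOutputTokens": 1024,   # cap on generated tokens
    "stopSequences": ["END"],  # generation stops when one of these appears
    # Structured output (optional):
    # "responseMimeType": "application/json",
    # "responseSchema": {"type": "object", "properties": {"answer": {"type": "string"}}},
}
```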
safetySettings (SafetySetting[], optional) – Array of safety setting objects. Each object contains:
- category (string): The harm category (e.g., “HARM_CATEGORY_HARASSMENT”)
- threshold (string): The blocking threshold (e.g., “BLOCK_MEDIUM_AND_ABOVE”)
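For example, a safetySettings array that blocks harassment content at a medium threshold:

```python
safety_settings = [
    {
        "category": "HARM_CATEGORY_HARASSMENT",  # harm category to configure
        "threshold": "BLOCK_MEDIUM_AND_ABOVE",   # blocking threshold
    },
]
```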
tools (Tool[], optional) – Array of tool objects for function calling. Each tool contains a functionDeclarations array with:
- name (string): Function name
- description (string): Function description
- parameters (object): JSON schema defining function parameters
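A sketch of a tools array declaring a single, hypothetical get_weather function:

```python
tools = [
    {
        "functionDeclarations": [
            {
                "name": "get_weather",  # hypothetical function name
                "description": "Look up the current weather for a city.",
                "parameters": {  # JSON schema for the function arguments
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    }
]
```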
toolConfig (object, optional) – Configuration for function calling. Contains functionCallingConfig with:
- mode (string): Function calling mode (“ANY”, “AUTO”, “NONE”)
- allowedFunctionNames (string[]): Array of allowed function names
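Continuing the hypothetical get_weather example, a toolConfig that restricts calls to that one function would be:

```python
tool_config = {
    "functionCallingConfig": {
        "mode": "ANY",  # must be ANY whenever allowedFunctionNames is set (see note below)
        "allowedFunctionNames": ["get_weather"],
    }
}
```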
systemInstruction (string | Content, optional) – System instruction to guide the model’s behavior. Can be a string or a Content object with role and parts.
If toolConfig.functionCallingConfig.allowedFunctionNames is provided, mode must be ANY.

Minimal example
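A minimal non-streaming sketch in Python, reusing the assumed base URL and auth header from the model-listing example above:

```python
import os
import requests

BASE_URL = "https://api.langdock.com/google"  # assumed base URL, see note above
REGION = "eu"
MODEL = "gemini-2.5-flash"

body = {
    "model": MODEL,  # the request body also carries the model id (required field)
    "contents": [
        {"role": "user", "parts": [{"text": "Write a one-line greeting."}]}
    ],
}

resp = requests.post(
    f"{BASE_URL}/{REGION}/v1beta/models/{MODEL}:generateContent",
    headers={"Authorization": f"Bearer {os.environ['LANGDOCK_API_KEY']}"},
    json=body,
)
resp.raise_for_status()

# The response mirrors the official GenerateContentResponse:
# candidates -> content -> parts -> text.
print(resp.json()["candidates"][0]["content"]["parts"][0]["text"])
```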
Streaming

When action is streamGenerateContent, the endpoint returns a text/event-stream with compatible events:
• message_start – first chunk that contains content
• message_delta – subsequent chunks
• message_stop – last chunk (contains finishReason and usage metadata)

An illustrative message_delta event and a minimal consumer sketch follow.
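The exact chunk payload is not reproduced in this excerpt; the sketch below assumes each data: line carries a GenerateContentResponse-style chunk (since the endpoint mirrors the official API) and prints text as it arrives.

```python
import json
import os
import requests

# Illustrative message_delta event (assumed shape, mirroring GenerateContentResponse):
#   event: message_delta
#   data: {"candidates": [{"content": {"role": "model", "parts": [{"text": "Hello"}]}}]}

BASE_URL = "https://api.langdock.com/google"  # assumed base URL, see note above
REGION = "eu"
MODEL = "gemini-2.5-flash"

body = {
    "model": MODEL,
    "contents": [{"role": "user", "parts": [{"text": "Stream a short poem."}]}],
}

with requests.post(
    f"{BASE_URL}/{REGION}/v1beta/models/{MODEL}:streamGenerateContent",
    headers={"Authorization": f"Bearer {os.environ['LANGDOCK_API_KEY']}"},
    json=body,
    stream=True,
) as resp:
    resp.raise_for_status()
    event = None
    for raw in resp.iter_lines(decode_unicode=True):
        if not raw:
            continue
        if raw.startswith("event:"):
            event = raw.split(":", 1)[1].strip()  # message_start / message_delta / message_stop
        elif raw.startswith("data:"):
            chunk = json.loads(raw.split(":", 1)[1])
            candidate = chunk.get("candidates", [{}])[0]
            if event in ("message_start", "message_delta"):
                # Assumed location of the streamed text; verify against real events.
                for part in candidate.get("content", {}).get("parts", []):
                    print(part.get("text", ""), end="", flush=True)
            elif event == "message_stop":
                print("\nfinishReason:", candidate.get("finishReason"))
```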
Using Google-compatible libraries

The endpoint is fully compatible with official Google SDKs, including the Vertex AI Node SDK (@google-cloud/vertexai), the Google Generative AI Python library (google-generativeai), and the Vercel AI SDK for edge streaming.
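As a hedged illustration with the Python library, you can point google-generativeai at the endpoint through client_options; the api_endpoint value below is a placeholder assumption and must be replaced with the host your Langdock workspace exposes.

```python
import os
import google.generativeai as genai

genai.configure(
    api_key=os.environ["LANGDOCK_API_KEY"],
    transport="rest",  # the endpoint is plain HTTPS, so use the REST transport
    client_options={"api_endpoint": "https://api.langdock.com"},  # placeholder host
)

model = genai.GenerativeModel("gemini-2.5-flash")
response = model.generate_content("Say hello in one sentence.")
print(response.text)
```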