The Agents API is the next generation of our API, designed for better compatibility with modern AI SDKs like the Vercel AI SDK.

Overview

The Agents API represents a significant improvement over the Assistants API, with the main goal of providing native compatibility with industry-standard AI SDKs. The key difference is the removal of custom input/output transformations in favor of standard formats.

Why Migrate?

  • Vercel AI SDK compatibility: Works natively with AI SDK 5’s useChat hook
  • Standard formats: Uses industry-standard message formats instead of custom transformations
  • Better streaming: Native support for AI SDK streaming patterns
  • Future-proof: The Assistants API will be deprecated in a future release

Key Differences

Endpoint Changes

Assistants API                   Agents API                   Breaking?
/assistant/v1/chat/completions   /agent/v1/chat/completions   Yes - Format changes
/assistant/v1/create             /agent/v1/create             No - Only parameter names
/assistant/v1/get                /agent/v1/get                No - Only parameter names
/assistant/v1/update             /agent/v1/update             No - Only parameter names
/assistant/v1/models             /agent/v1/models             No - Identical

Parameter Changes (Non-Breaking)

For the create, get, and update endpoints, the only change is parameter naming:
  • assistantId → agentId
  • Request/response structure remains identical
  • All other parameters unchanged
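
For example, an update request body differs only in the identifier key. A minimal sketch (the name field and exact request shape are illustrative, not taken from the API reference):

// Before (Assistants API)
const body = { assistantId: "asst_123", name: "Support Assistant" };

// After (Agents API) - only the identifier key changes
const body = { agentId: "agent_123", name: "Support Assistant" };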

Breaking Changes in /chat/completions

The chat completions endpoint has significant format changes to support Vercel AI SDK compatibility.

Request Format Changes

Old Format (Assistants API)

{
  assistantId: "asst_123",
  messages: [
    {
      role: "user",
      content: "Hello, how are you?",  // Simple string content
      attachmentIds: ["uuid-1234"]
    }
  ],
  stream: true
}

New Format (Agents API)

{
  agentId: "agent_123",  // Parameter name changed
  messages: [
    {
      id: "msg_1",  // New: Message ID required
      role: "user",
      parts: [  // New: Parts array instead of content string
        {
          type: "text",
          text: "Hello, how are you?"
        },
        {
          type: "file",
          url: "attachment://uuid-1234"  // New: Attachment format
        }
      ]
    }
  ],
  stream: true
}

Key Request Differences

  1. Message Structure:
    • Old: content as string
    • New: parts as array with typed objects
  2. Message ID:
    • Old: Optional or auto-generated
    • New: Required id field for each message
  3. Attachments:
    • Old: attachmentIds array at message level
    • New: File parts with type: "file" in parts array
  4. Parameter Name:
    • Old: assistantId
    • New: agentId
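
If existing code still produces messages in the old shape, a small helper can convert them. This is an illustrative sketch, not an official utility: the function name is an assumption, and it reuses the nanoid ID generation and the attachment:// URL scheme shown elsewhere in this guide:

import { nanoid } from 'nanoid'; // or any unique ID generator

// Sketch: convert an Assistants-API-style message into the Agents API parts format.
function toAgentMessage(oldMessage) {
  const parts = [{ type: "text", text: oldMessage.content }];

  // Map old attachment IDs onto file parts.
  for (const attachmentId of oldMessage.attachmentIds ?? []) {
    parts.push({ type: "file", url: `attachment://${attachmentId}` });
  }

  return { id: nanoid(), role: oldMessage.role, parts };
}

// Example:
// toAgentMessage({ role: "user", content: "Analyze this document", attachmentIds: ["uuid-1234"] })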

Response Format Changes

Old Format (Assistants API)

{
  result: [
    {
      id: "msg_456",
      role: "assistant",
      content: [
        {
          type: "text",
          text: "I'm doing well, thank you!"
        }
      ]
    }
  ]
}

New Format (Agents API)

{
  id: "msg_456",
  role: "assistant",
  parts: [
    {
      type: "text",
      text: "I'm doing well, thank you!"
    }
  ]
}

Key Response Differences

  1. Top-level Structure:
    • Old: Wrapped in result array
    • New: Direct message object
  2. Content Field:
    • Old: content array
    • New: parts array
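
In practice, response parsing drops one level of nesting. A minimal sketch of extracting the reply text from each format, assuming a non-streaming JSON response (oldResponse and newResponse stand for the parsed response bodies):

// Before (Assistants API): unwrap the result array, then read content
const oldText = oldResponse.result[0].content[0].text;

// After (Agents API): the response is the message itself
const newText = newResponse.parts.find((p) => p.type === "text")?.text;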

Streaming Changes

Old Format (Assistants API)

data: {"type":"message","content":"Hello"}
data: {"type":"message","content":" world"}
data: {"type":"done"}

New Format (Agents API)

The Agents API streams in the Vercel AI SDK’s native streaming format:
0:"text chunk 1"
0:"text chunk 2"
...
This format is consumed directly by the useChat hook, so no custom SSE parsing is required.
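
If you are not using the AI SDK on the client, you can still read the stream by hand. This is a rough sketch that assumes the line-based format shown above (text chunks prefixed with 0: and JSON-encoded); in most cases the useChat hook or streamText handles this for you:

// Sketch: read the Agents API stream without the AI SDK.
const response = await fetch(url, options);
const reader = response.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  for (const line of decoder.decode(value, { stream: true }).split("\n")) {
    // Text chunks arrive as lines like: 0:"Hello"
    if (line.startsWith("0:")) {
      process.stdout.write(JSON.parse(line.slice(2)));
    }
  }
}
// Note: a production parser should also buffer partial lines split across chunks.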

Migration Steps

Step 1: Update Endpoint URLs

// Before
const url = 'https://api.langdock.com/assistant/v1/chat/completions';

// After
const url = 'https://api.langdock.com/agent/v1/chat/completions';

Step 2: Update Parameter Names (Non-Breaking Endpoints)

For the create, get, and update endpoints:
// Before
{ assistantId: "asst_123" }

// After
{ agentId: "agent_123" }

Step 3: Update Message Format (Breaking - Chat Completions)

Converting Messages

// Before (Assistants API)
const oldMessage = {
  role: "user",
  content: "Analyze this document",
  attachmentIds: ["uuid-1234"]
};

// After (Agents API)
const newMessage = {
  id: generateId(), // You need to generate message IDs
  role: "user",
  parts: [
    {
      type: "text",
      text: "Analyze this document"
    },
    {
      type: "file",
      url: "attachment://uuid-1234"
    }
  ]
};

Using with Vercel AI SDK

The Agents API works natively with Vercel AI SDK’s useChat hook:
import { useChat } from '@ai-sdk/react';

function ChatComponent() {
  const { messages, input, handleSubmit, handleInputChange } = useChat({
    api: 'https://api.langdock.com/agent/v1/chat/completions',
    headers: {
      'Authorization': `Bearer ${API_KEY}`
    },
    body: {
      agentId: 'agent_123'
    }
  });

  // The hook handles all message formatting automatically!
  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}

Step 4: Update Response Handling

// Before (Assistants API)
const response = await fetch(assistantUrl, options);
const data = await response.json();
const messages = data.result; // Array of messages

// After (Agents API)
const response = await fetch(agentUrl, options);
const data = await response.json();
const message = data; // Direct message object

Step 5: Update Streaming Code

Before (Custom SSE Parsing)

const response = await fetch(url, options);
const reader = response.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  const chunk = decoder.decode(value);
  const lines = chunk.split('\n');

  for (const line of lines) {
    if (line.startsWith('data: ')) {
      const data = JSON.parse(line.slice(6));
      if (data.type === 'message') {
        console.log(data.content);
      }
    }
  }
}

After (Vercel AI SDK)

import { streamText } from 'ai';

// Note: `langdock` is a placeholder for a custom AI SDK provider pointed at the
// Agents API endpoint; it is not exported by the `ai` package.
const result = await streamText({
  model: langdock({
    apiKey: process.env.LANGDOCK_API_KEY,
    agentId: 'agent_123'
  }),
  messages: conversationHistory // your existing message history in AI SDK format
});

// Stream text chunks
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

Code Examples

Complete Migration Example

Before (Assistants API)

const axios = require("axios");

async function chatWithAssistant() {
  const response = await axios.post(
    "https://api.langdock.com/assistant/v1/chat/completions",
    {
      assistantId: "asst_123",
      messages: [
        {
          role: "user",
          content: "What's the weather today?",
          attachmentIds: []
        }
      ],
      stream: false
    },
    {
      headers: {
        Authorization: "Bearer YOUR_API_KEY"
      }
    }
  );

  // Response wrapped in result array
  const assistantMessage = response.data.result[0];
  console.log(assistantMessage.content[0].text);
}

After (Agents API)

const axios = require("axios");

async function chatWithAgent() {
  const response = await axios.post(
    "https://api.langdock.com/agent/v1/chat/completions",
    {
      agentId: "agent_123",  // Changed parameter name
      messages: [
        {
          id: "msg_1",  // Added message ID
          role: "user",
          parts: [  // Changed to parts array
            {
              type: "text",
              text: "What's the weather today?"
            }
          ]
        }
      ],
      stream: false
    },
    {
      headers: {
        Authorization: "Bearer YOUR_API_KEY"
      }
    }
  );

  // Response is direct message object
  const agentMessage = response.data;
  console.log(agentMessage.parts[0].text);
}

Using with Next.js and Vercel AI SDK

// app/api/chat/route.ts
// Note: StreamingTextResponse is part of AI SDK 3.x; with newer SDK versions
// you can return a standard Response wrapping response.body instead.
import { StreamingTextResponse } from 'ai';

export async function POST(req: Request) {
  const { messages, agentId } = await req.json();

  const response = await fetch(
    'https://api.langdock.com/agent/v1/chat/completions',
    {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${process.env.LANGDOCK_API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        agentId,
        messages,
        stream: true
      })
    }
  );

  // Return streaming response
  return new StreamingTextResponse(response.body);
}

Testing Your Migration

Checklist

  • Update all endpoint URLs from /assistant/v1/* to /agent/v1/*
  • Replace assistantId with agentId in all requests
  • Convert message content strings to parts arrays (for chat completions)
  • Add id field to all messages (for chat completions)
  • Update attachment references to use file parts format
  • Update response handling to work with new format
  • Test streaming with new format (or use Vercel AI SDK)
  • Update error handling for new response structure

Gradual Migration Strategy

You can migrate endpoints gradually:
  1. Start with non-breaking endpoints: Migrate create, get, update, models first (only parameter names change)
  2. Test thoroughly: Ensure these work correctly
  3. Migrate chat completions last: This requires the most code changes
  4. Use feature flags: Toggle between old and new APIs during the transition (see the sketch below)
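
For the feature-flag approach, an environment variable that switches the endpoint and identifier key is often enough. A minimal sketch (the flag name and helper function are illustrative, not part of either API):

// Sketch: route chat requests to the old or new API behind an env flag.
const useAgentsApi = process.env.USE_AGENTS_API === "true";

function chatCompletions(id, messages) {
  // When the flag is on, `messages` must already be in the parts format (see Step 3).
  return fetch(
    useAgentsApi
      ? "https://api.langdock.com/agent/v1/chat/completions"
      : "https://api.langdock.com/assistant/v1/chat/completions",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.LANGDOCK_API_KEY}`,
        "Content-Type": "application/json"
      },
      body: JSON.stringify(
        useAgentsApi
          ? { agentId: id, messages, stream: true }
          : { assistantId: id, messages, stream: true }
      )
    }
  );
}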

Common Migration Issues

Issue 1: Missing Message IDs

Problem: Agents API requires message IDs
// Error: Missing id field
{
  role: "user",
  parts: [...]
}
Solution: Generate unique IDs for each message
import { nanoid } from 'nanoid';

{
  id: nanoid(),
  role: "user",
  parts: [...]
}

Issue 2: Attachment Format

Problem: Old attachment format not recognized
// Wrong
{
  role: "user",
  attachmentIds: ["uuid-1234"],
  parts: [...]
}
Solution: Use file parts
// Correct
{
  id: "msg_1",
  role: "user",
  parts: [
    { type: "text", text: "..." },
    { type: "file", url: "attachment://uuid-1234" }
  ]
}

Issue 3: Response Parsing

Problem: Looking for result array
// Wrong - result doesn't exist in Agents API
const messages = response.data.result;
Solution: Use direct message object
// Correct
const message = response.data;
const text = message.parts.find(p => p.type === 'text')?.text;

Support

If you encounter issues during migration:
  1. Check the Agents API documentation for detailed examples
  2. Review the Vercel AI SDK documentation for SDK-specific help
  3. Contact support at [email protected]

Timeline

  • Current: Both APIs are available
  • Future: Assistants API will be deprecated (date TBD)
  • Recommendation: Migrate new projects to Agents API now
For questions or assistance with migration, contact our support team at [email protected].