Overview

When writing custom code for integrations, actions, triggers, or workflow code nodes, you have access to a set of built-in utility functions. These functions run in a secure sandboxed JavaScript environment that provides essential capabilities without requiring external libraries.

What is the Sandbox?

The sandbox is a secure, isolated JavaScript execution environment that:
  • Runs untrusted code safely - Memory-limited and timeout-enforced execution
  • Provides essential utilities - HTTP requests, data conversions, cryptography
  • Prevents security risks - No file system access, no dangerous globals like eval or process
  • Requires no dependencies - No npm packages or external imports needed
Custom integration code runs in a secure sandboxed environment. You cannot install or import external libraries (npm, pip, etc.) - only the built-in JavaScript/Node.js APIs documented here are available. For advanced processing (e.g., PDF parsing, image manipulation), use external APIs or services and call them from your integration code.
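For example, a minimal sketch of delegating PDF text extraction to an external parsing service by calling it with ld.request() (documented below). The endpoint, auth field, and response shape are illustrative assumptions, not a specific provider:
// Minimal sketch: hand off heavy processing to an external API.
// The URL, auth field, and response shape are illustrative assumptions.
const response = await ld.request({
  method: "POST",
  url: "https://pdf-parser.example.com/v1/extract", // hypothetical service
  headers: {
    Authorization: `Bearer ${data.auth.parser_api_key}`,
    "Content-Type": "application/json",
  },
  body: {
    fileName: data.input.file.fileName,
    base64: data.input.file.base64,
  },
});

return { text: response.json.text };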

Where These Utilities Are Available

The sandbox utilities are available in:
  • Custom integration actions - Code that interacts with external APIs
  • Custom integration triggers - Code that monitors for events
  • Authentication flows - OAuth and API key validation code
  • Workflow code nodes - Custom JavaScript in workflow automations

HTTP & Networking

ld.request()

Make HTTP requests to external APIs with automatic JSON handling and error management. Parameters:
{
  method: string;           // HTTP method: 'GET', 'POST', 'PUT', 'PATCH', 'DELETE'
  url: string;              // Full URL to request
  headers?: object;         // Request headers
  params?: object;          // URL query parameters
  body?: object | string;   // Request body (auto-stringified if object)
  timeout?: number;         // Request timeout in milliseconds
  responseType?: string;    // 'stream' or 'binary' for file downloads
}
Returns:
{
  status: number;           // HTTP status code
  headers: object;          // Response headers
  json: any;                // Response body parsed as JSON
  text: string;             // Response body as text
  buffer: Buffer;           // Response body as buffer (for binary data)
}
Example: GET Request
const options = {
  method: "GET",
  url: "https://api.example.com/users/123",
  headers: {
    Authorization: `Bearer ${data.auth.access_token}`,
    Accept: "application/json",
  },
};

const response = await ld.request(options);
return response.json;
Example: POST Request with Body
const options = {
  method: "POST",
  url: "https://api.example.com/tickets",
  headers: {
    Authorization: `Bearer ${data.auth.api_key}`,
    "Content-Type": "application/json",
  },
  body: {
    title: data.input.title,
    description: data.input.description,
    priority: "high",
  },
};

const response = await ld.request(options);
return {
  ticketId: response.json.id,
  url: response.json.url,
};
Example: File Download
const options = {
  method: "GET",
  url: `https://api.example.com/files/${data.input.fileId}/download`,
  headers: {
    Authorization: `Bearer ${data.auth.access_token}`,
  },
  responseType: "binary", // or 'stream'
};

const response = await ld.request(options);

return {
  files: {
    fileName: "document.pdf",
    mimeType: "application/pdf",
    base64: response.buffer.toString("base64"),
  },
};
Example: Form Data Upload
const formData = new FormData();
formData.append("file", data.input.file.base64, data.input.file.fileName);
formData.append("description", "Uploaded via Langdock");

const options = {
  method: "POST",
  url: "https://api.example.com/upload",
  headers: {
    Authorization: `Bearer ${data.auth.access_token}`,
  },
  body: formData,
};

const response = await ld.request(options);
return response.json;
The body parameter is automatically stringified if you pass an object. For application/x-www-form-urlencoded content type, the body is automatically converted to the appropriate format.
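For example, a minimal sketch of a token refresh request using application/x-www-form-urlencoded; the endpoint and field names are illustrative assumptions:
// Minimal sketch: with a form-urlencoded content type, an object body is
// converted to key=value pairs automatically. URL and fields are illustrative.
const response = await ld.request({
  method: "POST",
  url: "https://api.example.com/oauth/token",
  headers: {
    "Content-Type": "application/x-www-form-urlencoded",
  },
  body: {
    grant_type: "refresh_token",
    refresh_token: data.auth.refresh_token,
    client_id: data.auth.client_id,
    client_secret: data.auth.client_secret,
  },
});

return response.json;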

ld.awsRequest()

Make AWS SigV4-signed requests to AWS services like S3, API Gateway, or custom AWS APIs. Parameters:
{
  method: string;           // HTTP method
  url: string;              // AWS service URL
  headers?: object;         // Additional headers
  body?: object | string;   // Request body
  region: string;           // AWS region (e.g., 'us-east-1')
  service: string;          // AWS service name (e.g., 's3', 'execute-api')
  credentials: {            // AWS credentials
    accessKeyId: string;      // AWS access key ID
    secretAccessKey: string;  // AWS secret access key
    sessionToken?: string;    // AWS session token (for temporary credentials)
  }
}
Example: S3 File Upload
const options = {
  method: "PUT",
  url: `https://my-bucket.s3.us-east-1.amazonaws.com/${data.input.fileName}`,
  headers: {
    "Content-Type": data.input.mimeType,
  },
  body: Buffer.from(data.input.file.base64, "base64"),
  region: "us-east-1",
  service: "s3",
  credentials: {
    accessKeyId: data.auth.aws_access_key_id,
    secretAccessKey: data.auth.aws_secret_access_key,
  },
};

const response = await ld.awsRequest(options);
return {
  success: true,
  url: options.url,
};
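Example: API Gateway with Temporary Credentials
If you are working with temporary credentials (for example from AWS STS), pass sessionToken alongside the access key pair. A minimal sketch against an API Gateway endpoint; the URL and auth field names are illustrative assumptions:
// Minimal sketch: SigV4-signed GET against an API Gateway endpoint using
// temporary credentials. URL and auth field names are illustrative.
const response = await ld.awsRequest({
  method: "GET",
  url: "https://abc123.execute-api.eu-central-1.amazonaws.com/prod/items",
  region: "eu-central-1",
  service: "execute-api",
  credentials: {
    accessKeyId: data.auth.aws_access_key_id,
    secretAccessKey: data.auth.aws_secret_access_key,
    sessionToken: data.auth.aws_session_token, // only needed for temporary credentials
  },
});

return response.json;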

Data Format Conversions

ld.csv2parquet()

Convert CSV text to Parquet format with optional compression and array support. Parameters:
{
  csvText: string;          // CSV data as text
  compression?: string;     // 'gzip', 'snappy', 'brotli', 'lz4', 'zstd' (default), or 'uncompressed'
}
Returns: { base64: string, success: boolean }
Example:
const csvText = `name,age,skills
Alice,30,"[Python,JavaScript]"
Bob,25,"[Java,Go]"`;

const result = await ld.csv2parquet(csvText, {
  compression: "gzip",
});

return {
  files: {
    fileName: "data.parquet",
    mimeType: "application/vnd.apache.parquet",
    base64: result.base64,
  },
};
CSV columns containing array-like strings (e.g., "[Python,JavaScript]") are automatically detected and converted to Parquet List columns.

ld.parquet2csv()

Convert Parquet format to CSV text, handling List columns appropriately. Parameters:
base64Parquet: string;    // Base64-encoded Parquet file
Returns: { base64: string, success: boolean }
Example:
const parquetBase64 = data.input.parquetFile.base64;
const result = await ld.parquet2csv(parquetBase64);

return {
  files: {
    fileName: "data.csv",
    mimeType: "text/csv",
    base64: result.base64,
  },
};

ld.arrow2parquet()

Convert Arrow IPC Stream format to Parquet. Parameters:
{
  buffer: Buffer;           // Arrow IPC Stream buffer
  compression?: string;     // Compression type (same as csv2parquet)
}
Returns: Base64-encoded Parquet file
Example:
const arrowBuffer = Buffer.from(data.input.arrowFile.base64, "base64");

const parquetBase64 = await ld.arrow2parquet(arrowBuffer, {
  compression: "snappy",
});

return {
  files: {
    fileName: "data.parquet",
    mimeType: "application/vnd.apache.parquet",
    base64: parquetBase64,
  },
};

ld.json2csv()

Convert JSON data to CSV format using the nodejs-polars library. Parameters:
jsonData: Array<object>;  // Array of objects to convert
Returns: CSV text
Example:
const users = [
  { name: "Alice", email: "[email protected]", age: 30 },
  { name: "Bob", email: "[email protected]", age: 25 },
  { name: "Charlie", email: "[email protected]", age: 35 },
];

const csvText = await ld.json2csv(users);

return {
  files: {
    fileName: "users.csv",
    mimeType: "text/csv",
    text: csvText,
  },
};

Database & SQL

ld.validateSqlQuery()

Validate SQL query syntax to ensure it’s non-empty and contains a single statement. Parameters:
query: string;            // SQL query to validate
Returns: true if valid, throws error otherwise
Example:
const query = data.input.sqlQuery;

try {
  ld.validateSqlQuery(query);

  // Query is valid, proceed with execution
  const response = await ld.request({
    method: "POST",
    url: "https://api.example.com/query",
    headers: {
      Authorization: `Bearer ${data.auth.access_token}`,
    },
    body: { query },
  });

  return response.json;
} catch (error) {
  return {
    error: `Invalid SQL query: ${error.message}`,
  };
}

ld.ensureReadOnlySqlQuery()

Enforce that a SQL query is read-only by checking its execution type. Parameters:
query: string;            // SQL query to validate
Returns: true if read-only, throws error otherwise
Example:
const query = data.input.sqlQuery;

try {
  ld.ensureReadOnlySqlQuery(query);

  // Query is read-only, safe to execute
  const response = await ld.request({
    method: "POST",
    url: `${data.auth.database_url}/query`,
    headers: {
      Authorization: `Bearer ${data.auth.access_token}`,
    },
    body: { query },
  });

  return response.json.results;
} catch (error) {
  return {
    error: "Only read-only queries (SELECT) are allowed",
  };
}
This function checks the query execution type to ensure it’s LISTING or INFORMATION only. Queries that modify data (INSERT, UPDATE, DELETE) will be rejected.

Microsoft XMLA (Power BI)

ld.microsoftXMLA.connect()

Establish a connection session to Microsoft Power BI XMLA endpoint. Parameters:
{
  accessToken: string;      // Azure AD access token with Power BI API scope
  datasetName: string;      // Dataset name (not ID)
  groupId?: string;         // Optional workspace/group ID (omit for My Workspace)
}
Returns: Session object with sessionId
The Azure region is automatically detected from the Power BI API responses. You do not need to specify it manually.
Example:
const session = await ld.microsoftXMLA.connect({
  accessToken: data.auth.access_token,
  datasetName: data.input.dataset_name,
  groupId: data.auth.workspace_id, // Optional - omit for My Workspace
});

// Store session ID for subsequent queries
return {
  sessionId: session.sessionId,
  message: "Connected successfully",
};

ld.microsoftXMLA.query()

Execute a DAX or MDX query against an established XMLA session. Function Signature:
ld.microsoftXMLA.query(session, query)
Parameters:
  • session (object): Session object from connect()
  • query (string): DAX or MDX query
Returns: Query results as array of objects
Example:
// First, establish connection
const session = await ld.microsoftXMLA.connect({
  accessToken: data.auth.access_token,
  datasetName: data.input.dataset_name,
  groupId: data.auth.workspace_id, // Optional - omit for My Workspace
});

// Execute query
const results = await ld.microsoftXMLA.query(session, `
  EVALUATE
  SUMMARIZE(
    Sales,
    Sales[ProductCategory],
    "TotalSales", SUM(Sales[Amount])
  )
`);

return {
  data: results,
  rowCount: results.length,
};
The XMLA connection uses a 6-step protocol to establish sessions with Power BI datasets. The session is automatically managed for you.

Cryptography

ld.signWithRS256()

Create RSA-SHA256 digital signatures, commonly used for JWT signing and OAuth flows. Function Signature:
ld.signWithRS256(data, privateKey, options)
Parameters:
  • data (string): Data to sign
  • privateKey (string): PEM-formatted RSA private key
  • options (object, optional):
    • encoding (string): Output encoding - 'base64' (default) or 'hex'
Returns: Signature string in specified encoding
Example: JWT Signing
const header = {
  alg: "RS256",
  typ: "JWT",
};

const payload = {
  iss: "your-client-id",
  sub: "[email protected]",
  aud: "https://api.example.com",
  exp: Math.floor(Date.now() / 1000) + 3600, // 1 hour
  iat: Math.floor(Date.now() / 1000),
};

// JWTs use base64url encoding: no padding, '-' and '_' instead of '+' and '/'
const toBase64Url = (base64) =>
  base64.replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");

const encodedHeader = toBase64Url(btoa(JSON.stringify(header)));
const encodedPayload = toBase64Url(btoa(JSON.stringify(payload)));
const signingInput = `${encodedHeader}.${encodedPayload}`;

const signature = ld.signWithRS256(signingInput, data.auth.private_key, {
  encoding: "base64",
});

const jwt = `${signingInput}.${toBase64Url(signature)}`;

// Use JWT for authentication
const response = await ld.request({
  method: "POST",
  url: "https://api.example.com/auth/token",
  body: {
    grant_type: "urn:ietf:params:oauth:grant-type:jwt-bearer",
    assertion: jwt,
  },
});

return response.json;
The private key must be in PEM format. Invalid keys will throw user-friendly error messages. Never expose private keys in logs or return values.
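If an API expects a hex digest instead of base64, pass encoding: 'hex'. A minimal sketch of signing a request payload; the payload shape and signature header name are illustrative assumptions:
// Minimal sketch: hex-encoded RS256 signature sent alongside a request.
// The payload shape and the X-Signature header name are illustrative.
const payload = JSON.stringify({
  orderId: data.input.orderId,
  amount: data.input.amount,
});

const signature = ld.signWithRS256(payload, data.auth.private_key, {
  encoding: "hex",
});

const response = await ld.request({
  method: "POST",
  url: "https://api.example.com/orders",
  headers: {
    "Content-Type": "application/json",
    "X-Signature": signature, // illustrative header name
  },
  body: payload,
});

return response.json;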

Utility Functions

ld.log()

Output debugging information to the execution logs, visible below the “Test Action” button. Parameters:
...args: any[];           // Any number of values to log
Example:
ld.log("Starting ticket creation");
ld.log("Input data:", data.input);

const response = await ld.request(options);

ld.log("Response status:", response.status);
ld.log("Ticket created:", response.json);

return response.json;
Use ld.log() liberally during development to debug your integration code. Logs are automatically redacted to hide sensitive authentication values.

atob() / btoa()

Base64 encoding and decoding functions, available globally without imports.
  • atob() - Decode base64 string to binary string
  • btoa() - Encode binary string to base64
Example: Base64 URL Decoding
function base64UrlDecode(base64Url) {
  // Convert base64url to standard base64
  const base64 = base64Url.replace(/-/g, "+").replace(/_/g, "/");
  return atob(base64);
}

const token = data.input.jwt_token;
const [header, payload, signature] = token.split(".");

const decodedPayload = JSON.parse(base64UrlDecode(payload));
ld.log("Token payload:", decodedPayload);

return decodedPayload;
Example: Basic Authentication
const credentials = btoa(`${data.auth.username}:${data.auth.password}`);

const response = await ld.request({
  method: "GET",
  url: "https://api.example.com/data",
  headers: {
    Authorization: `Basic ${credentials}`,
  },
});

return response.json;

Buffer.from()

Create Buffer objects for binary data handling.
Example: Binary to Base64
const binaryData = Buffer.from(data.input.text, "utf-8");
const base64Data = binaryData.toString("base64");

return {
  base64: base64Data,
};
Example: Base64 to Binary
const buffer = Buffer.from(data.input.file.base64, "base64");

const response = await ld.request({
  method: "PUT",
  url: "https://api.example.com/upload",
  headers: {
    "Content-Type": data.input.file.mimeType,
  },
  body: buffer,
});

return response.json;

FormData

Create multipart form data for file uploads and complex request bodies.
Example: File Upload with Metadata
const formData = new FormData();
formData.append("file", data.input.file.base64, data.input.file.fileName);
formData.append("title", data.input.title);
formData.append("category", data.input.category);
formData.append("tags", JSON.stringify(data.input.tags));

const response = await ld.request({
  method: "POST",
  url: "https://api.example.com/documents",
  headers: {
    Authorization: `Bearer ${data.auth.access_token}`,
  },
  body: formData,
});

return response.json;

Standard JavaScript APIs

The sandbox also provides access to standard JavaScript built-ins:
JSON
  • JSON.stringify() - Convert objects to JSON strings
  • JSON.parse() - Parse JSON strings to objects
Date
  • new Date() - Create date objects
  • Date.now() - Get current timestamp
  • All standard Date methods
Math
  • Math.floor(), Math.ceil(), Math.round()
  • Math.random(), Math.max(), Math.min()
  • All standard Math methods
RegExp
  • new RegExp() - Create regular expressions
  • String regex methods: match(), replace(), test()
Array & Object
  • All standard Array methods: map(), filter(), reduce(), etc.
  • All standard Object methods: keys(), values(), entries(), etc.
Example: Data Transformation
const users = data.input.users;

// Filter active users
const activeUsers = users.filter((user) => user.status === "active");

// Transform to required format
const formatted = activeUsers.map((user) => ({
  id: user.id,
  name: `${user.firstName} ${user.lastName}`,
  email: user.email.toLowerCase(),
  joinedDate: new Date(user.createdAt).toISOString().split("T")[0],
}));

// Sort by join date
formatted.sort((a, b) => new Date(b.joinedDate) - new Date(a.joinedDate));

return {
  users: formatted,
  total: formatted.length,
};

Best Practices

Error Handling

Always wrap API calls in try-catch blocks and provide helpful error messages:
try {
  const response = await ld.request(options);
  return response.json;
} catch (error) {
  ld.log("Error details:", error.message, error.stack);
  return {
    error: `Failed to fetch data: ${error.message}`,
  };
}

Input Validation

Validate user inputs before using them:
// Validate required fields
if (!data.input.email || !data.input.name) {
  return {
    error: "Missing required fields: email and name are required",
  };
}

// Validate email format
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (!emailRegex.test(data.input.email)) {
  return {
    error: "Invalid email format",
  };
}

// Validate file type
const allowedTypes = ["application/pdf", "image/jpeg", "image/png"];
if (
  data.input.file &&
  !allowedTypes.includes(data.input.file.mimeType)
) {
  return {
    error: `Unsupported file type. Allowed: ${allowedTypes.join(", ")}`,
  };
}

Performance Tips

Minimize API Calls
// Bad: Multiple sequential calls
const user = await ld.request({ url: "/users/123" });
const posts = await ld.request({ url: `/users/123/posts` });
const comments = await ld.request({ url: `/users/123/comments` });

// Good: Use batch endpoints when available
const response = await ld.request({
  url: "/users/123?include=posts,comments",
});
Use Pagination
const allResults = [];
let page = 1;
const pageSize = 100;

while (true) {
  const response = await ld.request({
    url: "https://api.example.com/data",
    params: {
      page,
      pageSize,
    },
  });

  allResults.push(...response.json.items);

  if (response.json.items.length < pageSize) {
    break; // No more pages
  }

  page++;
}

return allResults;

Security Considerations

Never Hardcode Secrets
// Bad: Hardcoded API key
const apiKey = "sk_live_abc123";

// Good: Use authentication fields
const apiKey = data.auth.api_key;
Sanitize User Input
// Sanitize SQL-like input
const searchTerm = data.input.search.replace(/['"]/g, "");

// Validate URL parameters
const itemId = data.input.itemId.replace(/[^a-zA-Z0-9-_]/g, "");
Handle Rate Limits
try {
  const response = await ld.request(options);
  return response.json;
} catch (error) {
  if (error.message.includes("429") || error.message.includes("rate limit")) {
    return {
      error: "API rate limit exceeded. Please try again in a few minutes.",
    };
  }
  throw error;
}

Common Pitfalls

1. Not Handling Async/Await Properly
// Bad: Not awaiting promises
const response = ld.request(options); // Returns Promise, not response!
return response.json; // undefined

// Good: Always await
const response = await ld.request(options);
return response.json;
2. Accessing Nested Properties Without Checks
// Bad: Can throw if user or address is undefined
const city = response.json.user.address.city;

// Good: Use optional chaining
const city = response.json?.user?.address?.city || "Unknown";
3. Not Parsing JSON Strings
// Bad: Assuming string is already an object
const properties = data.input.properties; // String: '{"name":"value"}'
properties.name; // undefined

// Good: Parse JSON strings
const properties = JSON.parse(data.input.properties);
properties.name; // "value"
4. Modifying Frozen Objects
// Bad: Sandbox freezes global objects
Array.prototype.customMethod = () => {}; // Error!

// Good: Create new objects
const customArray = [...originalArray];

Next Steps