5,011 changes: 5,011 additions & 0 deletions docs/batch-queue-stress-test-plan.md

Large diffs are not rendered by default.

9 changes: 8 additions & 1 deletion docs/docs.json
@@ -270,6 +270,13 @@
"management/tasks/batch-trigger"
]
},
{
"group": "Batches API",
"pages": [
"management/batches/create",
"management/batches/stream-items"
]
},
{
"group": "Runs API",
"pages": [
@@ -698,4 +705,4 @@
"destination": "/migrating-from-v3"
}
]
}
}
36 changes: 30 additions & 6 deletions docs/limits.mdx
@@ -75,20 +75,44 @@ Additional bundles are available for $10/month per 100 concurrent connections. C

## Task payloads and outputs

| Limit | Details |
| :--------------------- | :-------------------------------------------- |
| Single trigger payload | Must not exceed 3MB |
| Batch trigger payload | The total of all payloads must not exceed 5MB |
| Task outputs | Must not exceed 10MB |
| Limit | Details |
| :--------------------- | :----------------------------------------------------------------- |
| Single trigger payload | Must not exceed 3MB |
| Batch trigger payload | Each item can be up to 3MB (SDK 4.3.1+). Prior: 1MB total combined |
| Task outputs | Must not exceed 10MB |

Payloads and outputs that exceed 512KB will be offloaded to object storage and a presigned URL will be provided to download the data when calling `runs.retrieve`. You don't need to do anything to handle this in your tasks however, as we will transparently upload/download these during operation.

## Batch size

A single batch can have a maximum of 500 items.
A single batch can have a maximum of 1,000 items with SDK 4.3.1+. Prior versions are limited to 500 items.

<SoftLimit />
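
If you have more items than a single batch allows, one common approach is to slice the list into chunks before triggering. Here is a minimal sketch, assuming a hypothetical `myTask` task and the v3 SDK's `batchTrigger()` item shape (`{ payload }`):

```ts
import { myTask } from "./trigger/my-task"; // hypothetical task definition

// Keep each batch within the documented maximum (1,000 items with SDK 4.3.1+,
// 500 on earlier versions).
const MAX_BATCH_ITEMS = 1_000;

export async function triggerInChunks(payloads: { userId: string }[]) {
  for (let i = 0; i < payloads.length; i += MAX_BATCH_ITEMS) {
    const chunk = payloads.slice(i, i + MAX_BATCH_ITEMS);
    await myTask.batchTrigger(chunk.map((payload) => ({ payload })));
  }
}
```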

## Batch trigger rate limits

Batch triggering uses a token bucket algorithm to rate limit the number of runs you can trigger per environment. Each run in a batch consumes one token.

| Pricing tier | Bucket size | Refill rate |
| :----------- | :---------- | :-------------------- |
| Free | 1,200 runs | 100 runs every 10 sec |
| Hobby | 5,000 runs | 500 runs every 5 sec |
| Pro | 5,000 runs | 500 runs every 5 sec |

**How it works**: You can burst up to your bucket size, then tokens refill at the specified rate. For example, a Free user can trigger 1,200 runs immediately, then must wait for tokens to refill (100 runs become available every 10 seconds).
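
As a rough illustration (not the server's implementation), the bucket math can be modeled like this:

```ts
// Illustrative model of the token bucket described above; the tier numbers come
// from the table, but the function itself is only a sketch.
type Bucket = { size: number; refillAmount: number; refillIntervalMs: number };

const freeTier: Bucket = { size: 1_200, refillAmount: 100, refillIntervalMs: 10_000 };

// Estimate how long you must wait before `runs` runs can all be accepted,
// given the tokens currently available in the bucket.
function estimateWaitMs(runs: number, available: number, bucket: Bucket): number {
  const deficit = runs - available;
  if (deficit <= 0) return 0; // within the burst capacity
  const refills = Math.ceil(deficit / bucket.refillAmount);
  return refills * bucket.refillIntervalMs;
}

// Example: 1,500 runs on the Free tier with a full bucket (1,200 tokens) leaves
// a deficit of 300 runs, i.e. 3 refills of 100 tokens -> roughly 30 seconds.
console.log(estimateWaitMs(1_500, 1_200, freeTier)); // 30000
```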

## Batch processing concurrency

The number of batches that can be processed concurrently per environment.

| Pricing tier | Limit |
| :----------- | :-------------------- |
| Free | 1 concurrent batch |
| Hobby | 10 concurrent batches |
| Pro | 10 concurrent batches |

This limits how many batches can have their items actively being processed into runs at the same time.

## Log retention

| Pricing tier | Limit |
5 changes: 5 additions & 0 deletions docs/management/batches/create.mdx
@@ -0,0 +1,5 @@
---
title: "Create batch"
openapi: "openapi POST /api/v3/batches"
---
Comment on lines +1 to +4
⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check the openapi reference format used in other management API files
rg -n "^openapi:" docs/management/ -A 0 -C 0

Repository: triggerdotdev/trigger.dev

Length of output: 2504


Fix openapi reference format inconsistency.

The openapi reference in line 3 uses "openapi POST /api/v3/batches", but this differs from the standard format used across all other management API documentation files, which use "v3-openapi POST ...". Update this line to align with the codebase convention: "v3-openapi POST /api/v3/batches". Note that docs/management/batches/stream-items.mdx has the same inconsistency.

🤖 Prompt for AI Agents
In docs/management/batches/create.mdx around lines 1 to 4, the frontmatter
openapi tag uses "openapi POST /api/v3/batches", which is inconsistent with the
project convention; replace that value with "v3-openapi POST /api/v3/batches" to
match the other management API docs. While here, also update
docs/management/batches/stream-items.mdx, which uses the same "openapi ..."
prefix, to "v3-openapi POST /api/v3/batches/{batchId}/items".


5 changes: 5 additions & 0 deletions docs/management/batches/stream-items.mdx
@@ -0,0 +1,5 @@
---
title: "Stream batch items"
openapi: "openapi POST /api/v3/batches/{batchId}/items"
---

238 changes: 238 additions & 0 deletions docs/openapi.yml
@@ -43,6 +43,159 @@ paths:
schema:
$ref: "#/components/schemas/Error"

/api/v3/batches:
post:
operationId: createBatch
externalDocs:
description: Find more info here
url: "https://trigger.dev/docs/triggering"
tags:
- Batches
summary: Create a batch (Phase 1)
description: |
Phase 1 of 2-phase batch API. Creates a batch record and optionally blocks the parent run for batchTriggerAndWait.
After creating a batch, stream items via POST /api/v3/batches/{batchId}/items.
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/CreateBatchRequest"
responses:
"202":
description: Batch successfully created
content:
application/json:
schema:
$ref: "#/components/schemas/CreateBatchResponse"
headers:
x-trigger-jwt-claims:
description: JWT claims for the batch
schema:
type: string
x-trigger-jwt:
description: JWT token for browser clients
schema:
type: string
"400":
description: Invalid request (e.g., runCount <= 0 or exceeds maximum)
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"401":
description: Unauthorized - API key is missing or invalid
"422":
description: Validation error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"429":
description: Rate limit exceeded
headers:
X-RateLimit-Limit:
description: Maximum number of requests allowed
schema:
type: integer
X-RateLimit-Remaining:
description: Number of requests remaining
schema:
type: integer
X-RateLimit-Reset:
description: Unix timestamp when the rate limit resets
schema:
type: integer
Retry-After:
description: Seconds to wait before retrying
schema:
type: integer
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"500":
description: Internal server error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"

/api/v3/batches/{batchId}/items:
post:
operationId: streamBatchItems
externalDocs:
description: Find more info here
url: "https://trigger.dev/docs/triggering"
tags:
- Batches
summary: Stream batch items (Phase 2)
description: |
Phase 2 of 2-phase batch API. Accepts an NDJSON stream of batch items and enqueues them.
Each line in the body should be a valid BatchItemNDJSON object.
The stream is processed with backpressure - items are enqueued as they arrive.
The batch is sealed when the stream completes successfully.
parameters:
- name: batchId
in: path
required: true
description: The batch ID returned from POST /api/v3/batches
schema:
type: string
requestBody:
required: true
content:
application/x-ndjson:
schema:
type: string
description: |
NDJSON (newline-delimited JSON) stream where each line is a BatchItemNDJSON object.
Example:
{"index":0,"task":"my-task","payload":{"key":"value1"}}
{"index":1,"task":"my-task","payload":{"key":"value2"}}
application/ndjson:
schema:
type: string
description: |
NDJSON (newline-delimited JSON) stream where each line is a BatchItemNDJSON object.
responses:
"200":
description: Items successfully processed
content:
application/json:
schema:
$ref: "#/components/schemas/StreamBatchItemsResponse"
"400":
description: Invalid request (e.g., invalid JSON, item exceeds maximum size)
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"401":
description: Unauthorized - API key is missing or invalid
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"415":
description: Unsupported Media Type - Content-Type must be application/x-ndjson or application/ndjson
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"422":
description: Validation error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
"500":
description: Internal server error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"

components:
schemas:
Error:
@@ -130,6 +283,91 @@
type: object
additionalProperties: true
description: A JSON object that represents the deserialized payload or context.
CreateBatchRequest:
type: object
required:
- runCount
properties:
runCount:
type: integer
minimum: 1
description: Expected number of items in the batch. Must be a positive integer.
parentRunId:
type: string
description: Parent run ID (friendly ID) for batchTriggerAndWait.
resumeParentOnCompletion:
type: boolean
description: Whether to resume parent on completion. Set to true for batchTriggerAndWait.
idempotencyKey:
type: string
description: Idempotency key for the batch. If provided and a batch with this key already exists, the existing batch will be returned.
CreateBatchResponse:
type: object
required:
- id
- runCount
- isCached
properties:
id:
type: string
description: The batch ID (friendly ID). Use this to stream items via POST /api/v3/batches/{batchId}/items.
runCount:
type: integer
description: The expected run count.
isCached:
type: boolean
description: Whether this response came from a cached/idempotent batch.
idempotencyKey:
type: string
description: The idempotency key if provided.
BatchItemNDJSON:
type: object
required:
- index
- task
properties:
index:
type: integer
minimum: 0
description: Zero-based index of this item. Used for idempotency and ordering.
task:
type: string
description: The task identifier to trigger.
payload:
description: The payload for this task run. Can be any JSON value.
options:
type: object
additionalProperties: true
description: Options for this specific item.
StreamBatchItemsResponse:
type: object
required:
- id
- itemsAccepted
- itemsDeduplicated
- sealed
properties:
id:
type: string
description: The batch ID.
itemsAccepted:
type: integer
description: Number of items successfully accepted.
itemsDeduplicated:
type: integer
description: Number of items that were deduplicated (already enqueued).
sealed:
type: boolean
description: |
Whether the batch was sealed and is ready for processing.
If false, the batch needs more items before processing can start.
Clients should check this field and retry with missing items if needed.
enqueuedCount:
type: integer
description: Total items currently enqueued. Only present when sealed=false to help with retries.
expectedCount:
type: integer
description: Expected total item count. Only present when sealed=false to help with retries.
securitySchemes:
BearerAuth:
type: http
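Taken together, the two endpoints above describe a create-then-stream flow. The following is a hedged sketch of a raw HTTP client for that flow (normally the SDK performs these calls for you); it assumes a global `fetch`, a `TRIGGER_SECRET_KEY` API key, and only the fields shown in the schemas above:

```ts
// Illustrative only: a direct client for the 2-phase batch API defined in openapi.yml.
// The SDK handles this internally; field names follow the schemas above.
const API_URL = "https://api.trigger.dev";
const headers = { Authorization: `Bearer ${process.env.TRIGGER_SECRET_KEY}` };

export async function createAndStreamBatch(items: { task: string; payload: unknown }[]) {
  // Phase 1: create the batch record with the expected run count.
  const createRes = await fetch(`${API_URL}/api/v3/batches`, {
    method: "POST",
    headers: { ...headers, "Content-Type": "application/json" },
    body: JSON.stringify({ runCount: items.length }),
  });
  if (!createRes.ok) throw new Error(`Create batch failed: ${createRes.status}`);
  const { id } = (await createRes.json()) as { id: string };

  // Phase 2: stream the items as NDJSON, one BatchItemNDJSON object per line.
  const ndjson = items
    .map((item, index) => JSON.stringify({ index, task: item.task, payload: item.payload }))
    .join("\n");
  const streamRes = await fetch(`${API_URL}/api/v3/batches/${id}/items`, {
    method: "POST",
    headers: { ...headers, "Content-Type": "application/x-ndjson" },
    body: ndjson,
  });
  const result = (await streamRes.json()) as { sealed: boolean; itemsAccepted: number };

  // A non-sealed response includes enqueuedCount/expectedCount so a client can
  // retry with the missing items (retry logic omitted here).
  if (!result.sealed) throw new Error("Batch was not sealed; retry missing items");

  return { batchId: id, itemsAccepted: result.itemsAccepted };
}
```
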
4 changes: 3 additions & 1 deletion docs/self-hosting/env/webapp.mdx
@@ -101,7 +101,9 @@ mode: "wide"
| `TASK_PAYLOAD_MAXIMUM_SIZE` | No | 3145728 (3MB) | Max task payload size. |
| `BATCH_TASK_PAYLOAD_MAXIMUM_SIZE` | No | 1000000 (1MB) | Max batch payload size. |
| `TASK_RUN_METADATA_MAXIMUM_SIZE` | No | 262144 (256KB) | Max metadata size. |
| `MAX_BATCH_V2_TRIGGER_ITEMS` | No | 500 | Max batch size. |
| `MAX_BATCH_V2_TRIGGER_ITEMS` | No | 500 | Max batch size (legacy v2 API). |
| `STREAMING_BATCH_MAX_ITEMS` | No | 1000 | Max items in streaming batch (v3 API, requires SDK 4.3.1+). |
| `STREAMING_BATCH_ITEM_MAXIMUM_SIZE` | No | 3145728 (3MB) | Max size per item in streaming batch. |
| `MAXIMUM_DEV_QUEUE_SIZE` | No | — | Max dev queue size. |
| `MAXIMUM_DEPLOYED_QUEUE_SIZE` | No | — | Max deployed queue size. |
| **OTel limits** | | | |
2 changes: 1 addition & 1 deletion docs/snippets/rate-limit-hit-use-batchtrigger.mdx
Original file line number Diff line number Diff line change
@@ -1 +1 @@
The most common cause of hitting the API rate limit is if youre calling `trigger()` on a task in a loop, instead of doing this use `batchTrigger()` which will trigger multiple tasks in a single API call. You can have up to 500 tasks in a single batch trigger call.
The most common cause of hitting the API rate limit is calling `trigger()` on a task in a loop. Instead, use `batchTrigger()`, which triggers multiple tasks in a single API call. You can have up to 1,000 tasks in a single batch trigger call with SDK 4.3.1+ (500 in prior versions).
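
For example (a hypothetical `emailTask` and payload, using the SDK's `batchTrigger()` item shape):

```ts
import { emailTask } from "./trigger/email"; // hypothetical task definition

const users = [{ email: "a@example.com" }, { email: "b@example.com" }];

// Instead of calling emailTask.trigger() once per user in a loop (one API call
// each), pass every payload to a single batchTrigger() call.
await emailTask.batchTrigger(users.map((user) => ({ payload: { to: user.email } })));
```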