| title | Realtime Streams |
|---|---|
| sidebarTitle | Streams |
| description | Stream data in realtime from your Trigger.dev tasks to your frontend or backend applications. |
Realtime Streams allow you to pipe streaming data from your Trigger.dev tasks to your frontend or backend applications in real-time. This is perfect for use cases like streaming AI completions, progress updates, or any continuous data flow.
Streams v2 requires SDK version **4.1.0 or later**. Make sure to upgrade your `@trigger.dev/sdk` and `@trigger.dev/react-hooks` packages to use these features. If you're on an earlier version, see the [metadata.stream()](/runs/metadata#stream) documentation.

Streams v2 is a major upgrade that provides:
- Unlimited stream length (previously capped at 2000 chunks)
- Unlimited active streams per run (previously 5)
- Improved reliability with automatic resumption on connection loss
- 28-day stream retention (previously 1 day)
- Multiple client streams can pipe to a single stream
- Enhanced dashboard visibility for viewing stream data in real-time
Streams v2 is automatically enabled when triggering runs from the SDK at version 4.1.0 or later. If you aren't triggering via the SDK, you'll need to explicitly enable v2 streams by setting the `x-trigger-realtime-streams-version: v2` header when triggering the task.
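For example, if you trigger tasks over raw HTTP, you can attach the header to the request yourself. This is a minimal sketch; the endpoint path, auth scheme, and body shape shown here are assumptions based on the Trigger.dev REST API, so check the API reference before relying on them:

```typescript
// Sketch: triggering a task over raw HTTP with Streams v2 enabled.
// ASSUMPTIONS: the /api/v1/tasks/:id/trigger path, Bearer auth, and the
// { payload } body shape are illustrative, not confirmed by this page.
function buildTriggerRequest(taskId: string, payload: unknown, secretKey: string) {
  return {
    url: `https://api.trigger.dev/api/v1/tasks/${taskId}/trigger`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${secretKey}`,
        "Content-Type": "application/json",
        // Opt in to Streams v2 when not triggering through the SDK
        "x-trigger-realtime-streams-version": "v2",
      },
      body: JSON.stringify({ payload }),
    },
  };
}

// Usage:
// const { url, init } = buildTriggerRequest("my-task", { prompt: "hi" }, secretKey);
// await fetch(url, init);
```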
If you'd like to opt out of v2 streams, you can do so in one of the following two ways:
Call `auth.configure` in your backend code:

```ts
import { auth } from "@trigger.dev/sdk";

auth.configure({
  future: {
    v2RealtimeStreams: false,
  },
});
```

Or set the `TRIGGER_V2_REALTIME_STREAMS=0` environment variable in your backend code (where you trigger tasks).
| Limit | Streams v1 | Streams v2 |
|---|---|---|
| Maximum stream length | 2000 chunks | Unlimited |
| Number of active streams per run | 5 | Unlimited |
| Maximum streams per run | 10 | Unlimited |
| Maximum stream TTL | 1 day | 28 days |
| Maximum stream size | 10 MB | 300 MiB |
The recommended workflow for using Realtime Streams v2:
- Define your streams in a shared location using `streams.define()`
- Use the defined stream in your tasks with `.pipe()`, `.append()`, or `.writer()`
- Read from the stream using `.read()` or the `useRealtimeStream` hook in React
This approach gives you full type safety, better code organization, and easier maintenance as your application grows.
The recommended way to work with streams is to define them once with `streams.define()`. This allows you to specify the chunk type and stream ID in one place, then reuse that definition throughout your codebase with full type safety.
Define your streams in a shared location (like `app/streams.ts` or `trigger/streams.ts`):
```ts
import { streams, InferStreamType } from "@trigger.dev/sdk";

// Define a stream with a specific type
export const aiStream = streams.define<string>({
  id: "ai-output",
});

// Export the type for use in frontend components
export type AIStreamPart = InferStreamType<typeof aiStream>;
```

You can define streams for any JSON-serializable type:
```ts
import { streams, InferStreamType } from "@trigger.dev/sdk";
import { UIMessageChunk } from "ai";

// Stream for AI UI message chunks
export const aiStream = streams.define<UIMessageChunk>({
  id: "ai",
});

// Stream for progress updates
export const progressStream = streams.define<{ step: string; percent: number }>({
  id: "progress",
});

// Stream for simple text
export const logStream = streams.define<string>({
  id: "logs",
});

// Export types
export type AIStreamPart = InferStreamType<typeof aiStream>;
export type ProgressStreamPart = InferStreamType<typeof progressStream>;
export type LogStreamPart = InferStreamType<typeof logStream>;
```

Once defined, you can use all stream methods on your defined stream:
```ts
import { task } from "@trigger.dev/sdk";
import { aiStream } from "./streams";

export const streamTask = task({
  id: "stream-task",
  run: async (payload: { prompt: string }) => {
    // Get a stream from an AI service, database, etc.
    const stream = await getAIStream(payload.prompt);

    // Pipe the stream using your defined stream
    const { stream: readableStream, waitUntilComplete } = aiStream.pipe(stream);

    // Option A: Iterate over the stream locally
    for await (const chunk of readableStream) {
      console.log("Received chunk:", chunk);
    }

    // Option B: Wait for the stream to complete
    await waitUntilComplete();

    return { message: "Stream completed" };
  },
});
```

Use the defined stream's `read()` method to consume data from anywhere (frontend, backend, or another task):
```ts
import { aiStream } from "./streams";

const stream = await aiStream.read(runId);

for await (const chunk of stream) {
  console.log(chunk); // chunk is typed as the stream's chunk type
}
```

With options:
```ts
const stream = await aiStream.read(runId, {
  timeoutInSeconds: 60, // Stop if no data for 60 seconds
  startIndex: 10, // Start from the 10th chunk
});
```

Use the defined stream's `append()` method to add a single chunk:
```ts
import { task } from "@trigger.dev/sdk";
import { aiStream, progressStream, logStream } from "./streams";

export const appendTask = task({
  id: "append-task",
  run: async (payload) => {
    // Append to different streams with full type safety
    await logStream.append("Processing started");
    await progressStream.append({ step: "Initialization", percent: 0 });

    // Do some work...
    await progressStream.append({ step: "Processing", percent: 50 });
    await logStream.append("Step 1 complete");

    // Do more work...
    await progressStream.append({ step: "Complete", percent: 100 });
    await logStream.append("All steps complete");
  },
});
```

Use the defined stream's `writer()` method for more complex stream writing:
```ts
import { task } from "@trigger.dev/sdk";
import { logStream } from "./streams";

export const writerTask = task({
  id: "writer-task",
  run: async (payload) => {
    const { waitUntilComplete } = logStream.writer({
      execute: ({ write, merge }) => {
        // Write individual chunks
        write("Chunk 1");
        write("Chunk 2");

        // Merge another stream
        const additionalStream = ReadableStream.from(["Chunk 3", "Chunk 4", "Chunk 5"]);
        merge(additionalStream);
      },
    });

    await waitUntilComplete();
  },
});
```

Defined streams work seamlessly with the `useRealtimeStream` hook:
```tsx
"use client";

import { useRealtimeStream } from "@trigger.dev/react-hooks";
import { aiStream } from "@/app/streams";

export function StreamViewer({ accessToken, runId }: { accessToken: string; runId: string }) {
  // Pass the defined stream directly - full type safety!
  const { parts, error } = useRealtimeStream(aiStream, runId, {
    accessToken,
    timeoutInSeconds: 600,
  });

  if (error) return <div>Error: {error.message}</div>;
  if (!parts) return <div>Loading...</div>;

  return (
    <div>
      {parts.map((part, i) => (
        <span key={i}>{part}</span>
      ))}
    </div>
  );
}
```

If you have a specific reason to avoid defined streams, you can use stream methods directly by specifying the stream key each time.
```ts
import { streams, task } from "@trigger.dev/sdk";

export const directStreamTask = task({
  id: "direct-stream",
  run: async (payload: { prompt: string }) => {
    const stream = await getAIStream(payload.prompt);

    // Specify the stream key directly
    const { stream: readableStream, waitUntilComplete } = streams.pipe("ai-output", stream);

    await waitUntilComplete();
  },
});
```

```ts
import { streams } from "@trigger.dev/sdk";

// Specify the stream key when reading
const stream = await streams.read(runId, "ai-output");

for await (const chunk of stream) {
  console.log(chunk);
}
```

```ts
import { streams, task } from "@trigger.dev/sdk";

export const directAppendTask = task({
  id: "direct-append",
  run: async (payload) => {
    // Specify the stream key each time
    await streams.append("logs", "Processing started");
    await streams.append("progress", "50%");
    await streams.append("logs", "Complete");
  },
});
```

```ts
import { streams, task } from "@trigger.dev/sdk";

export const directWriterTask = task({
  id: "direct-writer",
  run: async (payload) => {
    const { waitUntilComplete } = streams.writer("output", {
      execute: ({ write, merge }) => {
        write("Chunk 1");
        write("Chunk 2");
      },
    });

    await waitUntilComplete();
  },
});
```

Every run has a "default" stream, allowing you to skip the stream key entirely. This is useful for simple cases where you only need one stream per run.
Using direct methods:
```ts
import { streams, task } from "@trigger.dev/sdk";

export const defaultStreamTask = task({
  id: "default-stream",
  run: async (payload) => {
    const stream = getDataStream();

    // No stream key needed - uses "default"
    const { waitUntilComplete } = streams.pipe(stream);

    await waitUntilComplete();
  },
});

// Reading from the default stream
const readStream = await streams.read(runId);
```

You can pipe streams to parent, root, or any other run using the `target` option. This works with both defined streams and direct methods.
```ts
import { task } from "@trigger.dev/sdk";
import { logStream } from "./streams";

export const childTask = task({
  id: "child-task",
  run: async (payload, { ctx }) => {
    const stream = getDataStream();

    // Pipe to parent run
    logStream.pipe(stream, { target: "parent" });

    // Pipe to root run
    logStream.pipe(stream, { target: "root" });

    // Pipe to self (default behavior)
    logStream.pipe(stream, { target: "self" });

    // Pipe to a specific run ID
    logStream.pipe(stream, { target: payload.otherRunId });
  },
});
```

The same targets work with direct methods:

```ts
import { streams, task } from "@trigger.dev/sdk";

export const childTask = task({
  id: "child-task",
  run: async (payload, { ctx }) => {
    const stream = getDataStream();

    // Pipe to parent run
    streams.pipe("output", stream, { target: "parent" });

    // Pipe to root run
    streams.pipe("output", stream, { target: "root" });

    // Pipe to a specific run ID
    streams.pipe("output", stream, { target: payload.otherRunId });
  },
});
```

If you specify a target run ID, you can pipe streams from anywhere (like a Next.js API route):
```ts
import { streams } from "@trigger.dev/sdk";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages, runId } = await req.json();

  const result = streamText({
    model: openai("gpt-4o"),
    messages,
  });

  // Pipe the AI stream to a Trigger.dev run
  const { stream } = streams.pipe("ai-stream", result.toUIMessageStream(), {
    target: runId,
  });

  return new Response(stream as any, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```

Use the `useRealtimeStream` hook to subscribe to streams in your React components.
```tsx
"use client";

import { useRealtimeStream } from "@trigger.dev/react-hooks";
import { aiStream } from "@/app/streams";

export function StreamViewer({ accessToken, runId }: { accessToken: string; runId: string }) {
  // Pass the defined stream directly for full type safety
  const { parts, error } = useRealtimeStream(aiStream, runId, {
    accessToken,
    timeoutInSeconds: 600,
    onData: (chunk) => {
      console.log("New chunk:", chunk); // chunk is typed!
    },
  });

  if (error) return <div>Error: {error.message}</div>;
  if (!parts) return <div>Loading...</div>;

  return (
    <div>
      {parts.map((part, i) => (
        <span key={i}>{part}</span>
      ))}
    </div>
  );
}
```

If you prefer not to use defined streams, you can specify the stream key directly:
```tsx
"use client";

import { useRealtimeStream } from "@trigger.dev/react-hooks";

export function StreamViewer({ accessToken, runId }: { accessToken: string; runId: string }) {
  const { parts, error } = useRealtimeStream<string>(runId, "ai-output", {
    accessToken,
    timeoutInSeconds: 600,
  });

  if (error) return <div>Error: {error.message}</div>;
  if (!parts) return <div>Loading...</div>;

  return (
    <div>
      {parts.map((part, i) => (
        <span key={i}>{part}</span>
      ))}
    </div>
  );
}
```

```tsx
// Omit stream key to use the default stream
const { parts, error } = useRealtimeStream<string>(runId, {
  accessToken,
});
```

All available `useRealtimeStream` options:

```tsx
const { parts, error } = useRealtimeStream(streamDef, runId, {
  accessToken: "pk_...", // Required: Public access token
  baseURL: "https://api.trigger.dev", // Optional: Custom API URL
  timeoutInSeconds: 60, // Optional: Timeout (default: 60)
  startIndex: 0, // Optional: Start from specific chunk
  throttleInMs: 16, // Optional: Throttle updates (default: 16ms)
  onData: (chunk) => {}, // Optional: Callback for each chunk
});
```

Here's a complete example, from stream definition to task to frontend component:

```ts
// app/streams.ts
import { streams, InferStreamType } from "@trigger.dev/sdk";
import { UIMessageChunk } from "ai";

export const aiStream = streams.define<UIMessageChunk>({
  id: "ai",
});

export type AIStreamPart = InferStreamType<typeof aiStream>;
```

```ts
// trigger/ai-task.ts
import { task } from "@trigger.dev/sdk";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { aiStream } from "@/app/streams";

export const generateAI = task({
  id: "generate-ai",
  run: async (payload: { prompt: string }) => {
    const result = streamText({
      model: openai("gpt-4o"),
      prompt: payload.prompt,
    });

    const { waitUntilComplete } = aiStream.pipe(result.toUIMessageStream());
    await waitUntilComplete();

    return { success: true };
  },
});
```

```tsx
// components/ai-stream.tsx
"use client";

import { useRealtimeStream } from "@trigger.dev/react-hooks";
import { aiStream } from "@/app/streams";

export function AIStream({ accessToken, runId }: { accessToken: string; runId: string }) {
  const { parts, error } = useRealtimeStream(aiStream, runId, {
    accessToken,
    timeoutInSeconds: 300,
  });

  if (error) return <div>Error: {error.message}</div>;
  if (!parts) return <div>Loading...</div>;

  return (
    <div className="prose">
      {parts.map((part, i) => (
        <span key={i}>{part}</span>
      ))}
    </div>
  );
}
```

If you're using the old `metadata.stream()` API, here's how to migrate to the recommended v2 approach:
Create a shared streams definition file:
```ts
// app/streams.ts or trigger/streams.ts
import { streams, InferStreamType } from "@trigger.dev/sdk";

export const myStream = streams.define<string>({
  id: "my-stream",
});

export type MyStreamPart = InferStreamType<typeof myStream>;
```

Replace `metadata.stream()` with the defined stream's `pipe()` method:
```ts
// Before (v1)
import { metadata, task } from "@trigger.dev/sdk";

export const myTask = task({
  id: "my-task",
  run: async (payload) => {
    const stream = getDataStream();

    await metadata.stream("my-stream", stream);
  },
});
```

```ts
// After (v2 - Recommended)
import { task } from "@trigger.dev/sdk";
import { myStream } from "./streams";

export const myTask = task({
  id: "my-task",
  run: async (payload) => {
    const stream = getDataStream();

    // Don't await - returns immediately
    const { waitUntilComplete } = myStream.pipe(stream);

    // Optionally wait for completion
    await waitUntilComplete();
  },
});
```

Use the defined stream with `useRealtimeStream`:
```ts
// Before
const { parts, error } = useRealtimeStream<string>(runId, "my-stream", {
  accessToken,
});
```

```ts
// After
import { myStream } from "@/app/streams";

const { parts, error } = useRealtimeStream(myStream, runId, {
  accessToken,
});
```

If you prefer not to use defined streams, you can use direct methods:
```ts
import { streams, task } from "@trigger.dev/sdk";

export const myTask = task({
  id: "my-task",
  run: async (payload) => {
    const stream = getDataStream();

    const { waitUntilComplete } = streams.pipe("my-stream", stream);

    await waitUntilComplete();
  },
});
```

Streams v2 includes automatic reliability improvements:
- Automatic resumption: If a connection is lost, both appending and reading will automatically resume from the last successful chunk
- No data loss: Network issues won't cause stream data to be lost
- Idempotent operations: Duplicate chunks are automatically handled
These improvements happen automatically; no code changes are needed.
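Conceptually, read-side resumption works like a reader that remembers the index of the last chunk it received and reconnects from there, the same idea the SDK exposes publicly through `read()`'s `startIndex` option. Below is a minimal, self-contained sketch of that idea; it is an illustration, not the SDK's actual implementation, and the `ChunkSource` and `resilientRead` names are hypothetical:

```typescript
// Sketch of index-based resumption (NOT the SDK internals): on a connection
// failure, reconnect and continue from the next undelivered chunk index,
// which also prevents duplicate delivery after a retry.
type ChunkSource<T> = (startIndex: number) => AsyncIterable<T>;

async function* resilientRead<T>(
  source: ChunkSource<T>,
  maxRetries = 3
): AsyncGenerator<T> {
  let nextIndex = 0;
  let retries = 0;

  while (true) {
    try {
      for await (const chunk of source(nextIndex)) {
        yield chunk;
        nextIndex++; // remember progress so a retry skips delivered chunks
      }
      return; // source completed normally
    } catch (err) {
      if (++retries > maxRetries) throw err;
      // reconnect and resume from nextIndex on the next loop iteration
    }
  }
}
```

Because each retry resumes from `nextIndex`, chunks that were already delivered are never replayed to the consumer, which is the consumer-visible effect of the "no data loss" and duplicate-handling guarantees listed above.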
Streams are now visible in the Trigger.dev dashboard, allowing you to:
- View stream data in real-time as it's generated
- Inspect historical stream data for completed runs
- Debug streaming issues with full visibility into chunk delivery