Summary
When appending object payloads to a realtime stream with stream.append(...), useRealtimeStream(...) receives chunks as "[object Object]" (or otherwise non-typed), and the Trigger dashboard also shows "[object Object]".
This breaks typed stream consumption and can cause UI progress/completion logic to miss expected events.
Versions
- `@trigger.dev/sdk`: 4.4.1
- `@trigger.dev/react-hooks`: 4.4.1
- Observed in Trigger Cloud + React frontend consumer
Minimal Repro
1) Define a typed object stream
```ts
import { streams, task } from "@trigger.dev/sdk";

const eventStream = streams.define<{ type: string; step: string }>({
  id: "repro-object-stream",
});

export const reproTask = task({
  id: "repro-task",
  run: async () => {
    await eventStream.append({ type: "stage_started", step: "one" });
    await eventStream.append({ type: "stage_completed", step: "done" });
    return { ok: true };
  },
});
```

2) Subscribe in React

```ts
import { useRealtimeStream } from "@trigger.dev/react-hooks";

const { parts } = useRealtimeStream(eventStream, runId, { accessToken });
console.log(parts);
```

Observed
- Trigger dashboard stream entries show `[object Object]`.
- The frontend receives string chunks like `[object Object]` instead of typed objects.
- Consumers relying on the object shape (e.g. `part.type`) fail.
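The consumer failure mode can be sketched as follows; `isComplete` and the event names are assumptions based on the repro above, not SDK APIs:

```ts
type StreamEvent = { type: string; step: string };

// Completion detection that expects typed object parts.
function isComplete(parts: unknown[]): boolean {
  return parts.some(
    (p) =>
      typeof p === "object" &&
      p !== null &&
      (p as StreamEvent).type === "stage_completed"
  );
}

// With the bug, parts arrive as strings like "[object Object]",
// so the typeof check never matches and isComplete(...) stays false.
```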
Expected
- Object payloads appended via `append(...)` should round-trip as objects to `useRealtimeStream`.
- The dashboard should display object payloads meaningfully (or at least preserve the JSON shape).
Suspected Cause
In the installed `@trigger.dev/core@4.4.1`, `appendToStream` appears to send the raw `body: part` (without JSON serialization), so fetch body handling can coerce object payloads to `[object Object]`.
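A minimal illustration of the suspected coercion, assuming the body ends up going through JavaScript's default string conversion rather than JSON serialization:

```ts
const part = { type: "stage_started", step: "one" };

// Default string conversion of a plain object yields exactly the
// observed chunk content:
console.log(String(part)); // "[object Object]"

// JSON serialization preserves the payload shape instead:
console.log(JSON.stringify(part)); // {"type":"stage_started","step":"one"}
```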
Temporary Workaround
Manually serializing before append (e.g. `JSON.stringify(...)`) and normalizing on read avoids the issue, but this defeats the ergonomics of typed object streams.
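A sketch of that workaround; `serializeEvent` and `normalizeChunk` are hypothetical helpers written for this report, not SDK APIs:

```ts
type StreamEvent = { type: string; step: string };

// Serialize before calling append(...) so the body is a plain string.
function serializeEvent(event: StreamEvent): string {
  return JSON.stringify(event);
}

// Normalize on read: parse string chunks back into objects, and drop
// anything unparseable (e.g. the "[object Object]" chunks from the bug).
function normalizeChunk(chunk: unknown): StreamEvent | null {
  if (typeof chunk !== "string") return chunk as StreamEvent;
  try {
    return JSON.parse(chunk) as StreamEvent;
  } catch {
    return null;
  }
}
```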
Extra Context
I can provide a full repo repro if useful, but the minimal snippet above should reproduce.