
fix(memory): add Bun.gc, stream cancellation, and unconsumed fetch drains #3416

Merged
waleedlatif1 merged 10 commits into staging from waleedlatif1/richmond-v1
Mar 5, 2026
Conversation

@waleedlatif1
Collaborator

Summary

  • Add periodic Bun.gc(true) to force JSC GC + mimalloc page purge every 60s
  • Fix stream.tee() error paths to cancel streams instead of just releasing locks
  • Fix O(n²) string concatenation in consumeExecutorStream (chunks array + join)
  • Drain unconsumed fetch response bodies in 6 error paths (tts, stt, vision, textract, gmail)
  • Add preloadEntriesOnStart: false to reduce baseline memory
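The chunks-array fix and the TextDecoder handling described above can be sketched roughly as follows. This is an illustrative stand-in, not the actual consumeExecutorStream implementation; the function name and signature are assumptions.

```typescript
// Illustrative sketch: buffer decoded chunks in an array and join once,
// instead of repeated string concatenation (which is O(n²) in total work).
async function consumeStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader()
  const decoder = new TextDecoder()
  const chunks: string[] = []
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      // stream: true lets multi-byte characters span chunk boundaries
      chunks.push(decoder.decode(value, { stream: true }))
    }
    // flush any buffered partial character (the "tail flush")
    chunks.push(decoder.decode())
    return chunks.join('')
  } finally {
    // cancel() releases buffered data on tee'd streams; releaseLock() alone would not
    await reader.cancel().catch(() => {})
  }
}
```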

Type of Change

  • Bug fix

Testing

All 4216 tests pass. Verified tsc --noEmit clean.

Checklist

  • Code follows project style guidelines
  • Self-reviewed my changes
  • Tests added/updated and passing
  • No new warnings introduced
  • I confirm that I have read and agree to the terms outlined in the Contributor License Agreement (CLA)


cursor bot commented Mar 5, 2026

PR Summary

Medium Risk
Touches streaming execution and execution-state cleanup paths; while changes are mostly error-path resource handling, incorrect cancellation/cleanup ordering could affect chat/workflow streaming completion behavior.

Overview
Improves resource cleanup to mitigate memory growth by explicitly draining/cancelling failed fetch responses (STT/Textract/Vision/TTS + Gmail) and cancelling stream readers/processed streams in multiple streaming paths (workflow SSE block streaming and executor stream.tee() handling) instead of just releasing locks.

Optimizes executor stream capture by replacing incremental string concatenation with chunk buffering + join(), adds periodic opportunistic Bun.gc(false) in memory telemetry, and adjusts chat/workflow execution state handling so chat-driven executions defer isExecuting cleanup until the client-side stream wrapper fully finishes.

Also disables Next.js experimental.preloadEntriesOnStart, relaxes ServiceNow offset query param handling, and bumps turbo to 2.8.13.
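The opportunistic GC tick described above can be sketched as below. Note the PR description says Bun.gc(true) while the bot summaries describe Bun.gc(false); this sketch parameterizes the force flag rather than resolving that. The helper name and the telemetry wiring are hypothetical, and the guard makes the call a no-op outside Bun.

```typescript
// Bun's global is absent on Node, so declare it as possibly undefined.
declare const Bun: { gc(force: boolean): void } | undefined

// Request a GC pass if running on Bun; returns whether one was requested.
function maybeOpportunisticGc(force = false): boolean {
  if (typeof Bun !== 'undefined' && typeof Bun?.gc === 'function') {
    // force=false asks for a non-blocking, opportunistic collection;
    // force=true is a full synchronous collection and can stall the event loop.
    Bun.gc(force)
    return true
  }
  return false
}

// Hypothetical wiring inside a telemetry loop:
// setInterval(() => { recordMemoryTelemetry(); maybeOpportunisticGc() }, 60_000)
```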

Written by Cursor Bugbot for commit 72eb892.


vercel bot commented Mar 5, 2026

The latest updates on your projects.

Project | Deployment | Actions | Updated (UTC)
docs | Ready | Preview, Comment | Mar 5, 2026 1:40am


greptile-apps bot commented Mar 5, 2026

Greptile Summary

This PR bundles a cluster of memory and resource-leak fixes for the Bun/Next.js server: it adds periodic opportunistic GC, drains six unconsumed fetch response bodies in error paths, fixes O(n²) string concatenation in the executor stream consumer, and replaces releaseLock() with cancel() for tee'd stream cleanup. Most significantly, it repairs a race condition in chat executions: onExecutionCompleted set isExecuting to false before the client-side stream was fully consumed, which triggered a (now removed) useEffect that prematurely cancelled the in-progress stream.

Key changes:

  • memory-telemetry.ts: Adds Bun.gc(false) (non-blocking, opportunistic) every 60 s inside the telemetry interval.
  • block-executor.ts: Replaces O(n²) fullContent += with a chunks[] array, adds the missing TextDecoder tail flush, and uses reader.cancel() instead of reader.releaseLock() in finally blocks to release tee'd buffers on error paths.
  • use-workflow-execution.ts: Guards setIsExecuting(false) / setActiveBlocks in onExecutionCompleted, onExecutionError, and onExecutionCancelled with !isExecutingFromChat, deferring cleanup to the ReadableStream start() finally block after all stream data is enqueued.
  • chat.tsx: Removes the useEffect that was the root cause of premature chat-stream cancellation (it fired when isExecuting toggled false, cancelling the reader before all content was delivered).
  • Six error-path fetch drains (tts, stt, vision, textract, gmail-polling-service, gmail/utils): Prevents HTTP connection pool exhaustion by consuming or aborting response bodies before throwing.
  • next.config.ts: Adds preloadEntriesOnStart: false to reduce startup memory.
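The error-path drain pattern from the six fetch fixes can be sketched as follows. The function name and its use here are hypothetical stand-ins for the actual tts/stt/vision/textract/gmail code.

```typescript
// Illustrative sketch: on a non-OK response, cancel the unread body
// before throwing, so the underlying connection can be reused or closed
// instead of lingering in the pool with buffered data.
async function fetchProviderText(url: string): Promise<string> {
  const res = await fetch(url)
  if (!res.ok) {
    await res.body?.cancel().catch(() => {})
    throw new Error(`Provider request failed with status ${res.status}`)
  }
  return res.text()
}
```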

Remaining concern:

  • processStreamingResponse in chat.tsx does not call reader.cancel() in its finally block. With the useEffect removed, there is no longer a fallback path to release the reader lock on non-AbortError exceptions — the lock is held until the objects are garbage collected.

Confidence Score: 4/5

  • Safe to merge — the core logic is correct and well-tested; one minor resource-cleanup gap in processStreamingResponse remains.
  • The race-condition fix in chat execution state management is logically sound: the stream's start() finally block reliably calls setIsExecuting(false) in all exit paths (normal, error, cancel). All tee'd stream buffers are properly released on error paths. The one non-critical issue is the missing reader.cancel() in processStreamingResponse's finally block, which was previously covered by the now-removed useEffect. This is a low-severity resource-management gap (objects are GC'd eventually) but worth addressing given the PR's memory-focused goals.
  • apps/sim/app/workspace/[workspaceId]/w/[workflowId]/components/chat/chat.tsx — processStreamingResponse finally block should release the reader lock explicitly.

Important Files Changed

Filename Overview
apps/sim/executor/execution/block-executor.ts Fixes O(n²) string concat in consumeExecutorStream (chunks array + join), adds TextDecoder tail flush, replaces releaseLock() with cancel() in finally blocks, and adds processedStream.cancel() in error paths to release tee'd buffers.
apps/sim/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-workflow-execution.ts Adds isExecutingFromChat guards to defer setIsExecuting(false) from SSE callbacks to the stream's start() finally block, preventing premature stream cancellation for chat executions. Logic is sound but there is a small window where active blocks remain set for chat executions.
apps/sim/app/workspace/[workspaceId]/w/[workflowId]/components/chat/chat.tsx Removes the useEffect that prematurely cancelled chat streams when isExecuting became false. The finally block in processStreamingResponse does not release the reader lock on non-abort exceptions, which previously the removed useEffect handled as a side-effect.
apps/sim/lib/monitoring/memory-telemetry.ts Adds opportunistic Bun.gc(false) call (non-blocking) inside the telemetry interval. Correctly uses the non-blocking form to avoid event loop stalls.
apps/sim/app/api/workflows/[id]/execute/route.ts Replaces reader.releaseLock() with await reader.cancel().catch(() => {}) in the SSE streaming finally block, consistent with block-executor.ts pattern.
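The releaseLock() to cancel() change that the table above describes can be illustrated with a small helper. This is a sketch, not the block-executor.ts code; the function name and callback are assumptions.

```typescript
// Illustrative sketch: read one branch of a tee'd stream and always
// cancel the reader on exit. cancel() signals no more reads are coming,
// letting the tee discard buffered chunks; releaseLock() alone would
// leave them queued until garbage collection.
async function readBranchSafely(
  branch: ReadableStream<string>,
  onChunk: (chunk: string) => void
): Promise<void> {
  const reader = branch.getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      onChunk(value)
    }
  } finally {
    await reader.cancel().catch(() => {})
  }
}
```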

Sequence Diagram

sequenceDiagram
    participant Chat as chat.tsx
    participant Hook as use-workflow-execution.ts
    participant Stream as ReadableStream start()
    participant Exec as executeWorkflow (SSE)
    participant Server as /api/workflows/execute

    Chat->>Hook: handleRunWorkflow(chatInput)
    Hook->>Stream: new ReadableStream({ start() })
    Hook-->>Chat: { success: true, stream }
    Chat->>Chat: processStreamingResponse(stream)

    Stream->>Exec: executeWorkflow('chat')
    Exec->>Server: SSE connection
    Server-->>Exec: stream:chunk events
    Exec-->>Stream: safeEnqueue(chunk) via onStream
    Stream-->>Chat: reader.read() → chunk

    Server-->>Exec: execution:completed event
    Exec->>Hook: onExecutionCompleted (isExecutingFromChat=true → skip setIsExecuting)
    Note over Hook: setIsExecuting NOT called here

    Stream->>Stream: await Promise.all(streamReadingPromises)
    Stream->>Stream: safeEnqueue(final event)
    Stream->>Stream: controller.close()
    Stream->>Hook: finally: setIsExecuting(false)

    Chat->>Chat: reader.read() → done=true
    Chat->>Chat: finalizeMessageStream()
    Note over Chat: isStreaming=false

Last reviewed commit: 72eb892

Comments Outside Diff (1)

  1. apps/sim/app/workspace/[workspaceId]/w/[workflowId]/components/chat/chat.tsx

    Stream reader lock not released in finally block

    The finally block clears streamReaderRef.current but never calls reader.cancel() or reader.releaseLock(). Before this PR, the now-removed useEffect incidentally released the lock by calling streamReaderRef.current?.cancel() when execution ended. Without that fallback, if processStreamingResponse exits via a non-AbortError exception, the reader holds the tee'd stream's lock until the component unmounts or JavaScript GC collects both objects — preventing prompt buffer reclamation.

    Consider adding an explicit reader.cancel().catch(() => {}) in the finally block to always release the lock immediately:
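A hedged sketch of that suggestion is shown below. This processStreamingResponse is a simplified stand-in for the chat.tsx function, with the message handling reduced to a hypothetical onChunk callback.

```typescript
// Illustrative sketch: cancel the reader in finally so the lock is
// released and any buffered tee'd data is dropped on every exit path,
// not just the happy one.
async function processStreamingResponse(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void
): Promise<void> {
  const reader = stream.getReader()
  const decoder = new TextDecoder()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      onChunk(decoder.decode(value, { stream: true }))
    }
  } finally {
    await reader.cancel().catch(() => {})
  }
}
```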

@waleedlatif1
Collaborator Author

@greptile

@waleedlatif1
Collaborator Author

@cursor review

@waleedlatif1
Collaborator Author

@greptile

@waleedlatif1
Collaborator Author

@cursor review

@waleedlatif1
Collaborator Author

@greptile

@waleedlatif1
Collaborator Author

@cursor review

@cursor cursor bot left a comment

✅ Bugbot reviewed your changes and found no new issues!

@cursor cursor bot left a comment

✅ Bugbot reviewed your changes and found no new issues!

waleedlatif1 force-pushed the waleedlatif1/richmond-v1 branch from dbf94d1 to ca16db0 (March 5, 2026 01:16)
The onExecutionCompleted/Error/Cancelled callbacks were setting
isExecuting=false as soon as the server-side SSE stream completed.
For chat executions, this triggered a useEffect in chat.tsx that
cancelled the client-side stream reader before it finished consuming
buffered data — causing empty or partial chat responses.

Skip the isExecuting=false in these callbacks for chat executions
since the chat's own finally block handles cleanup after the stream
is fully consumed.
…tate change

The effect reacted to isExecuting becoming false to clean up streams,
but this is an anti-pattern per React guidelines — using state changes
as a proxy for events. All cleanup cases are already handled by proper
event paths: stream done (processStreamingResponse), user cancel
(handleStopStreaming), component unmount (cleanup effect), and
abort/error (catch block).
@waleedlatif1
Collaborator Author

@greptile

@waleedlatif1
Collaborator Author

@cursor review

@cursor cursor bot left a comment

✅ Bugbot reviewed your changes and found no new issues!

waleedlatif1 merged commit fcdcaed into staging Mar 5, 2026; 12 checks passed.
waleedlatif1 deleted the waleedlatif1/richmond-v1 branch March 5, 2026 01:46.
Comment on lines +1501 to +1504
if (!isExecutingFromChat) {
  setIsExecuting(activeWorkflowId, false)
  setActiveBlocks(activeWorkflowId, new Set())
}

setActiveBlocks still not cleared for chat executions on onExecutionCompleted

When isExecutingFromChat is true, setActiveBlocks(activeWorkflowId, new Set()) is skipped here and is deferred to the stream start() finally block (line 1042). However, there is a window between when onExecutionCompleted fires and when start() reaches its finally block (e.g., while await Promise.all(streamReadingPromises) is still running). During that window, the active blocks set remains non-empty in the execution store, which may cause the UI to continue showing block highlights after the server-side execution is done.

If clearing active blocks earlier is acceptable for the chat case, consider calling setActiveBlocks unconditionally here just like setExecutionResult is, since block highlighting doesn't depend on the client stream being fully consumed.
