test(openai-agents): Replace mocks with httpx types for streamed responses#5580

Open
alexander-alderman-webb wants to merge 7 commits into master from webb/remove-streamed-mocks

Conversation

@alexander-alderman-webb
Contributor

@alexander-alderman-webb alexander-alderman-webb commented Mar 3, 2026

Description

Replace mocks with openai types to avoid test failures when library internals change.

Move async_iterator() to conftest.py


@alexander-alderman-webb alexander-alderman-webb requested a review from a team as a code owner March 3, 2026 10:08
@github-actions
Contributor

github-actions bot commented Mar 3, 2026

Codecov Results 📊

11 passed | Total: 11 | Pass Rate: 100% | Execution Time: 1m 41s

All tests passed.

✅ Patch coverage is 100.00%. Project has 15081 uncovered lines.


Generated by Codecov Action

@github-actions
Contributor

github-actions bot commented Mar 3, 2026

Semver Impact of This PR

🟢 Patch (bug fixes)

📋 Changelog Preview

This is how your changes will appear in the changelog.
Entries from this PR are highlighted with a left border (blockquote style).


Bug Fixes 🐛

  • (celery) Propagate user-set headers by sentrivana in #5581
  • (utils) Avoid double serialization of strings in safe_serialize by ericapisani in #5587

Documentation 📚

  • (openai-agents) Remove inapplicable comment by alexander-alderman-webb in #5495
  • Add AGENTS.md by sentrivana in #5579
  • Add set_attribute example to changelog by sentrivana in #5578

Internal Changes 🔧

Openai Agents

  • Replace mocks with httpx types for streamed responses by alexander-alderman-webb in #5580
  • Expect namespace tool field for new openai versions by alexander-alderman-webb in #5599

🤖 This preview updates automatically when you update the PR.


Async generator passed to wrong httpx.Response parameter

High Severity

The httpx.Response constructor's content parameter expects bytes, but the code passes an async generator (async_iterator(sse_chunks(...))) to it. For streaming content, the correct parameter is stream, which accepts an AsyncByteStream (async generators satisfy this via __aiter__). The openai-python test pattern that this PR claims to follow actually uses stream=MockStream(body), not content=. Passing a non-bytes object to content will cause the tests to fail — either immediately at construction time with a TypeError, or at read time when httpx tries to treat the async generator as bytes.
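The distinction the comment draws — an async generator is not bytes, but it does satisfy async iteration — can be illustrated without httpx at all. A minimal sketch, assuming an `async_iterator` helper shaped like the one this PR moves to conftest.py (the chunk values here are made-up SSE-style frames):

```python
import asyncio


async def async_iterator(chunks):
    # Async generator: yields each bytes chunk in turn. Because it
    # implements __aiter__, a stream consumer (such as an httpx
    # AsyncByteStream reader) can iterate it, but it is NOT a bytes
    # object, which is why passing it as `content=` breaks.
    for chunk in chunks:
        yield chunk


async def read_stream(stream):
    # Consume the stream the way an HTTP client would: collect the
    # yielded byte chunks into a single response body.
    body = b""
    async for chunk in stream:
        body += chunk
    return body


body = asyncio.run(
    read_stream(async_iterator([b"data: {}\n\n", b"data: [DONE]\n\n"]))
)
print(body)  # the reassembled byte stream
```

The same `async for` loop is, in effect, what a stream-aware reader performs on the `stream=` argument; no such protocol applies to `content=`, which is read as a plain byte payload.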


Contributor Author


duplicate of #5580 (comment)

@alexander-alderman-webb alexander-alderman-webb changed the title from "test(openai-agents): Replace mocks with library types for streamed responses" to "test(openai-agents): Replace mocks with httpx types for streamed responses" on Mar 3, 2026
Member

@ericapisani ericapisani left a comment


One nit, otherwise LGTM

assert ai_client_span2["data"]["gen_ai.usage.total_tokens"] == 25


def sse_chunks(events):
Member


Nit: I'm assuming sse = "server-sent events"? For easier readability, could we expand the name of this function?
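For context on the naming question above: server-sent events are a line-based wire format in which each event is framed as `data: <payload>` followed by a blank line. A hypothetical sketch of what a helper like `sse_chunks` might produce (the function name and event shape here are illustrative, not the PR's actual implementation):

```python
import json


def server_sent_event_chunks(events):
    # Hypothetical helper: encode each event payload as one SSE frame,
    # "data: <json>\n\n" as bytes -- the wire format a streaming
    # responses endpoint emits chunk by chunk.
    return [
        ("data: " + json.dumps(event) + "\n\n").encode("utf-8")
        for event in events
    ]


chunks = server_sent_event_chunks([{"type": "response.completed"}])
print(chunks[0])
```

Spelling out the name this way (or a docstring on `sse_chunks`) would make the framing obvious at the call site.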


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

Autofix Details

Bugbot Autofix prepared a fix for the issue found in the latest run.

  • ✅ Fixed: Unused imports added to test file
    • Removed the four unused imports from the test module to eliminate dead dependencies and keep imports accurate.

Create PR

Or push these changes by commenting:

@cursor push 8b145826b2
Preview (8b145826b2)
diff --git a/tests/integrations/openai_agents/test_openai_agents.py b/tests/integrations/openai_agents/test_openai_agents.py
--- a/tests/integrations/openai_agents/test_openai_agents.py
+++ b/tests/integrations/openai_agents/test_openai_agents.py
@@ -15,7 +15,7 @@
 from sentry_sdk.integrations.openai_agents.utils import _set_input_data, safe_serialize
 from sentry_sdk.utils import parse_version
 
-from openai import AsyncOpenAI, AsyncStream
+from openai import AsyncOpenAI
 from agents.models.openai_responses import OpenAIResponsesModel
 
 from unittest import mock
@@ -44,16 +44,12 @@
     ResponseCompletedEvent,
     Response,
     ResponseUsage,
-    ResponseStreamEvent,
 )
 from openai.types.responses.response_usage import (
     InputTokensDetails,
     OutputTokensDetails,
 )
 
-from openai._response import AsyncAPIResponse
-from openai._models import FinalRequestOptions
-
 test_run_config = agents.RunConfig(tracing_disabled=True)
 
 EXAMPLE_RESPONSE = Response(


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.

