Conversation

@mecampbellsoup

Summary

The OpenAI SDK v1+ returns Pydantic model instances (e.g., ResponseFunctionToolCall) which are not directly JSON serializable. When these objects are passed to truncate_messages_by_size(), the json.dumps() call fails with a TypeError:

TypeError: Object of type ResponseFunctionToolCall is not JSON serializable

The _normalize_data() helper already exists in this file and properly handles Pydantic objects by calling .model_dump(), but it wasn't being used in truncate_messages_by_size().
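A normalizer of this shape can be sketched as follows. This is a hypothetical reimplementation for illustration only, not Sentry's actual _normalize_data (the real helper lives in the integration and takes an unpack argument):

```python
from typing import Any


def normalize_data(data: Any) -> Any:
    """Recursively convert Pydantic models into JSON-compatible values.

    Illustrative stand-in for Sentry's internal _normalize_data helper.
    """
    # Pydantic v2 models expose model_dump(); recurse in case the dumped
    # dict still contains nested model instances.
    if hasattr(data, "model_dump"):
        return normalize_data(data.model_dump())
    if isinstance(data, dict):
        return {k: normalize_data(v) for k, v in data.items()}
    if isinstance(data, (list, tuple)):
        return [normalize_data(v) for v in data]
    return data
```

After this pass, json.dumps() can serialize the result because only dicts, lists, and scalars remain.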

The Fix

This PR adds normalization at the start of truncate_messages_by_size() to ensure all Pydantic objects are converted to JSON-compatible dicts before serialization:

# Normalize messages to ensure JSON serialization works
# (handles Pydantic objects from OpenAI SDK v1+)
messages = _normalize_data(messages, unpack=False)

Reproduction

When using the OpenAI SDK's Responses API with tool calls and Sentry's OpenAI integration enabled:

import sentry_sdk
from openai import OpenAI
from openai.types.responses import ResponseFunctionToolCall

# Sentry's OpenAI integration is enabled automatically once the SDK is
# initialized; it serializes message payloads for tracing.
sentry_sdk.init(traces_sample_rate=1.0)

client = OpenAI()
response = client.responses.create(
    model="gpt-4o",
    input=[
        {"role": "user", "content": "What's the weather?"},
        {
            "role": "assistant",
            "content": [ResponseFunctionToolCall(
                type="function_call",
                id="call_123",
                call_id="call_123",
                name="get_weather",
                arguments='{"location": "NYC"}'
            )]
        }
    ]
)

This crashes with the TypeError because Sentry's integration tries to serialize the messages for tracing.
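Until a release containing the fix is available, callers can avoid the crash by converting SDK model instances to plain dicts with model_dump() before passing them as input, so the integration never sees a Pydantic object. A runnable sketch using a stand-in class (so it does not require the OpenAI SDK):

```python
import json


# Stand-in for an SDK Pydantic model such as ResponseFunctionToolCall;
# real OpenAI SDK v1+ response types expose the same model_dump() method.
class FakeToolCall:
    def model_dump(self):
        return {
            "type": "function_call",
            "id": "call_123",
            "call_id": "call_123",
            "name": "get_weather",
            "arguments": '{"location": "NYC"}',
        }


tool_call = FakeToolCall()

# Passing tool_call.model_dump() instead of the model instance means the
# instrumentation only ever sees JSON-compatible dicts.
messages = [
    {"role": "user", "content": "What's the weather?"},
    {"role": "assistant", "content": [tool_call.model_dump()]},
]

json.dumps(messages)  # succeeds; no Pydantic objects remain
```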

Fixes #5350

@mecampbellsoup mecampbellsoup requested a review from a team as a code owner January 20, 2026 17:42
@github-actions
Contributor

Semver Impact of This PR

🟢 Patch (bug fixes)

📋 Changelog Preview

This is how your changes will appear in the changelog.
Entries from this PR are highlighted with a left border (blockquote style).


Bug Fixes 🐛

  • fix(openai): normalize Pydantic objects before JSON serialization in truncate_messages_by_size by mecampbellsoup in #5351

Internal Changes 🔧

  • ci: Fix path in AI integration tests by alexander-alderman-webb in #5347

🤖 This preview updates automatically when you update the PR.

@alexander-alderman-webb
Contributor

Hi @mecampbellsoup,

Thanks for raising the bug. We're adjusting the trimming logic for the next release, so there will no longer be any serialization in the integrations. See #5335 for details.



Development

Successfully merging this pull request may close these issues.

OpenAI Integration: truncate_messages_by_size fails on Pydantic SDK objects (e.g., ResponseFunctionToolCall)
