
AssistantAgent with OpenRouter can't function call if there is a output_content_type #7132

@ChenFryd

Description

What happened?

Describe the bug
OpenRouter models can't produce function calls when structured output (`output_content_type`) is set.

To Reproduce

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_core.models import ModelFamily

# Define a model client. You can use any other model client that implements
# the `ChatCompletionClient` interface.
from autogen_core.tools import FunctionTool
from dotenv import load_dotenv
import os
load_dotenv()
from pydantic import BaseModel, Field

class weather(BaseModel):
    city: str = Field(
        ...,
        description="The city we get weather from"
    )
    temperature: int = Field(
        ...,
        description="Temperature in Fahrenheit"
    )

model_client = OpenAIChatCompletionClient(
    base_url=os.getenv("OPEN_ROUTER_BASE_URL"),
    api_key=os.getenv("OPEN_ROUTER_API_KEY"),
    model="openai/gpt-oss-20b:free",
    model_info={
        "vision": False,
        "function_calling": True,
        "json_output": True,
        "family": ModelFamily.UNKNOWN,
        "structured_output": True,
    },
)


# Define a simple function tool that the agent can use.
# For this example, we use a fake weather tool for demonstration purposes.
async def get_weather(city: str) -> str:
    """Get the weather for a given city."""
    return f"The weather in {city} is 73 degrees and Sunny."


# Define an AssistantAgent with the model, tool, system message, and reflection enabled.
# The system message instructs the agent via natural language.

agent = AssistantAgent(
    name="weather_agent",
    model_client=model_client,
    tools=[FunctionTool(get_weather, description="get weather", strict=True)],
    output_content_type=weather,
    system_message="You are a helpful assistant. Use your function calls",
    reflect_on_tool_use=True,
    model_client_stream=True,  # Enable streaming tokens from the model client.
)


# Run the agent and stream the messages to the console.
async def main() -> None:
    await Console(agent.run_stream(task="What is the weather in New York?"))
    # Close the connection to the model client.
    await model_client.close()


# NOTE: if running this inside a Python script you'll need to use asyncio.run(main()).
import asyncio
asyncio.run(main())

Expected behavior
The agent should call the `get_weather` tool and return
`{"city":"New York","temperature":73}`

Instead, it never calls the tool, so it returns an arbitrary temperature.
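For reference, a minimal sketch of how the expected final reply could be validated by hand (this is a hypothetical stdlib-only helper I wrote for illustration, not part of AutoGen): one possible interim workaround is to leave `output_content_type` off so the tool call goes through, then check the agent's final JSON reply against the expected shape yourself.

```python
import json


def parse_weather_reply(raw: str) -> dict:
    """Validate the agent's final JSON reply against the expected weather shape.

    Hypothetical helper for manual validation when `output_content_type`
    is not set; raises ValueError if the shape does not match.
    """
    data = json.loads(raw)
    if not isinstance(data.get("city"), str):
        raise ValueError("missing or non-string 'city'")
    if not isinstance(data.get("temperature"), int):
        raise ValueError("missing or non-integer 'temperature'")
    return data
```

With the expected reply, `parse_weather_reply('{"city":"New York","temperature":73}')` returns the parsed dict; with the buggy behavior the temperature field would simply be wrong rather than missing, so this only guards the shape, not the value.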

Additional context
gpt-5-nano is able to make this function call with the same setup.

Which packages was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0)

AutoGen library version.

Python dev (main branch)

Other library version.

No response

Model used

gpt-oss-20b, all open source models

Model provider

Other (please specify below)

Other model provider

All Open Source models

Python version

3.12

.NET version

None

Operating system

Ubuntu
