
Commit e58f9ae

chore(examples): bump module_client.py to gpt-5.5
The module-level client example still uses bare gpt-4. gpt-5.5 is the current default text and reasoning model per the openai-cookbook ("use gpt-5.5 for the strongest code review accuracy") and the openai-agents-python docs (voice/quickstart.md, reasoning_content example), and is documented as the latest default in openai/codex's latest-model.md. The model: parameter is typed Union[str, ChatModel], so this works even though Stainless has not yet synced gpt-5.5 into the ChatModel literal here. The example demonstrates module-level client configuration; the model choice is incidental.
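To illustrate the typing point in the message above, here is a minimal sketch of why a Union[str, ChatModel] parameter accepts a model name that is missing from the generated literal. The ChatModel alias and create function below are hypothetical, narrowed stand-ins for illustration only, not the SDK's actual definitions:

```python
from typing import Literal, Union

# Hypothetical stand-in for the SDK's generated ChatModel literal,
# imagined here as not yet including "gpt-5.5".
ChatModel = Literal["gpt-4", "gpt-4o"]


def create(*, model: Union[str, ChatModel]) -> str:
    # The str arm of the Union means any plain string type-checks,
    # so model names newer than the generated literal pass through.
    return model


print(create(model="gpt-5.5"))  # accepted despite being absent from ChatModel
```

This is why Stainless lagging behind on the literal does not break callers: the union degrades gracefully to accepting arbitrary strings.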
1 parent: 38d75d7

1 file changed

Lines changed: 1 addition & 1 deletion

File tree

examples/module_client.py

@@ -9,7 +9,7 @@
 
 # all API calls work in the exact same fashion as well
 stream = openai.chat.completions.create(
-    model="gpt-4",
+    model="gpt-5.5",
     messages=[
         {
             "role": "user",

0 commit comments
