ci(integration): update integration LLM matrix to gpt-5.2-codex #1503
Conversation
…place GPT-5.1 Codex Max with GPT-5.2 Codex in the integration-runner.yml matrix so CI exercises the latest Codex family.

Co-authored-by: openhands <[email protected]>
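The change amounts to swapping one model entry in the workflow's LLM matrix. A rough sketch of what that could look like (key names and surrounding structure are assumptions; the actual integration-runner.yml may differ):

```yaml
# Hypothetical sketch of the matrix entry swap in integration-runner.yml.
# Key names and structure are illustrative, not copied from the repo.
jobs:
  integration-tests:
    strategy:
      matrix:
        llm:
          # - litellm_proxy/gpt-5.1-codex-max   # removed
          - litellm_proxy/gpt-5.2-codex         # added
```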
all-hands-bot
left a comment
Thanks!
xingyaoww
left a comment
Thanks!! (I was messing around with the bot account and forgot to log out 😓 )
Hi! I started running the integration tests on your PR. You will receive a comment with the results shortly.
🧪 Integration Tests Results

Overall Success Rate: 94.1%

📁 Detailed Logs & Artifacts

Click the links below to access detailed agent/LLM logs showing the complete reasoning process for each model. On the GitHub Actions page, scroll down to the 'Artifacts' section to download the logs.
📊 Summary
📋 Detailed Results

litellm_proxy_mistral_devstral_2512
Skipped Tests:
Failed Tests:

litellm_proxy_moonshot_kimi_k2_thinking
Skipped Tests:

litellm_proxy_deepseek_deepseek_chat
Skipped Tests:

litellm_proxy_claude_sonnet_4_5_20250929

litellm_proxy_gpt_5.1_codex_max
Failed Tests:

litellm_proxy_vertex_ai_gemini_3_pro_preview
We probably need to merge it to test it
Summary
Verification
litellm_proxy/gpt-5.2-codex: the run reached the proxy and attempted to use the model, confirming that the model id resolves at the proxy; however, access to gpt-5.2-codex may depend on environment credentials. In CI, the integration workflow uses the managed LLM proxy and key.

Notes
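A quick local check of whether a model id resolves at an OpenAI-compatible proxy might look like the sketch below. The helper names, endpoint path, and canned data are illustrative assumptions, not part of this repo or its workflow:

```python
# Hypothetical sketch: check whether a model id is advertised by an
# OpenAI-compatible LLM proxy via its /v1/models endpoint. The URL,
# key handling, and helper names are assumptions for illustration.
import json
import urllib.request

def model_available(model_id, model_ids):
    """Return True if model_id appears in the proxy's advertised list."""
    return model_id in set(model_ids)

def list_proxy_models(base_url, api_key):
    """Fetch model ids from an OpenAI-compatible /v1/models endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

if __name__ == "__main__":
    # Offline example with a canned list; in CI the managed proxy and
    # key would be queried instead of using hard-coded data.
    advertised = ["gpt-5.2-codex", "devstral-2512"]
    print(model_available("gpt-5.2-codex", advertised))  # True
```

Note that a model id resolving at the proxy does not guarantee the backing provider grants access; that is exactly the credentials caveat above.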
Co-authored-by: openhands [email protected]
@enyst can click here to continue refining the PR
Agent Server images for this PR
• GHCR package: https://github.com/OpenHands/agent-sdk/pkgs/container/agent-server
Variants & Base Images
eclipse-temurin:17-jdk
nikolaik/python-nodejs:python3.12-nodejs22
golang:1.21-bookworm

Pull (multi-arch manifest)

# Each variant is a multi-arch manifest supporting both amd64 and arm64
docker pull ghcr.io/openhands/agent-server:7743988-python

Run
All tags pushed for this build
About Multi-Architecture Support
Each versioned tag (e.g. 7743988-python) is a multi-arch manifest supporting both amd64 and arm64; architecture-specific tags (e.g. 7743988-python-amd64) are also available if needed.