ci: 🤖 Update test matrix with new releases (01/19) #5330
Conversation
Semver Impact of This PR: 🟢 Patch (bug fixes)

📋 Changelog Preview (this is how your changes will appear in the changelog):
New Features ✨
Bug Fixes 🐛
- Integrations: Litellm
- Other
Documentation 📚
Internal Changes 🔧
- Release
- Other

🤖 This preview updates automatically when you update the PR.
# ~~~ Agents ~~~
openai_agents-v0.0.19: openai-agents==0.0.19
openai_agents-v0.2.11: openai-agents==0.2.11
openai_agents-v0.4.2: openai-agents==0.4.2
Bug: The new PySpark test environments for versions 3.5.8 and 4.1.1 are missing the required JAVA_HOME environment variable configuration in tox.ini.
Severity: CRITICAL
Suggested Fix
Update tox.ini to include the new PySpark versions in the setenv section. Add 3.5.8 to the existing Java 11 configuration and create a new entry for 4.1.1 pointing to a Java 17 installation. For example: spark-v{3.0.3,3.5.6,3.5.8}: JAVA_HOME=... and spark-v4.1.1: JAVA_HOME=... (see the sketch below).
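A minimal sketch of what the updated setenv block could look like, following the factor-conditional pattern described above. The JVM paths and the {env:...} fallback variables are assumptions (typical Debian/Ubuntu OpenJDK locations), not values taken from this repository; substitute whatever the CI image actually provides.

setenv =
    # Extend the existing Java 11 factor line with the new 3.5.8 release
    # (paths below are assumed Debian/Ubuntu OpenJDK locations)
    spark-v{3.0.3,3.5.6,3.5.8}: JAVA_HOME={env:JAVA_HOME_11:/usr/lib/jvm/java-11-openjdk-amd64}
    # PySpark 4.x requires Java 17 or later, so it gets its own factor line
    spark-v4.1.1: JAVA_HOME={env:JAVA_HOME_17:/usr/lib/jvm/java-17-openjdk-amd64}

As a quick local check, assuming the environment name matches the matrix, running tox -e spark-v4.1.1 should then resolve a JVM instead of failing with 'Java not found'.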
Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not valid.
Location: tox.ini#L400
Potential issue: The `tox.ini` file adds new test environments for PySpark `v3.5.8` and `v4.1.1`. However, the configuration that sets the `JAVA_HOME` environment variable, which PySpark needs to locate the Java runtime, has not been updated for these new versions. PySpark `v3.5.8` requires Java 8/11/17, and `v4.1.1` requires Java 17 or later. Without `JAVA_HOME` set, the integration tests for these new Spark versions will fail in the CI pipeline with 'Java not found' errors, blocking the pull request.
Update our test matrix with new releases of integrated frameworks and libraries.
How it works
Action required
🤖 This PR was automatically created using a GitHub action.