Remove workaround/hook for GROMACS SVE issue #132
Conversation
bot: build repo:eessi.io-2023.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_v1

New job on instance

New job on instance
That worked, so let's now check 2024.3 and 2024.4 for Grace as well, since the hooks had the following condition: […]

bot: build repo:eessi.io-2023.06-software instance:eessi-bot-jsc for:arch=aarch64/nvidia/grace

New job on instance
All builds succeeded, so this confirms that the hook is no longer required. I'll remove it, then this can be merged and deployed, and finally we can rebuild these GROMACS versions in the software layer repo.
Building with the new hooks file for both Neoverse V1 and Grace, to make sure that we don't deploy the GROMACS build from an earlier build job in this PR.

bot: build repo:eessi.io-2023.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_v1

New job on instance

New job on instance
bot: build repo:eessi.io-2025.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_v1

New job on instance
…emove_gromacs_sve_workaround
bot: build repo:eessi.io-2023.06-software instance:eessi-bot-mc-aws for:arch=aarch64/neoverse_v1

New job on instance

New job on instance

New job on instance
ocaisa left a comment:
LGTM
I guess this was accidentally built & deployed twice for EESSI version 2023.06, but no harm done since the […]
It was built twice because I wanted to do test builds with GROMACS for both Neoverse V1 and Grace. To prevent a tarball with GROMACS from getting deployed, I did the final builds for both targets as well; otherwise I would have had to close the PR and make a new one. We could have set one tarball to […]
First testing this for GROMACS 2024.1 on Neoverse V1 + Grace, but the hook was also applied for 2024.3 and 2024.4 on Grace CPUs. So, if this works, I'll do test builds for the latter versions as well, as I'm not sure whether those versions had the same issues as encountered on, for instance, A64FX (if so, they should also be solved with the new easyconfig).
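The hook's actual condition isn't shown in this thread, but its general shape can be sketched as follows. This is a hypothetical reconstruction based only on the comments above (workaround for GROMACS 2024.1 on Neoverse V1 and Grace, and additionally 2024.3/2024.4 on Grace); the mapping and function name are illustrative, not the real EESSI hook code.

```python
# Hypothetical sketch of the (now removed) hook's version/CPU-target check;
# the mapping below is reconstructed from this PR's comments, not from the
# actual EESSI eb_hooks code.
AFFECTED_GROMACS_VERSIONS = {
    'aarch64/neoverse_v1': {'2024.1'},
    'aarch64/nvidia/grace': {'2024.1', '2024.3', '2024.4'},
}

def needs_sve_workaround(cpu_target: str, gromacs_version: str) -> bool:
    """Return True if the SVE workaround would have applied for this
    combination of CPU target and GROMACS version."""
    return gromacs_version in AFFECTED_GROMACS_VERSIONS.get(cpu_target, set())
```

With the fixed easyconfig in place, removing the hook simply means every combination above builds without the workaround, which is what the test builds in this PR verify.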