[MIGraphX EP] [ROCm EP] Update CI to use ROCm 6.3.2 #23535
base: main
Conversation
/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline
/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Win_TRT_Minimal_CUDA_Test_CI, Windows ARM64 QNN CI Pipeline, Windows CPU CI Pipeline,
/azp run Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline
Azure Pipelines successfully started running 6 pipeline(s).
1 similar comment
Azure Pipelines successfully started running 6 pipeline(s).
Azure Pipelines successfully started running 7 pipeline(s).
/azp run Linux MIGraphX CI Pipeline
Azure Pipelines successfully started running 1 pipeline(s).
Need to change the pipeline configuration: onnxruntime/tools/ci_build/github/azure-pipelines/linux-migraphx-ci-pipeline.yml Lines 39 to 42 in 28ee049
Note that the ROCm EP uses a different Dockerfile and yml file. It is fine to only upgrade the MIGraphX CI in this PR.
Updated
/azp run Big Models, Linux Android Emulator QNN CI Pipeline, Linux CPU CI Pipeline, Linux CPU Minimal Build E2E CI Pipeline, Linux GPU CI Pipeline, Linux GPU TensorRT CI Pipeline
/azp run Linux OpenVINO CI Pipeline, Linux QNN CI Pipeline, MacOS CI Pipeline, ONNX Runtime Web CI Pipeline, Win_TRT_Minimal_CUDA_Test_CI, Windows ARM64 QNN CI Pipeline, Windows CPU CI Pipeline,
/azp run Windows GPU CUDA CI Pipeline, Windows GPU DML CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows GPU TensorRT CI Pipeline, Windows x64 QNN CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline
/azp run Linux MIGraphX CI Pipeline
Azure Pipelines successfully started running 6 pipeline(s).
Azure Pipelines successfully started running 1 pipeline(s).
Azure Pipelines successfully started running 6 pipeline(s).
Azure Pipelines successfully started running 7 pipeline(s).
/azp run Linux ROCm CI Pipeline
Azure Pipelines successfully started running 1 pipeline(s).
@TedThemistokleous, the ROCm CI Pipeline failed because cupy does not support ROCm 6.3 yet. I think the solution is to remove cupy from the Dockerfile, then set an environment variable: onnxruntime/onnxruntime/python/tools/kernel_explorer/kernels/utils.py Lines 114 to 115 in d5338da
Updated: it is a little complicated, so I sent a PR to update the ROCm CI: #23542.
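For illustration, a minimal sketch of the kind of environment-variable-controlled fallback being suggested. The helper name `get_array_module` and the variable `KERNEL_EXPLORER_USE_CUPY` are assumptions for this sketch, not the actual names in the referenced utils.py:

```python
import os

import numpy as np


def get_array_module():
    # Hypothetical fallback: prefer cupy when it is installed and enabled,
    # otherwise fall back to NumPy so kernel explorer tests can still run
    # on ROCm versions that cupy does not support yet.
    # KERNEL_EXPLORER_USE_CUPY is an assumed variable name, not the real one.
    if os.environ.get("KERNEL_EXPLORER_USE_CUPY", "1") == "1":
        try:
            import cupy as cp  # only succeeds if a ROCm-compatible cupy is installed

            return cp
        except ImportError:
            pass
    return np
```

With a fallback along these lines, the CI image could drop cupy entirely (or the pipeline could disable it via the variable) until a cupy release that supports ROCm 6.3 is available.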
Description
Update the ONNX Runtime CI to use the latest ROCm release, ROCm 6.3.2.
Motivation and Context
Keep the CI testing up to date with recent ONNX Runtime. AMD validates changes against the latest ROCm release and adds additional features on top to validate changes before they are pushed to our internal testing branches and then upstreamed to Microsoft/Onnxruntime:main. Right now the ONNX Runtime CI is testing against a ROCm release that is a few versions behind.