[Tracker] All the issues related to the e2e shark test suite #812
Comments
Can you update the model list links?
Could you also attach the issue links you referred to, so we know whether we cover all the model paths? Also, it seems this does not include #801, right?
@zjgarvey the model list contains the updated links only. @jinchen62 Yes, so far the report is based on the ONNX models of the e2e shark test suite.
@pdhirajkumarprasad I think it would be helpful to attach more details of the error messages. I feel like the …
Full ONNX FE tracker is at: #564
ONNX Model Zoo model tracker: #886
HF model tracker: #899
Running a model
In the alt_e2e test suite:
- Set the environment variable `CACHE_DIR` to specify where to download model artifacts.
- If debugging compilation failures with local builds of torch-mlir or IREE, please make sure the locally built tools are the ones being run by the commands (see the commands log for a test). E.g., running `which iree-compile` should point to the local build directory.
- By default, the test runner doesn't use torch-mlir directly. If you'd like to use a local build of torch-mlir, make sure `torch-mlir-opt` is on your path and use the `run.py` flag `--torchtolinalg` to enable running the frontend passes through `torch-mlir-opt`.
- Get the failing model's name and run it; a minimal sketch follows this list.
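A minimal sketch of a local run, assuming the suite's `run.py` is in the current directory; the test-selection flag (`--tests`) and the model name are placeholders, so check `python run.py --help` for the exact option names in your checkout:

```bash
# Put downloaded model artifacts in a local cache (CACHE_DIR is from the notes above).
export CACHE_DIR=/path/to/model-cache

# Make sure the locally built tools are the ones that will actually be invoked.
which iree-compile     # should print a path inside your local IREE build
which torch-mlir-opt   # needs to be on PATH when using --torchtolinalg

# Run a single failing model. "--tests ModelName" is an assumed selection flag;
# --torchtolinalg runs the frontend passes through torch-mlir-opt.
python run.py --tests ModelName --torchtolinalg
```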
After running the test, the `test-run/ModelName/detail/` directory should contain detailed error logs for stage failures. To rerun locally, you can copy and paste the corresponding script from the `test-run/ModelName/commands/` directory, as in the sketch below.
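A short sketch of inspecting the failure logs and rerunning the saved commands, assuming the default `test-run` output directory and using `ModelName` as a placeholder; the exact file names inside `detail/` and `commands/` depend on the stage that failed:

```bash
# Per-stage error logs for the failing model (ModelName is a placeholder).
ls test-run/ModelName/detail/

# The runner saves the exact commands it executed; inspect them and
# copy/paste (or source) the script for the stage you want to rerun.
cat test-run/ModelName/commands/*
```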
For onnx/models/
CPU Compilation Failures
Last updated based on run https://github.com/nod-ai/e2eshark-reports/blob/main/2025-02-12/ci_reports_onnx/llvm-cpu/combined-reports/summary.md
- `version >= version_range.first && version <= version_range.second` failed: Warning: invalid version
- `g.get() != nullptr` failed: Warning: onnx version converter is unable to parse input model

import and setup failures
setup failures:
import failures:
After triage, add to the table and assign to one of:
- iree-compile (IREE project tracker: https://github.com/orgs/iree-org/projects/8/views/3)
- iree runtime
- numerics
- IREE EP only issues
iree-compile fails with `ElementsAttr does not provide iteration facilities for type 'mlir::Attribute'` on int8 models at the `QuantizeLinear` op
low priority
- Issue #828: Turbine Camp
- Issue #797: Ops not in model