Error found when running the code #3
I found that this error was related to "nvinferserver(Triton)".
After that I ran into typo errors such as "lable.txt doesnt exists"; I fixed those and now it works. If you want to use "nvinferserver(Triton)", someone else can probably help us.
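For the label-file typos mentioned above, here is a hedged sketch of a check; the configs/ directory and the labelfile-path / labelfile_path key names are assumptions for illustration, not details confirmed in this thread:
# Assumption: label files are referenced by a labelfile-path (or labelfile_path) key
# in files under configs/. Print any referenced label file that is missing on disk.
grep -rlE 'labelfile[-_]path' configs/ 2>/dev/null | while read -r cfg; do
  dir=$(dirname "$cfg")
  grep -oE 'labelfile[-_]path *[:=] *"?[^" ]+' "$cfg" | sed -E 's/.*[:=] *"?//' |
  while read -r lbl; do
    [ -f "$dir/$lbl" ] || [ -f "$lbl" ] || echo "missing label file: $lbl (referenced in $cfg)"
  done
done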
@lmw0320 Check this:
Thanks for your reply. I ran the code under DeepStream 6.1.1 and hit a new error when running the command ./apps/deepstream-parallel-infer/deepstream-parallel-infer -c configs/apps/bodypose_yolo_lpr/source4_1080p_dec_parallel_infer.yml:
ERROR: infer_trtis_server.cpp:1052 Triton: failed to load model yolov4, triton_err_str:Invalid argument, err_msg:load failed for model 'yolov4': version 1: Unavailable: unable to find '/opt/nvidia/deepstream/deepstream-6.1/sources/apps/sample_apps/deepstream_parallel_inference_app/tritonserver/models/yolov4/1/yolov4_-1_3_416_416_dynamic.onnx_b32_gpu0.engine' for model instance 'yolov4_0'
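This error means Triton could not find the serialized TensorRT engine that the yolov4 model entry points to. As a hedged sketch (the engine path is copied from the error message above, and build_engine.sh is the script from the README steps quoted later in this thread), you can confirm the file exists before launching the app:
# Path copied from the error message above; the engine is normally produced by ./build_engine.sh.
ENGINE=/opt/nvidia/deepstream/deepstream-6.1/sources/apps/sample_apps/deepstream_parallel_inference_app/tritonserver/models/yolov4/1/yolov4_-1_3_416_416_dynamic.onnx_b32_gpu0.engine
ls -lh "$ENGINE" || echo "engine missing: re-run ./build_engine.sh and check its output for errors"
If the file is missing, the engine build step likely failed or wrote the engine to a different location.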
Have you solved this problem? I have the same problem.
Did you install the nvinferserver plugin?
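A quick, generic way to confirm the GStreamer element is actually available (this is a general GStreamer check, not a step from this repo's README):
# Prints the plugin details if nvinferserver is installed; prints
# "No such element or plugin 'nvinferserver'" if DeepStream's Triton support is missing.
gst-inspect-1.0 nvinferserver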
I just followed the command steps shown in the README:
git lfs install --skip-repo
git clone https://github.com/NVIDIA-AI-IOT/deepstream_parallel_inference_app.git
./build_engine.sh
cd tritonclient/sample/
source build.sh
./apps/deepstream-parallel-infer/deepstream-parallel-infer -c configs/apps/bodypose_yolo_lpr/source4_1080p_dec_parallel_infer.yml
I can run all the commands successfully except the last one; the errors are shown below:
[error output attached as a screenshot; not reproduced in the text]
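If the failure matches the engine-not-found error quoted earlier in this thread, a hedged pre-flight check before the last command could look like the following; the tritonserver/models/<model>/<version>/ layout is assumed from the path in that error message, not confirmed here:
# Assumption: each model keeps its TensorRT engine under tritonserver/models/<name>/<version>/.
# Report any model version directory that has no *.engine file yet.
find tritonserver/models -mindepth 2 -maxdepth 2 -type d | while read -r ver; do
  ls "$ver"/*.engine >/dev/null 2>&1 || echo "no engine file in: $ver"
done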