I'm getting the following error when trying to load an ONNX model with onnxruntime in PyCharm:
Traceback (most recent call last):
  File "D:\rubbish_bin\test_runtime.py", line 119, in <module>
    net = ort.InferenceSession(model_pb_path, providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'])
  File "F:\Anaconda3\envs\yolov5\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "F:\Anaconda3\envs\yolov5\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 395, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1029 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "F:\Anaconda3\envs\yolov5\lib\site-packages\onnxruntime\capi\onnxruntime_providers_tensorrt.dll"
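For reference, here is roughly what the loading code looks like, stripped down to a minimal sketch. The model path below is just a placeholder (my real path is different), and the get_available_providers() print is only something I added while trying to debug which providers my install actually has:

import onnxruntime as ort

model_pb_path = "model.onnx"  # placeholder path; the real model lives elsewhere

# Show which execution providers this onnxruntime build reports as available
print(ort.get_available_providers())

# This is the call that fails (line 119 in test_runtime.py)
net = ort.InferenceSession(
    model_pb_path,
    providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'],
)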
I searched around on CSDN and the consensus seems to be that a DLL cannot be found, but I still don't know how to fix it. I also referred to this blog post: https://blog.csdn.net/jacke121/article/details/84111112, but I couldn't figure out what I actually need to do.
Hoping someone here can help me out.