Originally posted by Tyx-main, December 31, 2024: Could rapidocr_onnxruntime add support for NPU? At the moment it only seems to support CPU and CUDA.
Hi, I think this is what you are looking for. In onnxruntime, the NPU is exposed through the CANNExecutionProvider rather than as a provider named "npu" directly. For details, see: https://onnxruntime.ai/docs/execution-providers/community-maintained/CANN-ExecutionProvider.html. I wrote a minimal example of running onnxruntime on the NPU, and it runs successfully:
```python
import onnxruntime as ort
import numpy as np

# Load the ONNX model on the NPU via the CANN execution provider
options = ort.SessionOptions()
session = ort.InferenceSession(
    '/app/resources/det.onnx',
    sess_options=options,
    providers=[
        (
            "CANNExecutionProvider",
            {
                "device_id": 0,
                "arena_extend_strategy": "kNextPowerOfTwo",
                "npu_mem_limit": 20 * 1024 * 1024 * 1024,
                "op_select_impl_mode": "high_performance",
                "optypelist_for_implmode": "Gelu",
                "enable_cann_graph": True,
            },
        ),
    ],
)

# Prepare a random input matching the model's expected shape
image = np.random.rand(1, 3, 960, 608).astype(np.float32)
input_name = session.get_inputs()[0].name
inputs = {input_name: image}

# Run inference
outputs = session.run(None, inputs)

# Print the results
print(outputs)
```
The ONNX weights are attached: det.zip
I have also provided an onnxruntime-cann wheel built against CANN 8.0. After installing this package, you need to uninstall onnxruntime: onnxruntime_cann-1.21.0-cp310-cp310-linux_aarch64.zip
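A sketch of the install steps above, assuming the attached zip unpacks to a `.whl` of the same name (the exact contents of the archive are an assumption):

```shell
# Uninstall the stock onnxruntime first so the CANN build takes its place
pip uninstall -y onnxruntime

# Extract and install the CANN-enabled wheel from the attached zip
unzip onnxruntime_cann-1.21.0-cp310-cp310-linux_aarch64.zip
pip install onnxruntime_cann-1.21.0-cp310-cp310-linux_aarch64.whl

# Verify that CANNExecutionProvider is now available
python -c "import onnxruntime as ort; print(ort.get_available_providers())"
```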
Discussed in #313