
paddle_inference.dll never works on windows 10 #72377


Open
SpaceView opened this issue Apr 21, 2025 · 2 comments

SpaceView commented Apr 21, 2025

Describe the Bug

On Windows 10 (a Legion laptop, Intel(R) Core(TM) i9-14900HX 2.20 GHz, 64 GB RAM), the Python version runs fine. I want to deploy with C++, but no matter which version of paddle_inference I use, it does not work.
I have tried both VS2022 and VS2019: VS2019 throws a "bad allocation" error; VS2022 does not, but still fails.
In config.SetModel, the result is the same whether I pass const char*, char[], or std::string.

For example, I tried:
paddle_inference_CUDA12.3_cuDNN9.0_TensorRT8.6_MKL
paddle_inference_CUDA11.8_cuDNN8.6_TensorRT8.5_MKL
I wrote a very simple test, as follows:

#include <include/ocr_det.h>
#include <paddle_inference_api.h>
#include <iostream>
#include <string>

int minitest() {
    paddle_infer::Config config;
    std::string str1 = "D:/aiModels/paddle/PP-OCRv3/en_PP-OCRv3_det_infer/inference.pdmodel";
    std::string str2 = "D:/aiModels/paddle/PP-OCRv3/en_PP-OCRv3_det_infer/inference.pdiparams";
    try {
        config.SetModel(str1.c_str(), str2.c_str());
    }
    catch (const std::exception& e) {  // catch by const reference, not by value
        std::cout << e.what() << std::endl;
    }

    if (config.model_dir().empty()) {  // check whether the model was set successfully
        std::cerr << "Error: Model path not set correctly!" << std::endl;
        std::cerr << "Attempted paths:\n"
                  << "Model: " << str1 << "\n"
                  << "Params: " << str2 << std::endl;
        return -1;
    }

    std::cout << "Model Dir: " << config.model_dir() << std::endl;
    std::cout << "Params File: " << config.params_file() << std::endl;

    auto predictor = paddle_infer::CreatePredictor(config);
    if (!predictor) {
        std::cerr << "Failed to create predictor!" << std::endl;
        return -1;
    }
    std::cout << "Success!" << std::endl;
    return 0;
}
The output is as follows:
Model Dir: 烫烫烫烫
Params File: (crashes right here)
It seems config.SetModel(str1.c_str(), str2.c_str()); can never complete normally; config.SetModel(str1, str2); and the like behave the same.
I tried to rebuild from source with C++ myself, but there were far too many compile errors and the build failed. Does anyone have a good suggestion? I just want to deploy with C++.

Additional Supplementary Information

No response


EmmonsCurse commented Apr 21, 2025

The error may be caused by the C++ inference library missing some dependencies or failing to load. Please confirm that the C++ inference library version exactly matches the Python one,
i.e. install the Python package with: python -m pip install paddlepaddle-gpu==3.0.0 -i https://www.paddlepaddle.org.cn/packages/stable/cu118/

and use this C++ inference library: https://paddle-qa.bj.bcebos.com/paddle-pipeline/Release-Debug-TagBuild-Infer-Windows-Gpu-Cuda11.8-Cudnn8.9-Trt8.6-Mkl-Avx-VS2019-SelfBuiltPypiUse/latest/paddle_inference.zip

Please give these two a try.

@raoyutian

The C++ 3.0 CPU build runs inference fine on my Windows 10 machine. But the 3.0 GPU build cannot run at all on the same machine; it errors out immediately, saying it is not suitable for this system.
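Since the CPU build reportedly works while the GPU build does not, one possible workaround sketch is to force CPU execution in the same Config (a fragment, not a complete program; the method names are from the Paddle Inference C++ `paddle_infer::Config` API, but this has not been compiled against your setup):

```cpp
#include <paddle_inference_api.h>

paddle_infer::Config config;
config.SetModel(model_path, params_path);
config.DisableGpu();                     // run on CPU instead of CUDA
config.EnableMKLDNN();                   // optional: oneDNN acceleration on CPU
config.SetCpuMathLibraryNumThreads(4);   // CPU math library threads
auto predictor = paddle_infer::CreatePredictor(config);
```

This would at least separate "the GPU build is broken on this machine" from "the model or paths are broken".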
