The server starts up normally:

Going to Run Comand
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle_serving_server/serving-cpu-avx-mkl-0.9.0/serving -enable_model_toolkit -inferservice_path workdir_9393 -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 4 -port 9393 -precision fp32 -use_calib=False -reload_interval_s 10 -resource_path workdir_9393 -resource_file resource.prototxt -workflow_path workdir_9393 -workflow_file workflow.prototxt -bthread_concurrency 4 -max_body_size 536870912
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDetectionOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDistKVInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDistKVQuantInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralFeatureExtractOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralPicodetOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralReaderOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralRecOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralRemoteOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralResponseOp
I0100 00:00:00.000000 13001 service_manager.h:79] RAW: Service[LoadGeneralModelService] insert successfully!
I0100 00:00:00.000000 13001 load_general_model_service.pb.h:333] RAW: Success regist service[LoadGeneralModelService][PN5baidu14paddle_serving9predictor26load_general_model_service27LoadGeneralModelServiceImplE]
I0100 00:00:00.000000 13001 service_manager.h:79] RAW: Service[GeneralModelService] insert successfully!
I0100 00:00:00.000000 13001 general_model_service.pb.h:1650] RAW: Success regist service[GeneralModelService][PN5baidu14paddle_serving9predictor13general_model23GeneralModelServiceImplE]
I0100 00:00:00.000000 13001 factory.h:155] RAW: Succ insert one factory, tag: PADDLE_INFER, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 13001 paddle_engine.cpp:34] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine->::baidu::paddle_serving::predictor::InferEngine, tag: PADDLE_INFER in macro!
--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_graph_clean_pass]
--- Running analysis [ir_analysis_pass]
--- Running analysis [ir_params_sync_among_devices_pass]
--- Running analysis [adjust_cudnn_workspace_size_pass]
--- Running analysis [inference_op_replace_pass]
--- Running analysis [memory_optimize_pass]
--- Running analysis [ir_graph_to_program_pass]
C++ Serving service started successfully!

When making a prediction over HTTP, an error is reported:

aistudio@jupyter-3310911-5426973:~/Serving$ curl -XPOST http://0.0.0.0:9393/GeneralModelService/inference -d '{"x":"/home/aistudio/000004.png"}'
[10.36.12.114:9393][E-5100]InferService inference failed!

Is calling over HTTPS supported?
Error log:

aistudio@jupyter-3310911-5426973:~/Paddle3D/log$ cat serving.ERROR
Log file created at: 2023/02/06 15:51:26
Running on machine: jupyter-3310911-5426973
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
E0206 15:51:26.136247 40320 general_reader_op.cpp:83] (logid=0) Failed get feed_var, var_num=0
E0206 15:51:26.136565 40320 dag_view.cpp:193] (logid=0) Execute failed, Op:Not implemented!
E0206 15:51:26.136582 40320 dag_view.cpp:166] (logid=0) Failed execute stage[TOBE IMPLEMENTED!
E0206 15:51:26.136596 40320 service.cpp:242] (logid=0) Failed execute dag for workflow:workflow1
E0206 15:51:26.136613 40320 service.cpp:194] (logid=0) Failed execute 0-th workflow in:GeneralModelService
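The "Failed get feed_var, var_num=0" line indicates the request body was not parsed into any feed variables, so GeneralReaderOp received an empty input. As a cross-check, the same endpoint can be called with the official Python client, which builds the feed tensors itself. Below is a minimal sketch, not a verified fix: the client config path, the feed name "x", the fetch variable name, and the preprocessing are assumptions that must match your exported model.

```python
# Minimal sketch: query the C++ Serving instance on port 9393 with the
# paddle_serving_client Python API instead of a raw curl request.
import cv2
from paddle_serving_client import Client

client = Client()
# Assumed path to the client-side config exported alongside the serving model.
client.load_client_config("serving_client/serving_client_conf.prototxt")
client.connect(["127.0.0.1:9393"])

# Read the image and convert to the layout the model expects
# (assumed float32 CHW here; adjust to your model's preprocessing).
img = cv2.imread("/home/aistudio/000004.png").astype("float32")
img = img.transpose((2, 0, 1))

fetch_map = client.predict(
    feed={"x": img},                            # "x" is an assumed feed var name
    fetch=["save_infer_model/scale_0.tmp_1"],   # assumed fetch var name
    batch=False,
)
print(fetch_map)
```

If the client-side call succeeds while the curl request still fails, the problem is the shape of the HTTP JSON body rather than the server itself.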