Latest Wheel Packages
November 22, 2021
Paddle-Serving-Server (x86 CPU/GPU)
Binary Package
Most users do not need to read this section. However, if you deploy Paddle Serving on a machine without network access, you will find that the binary executable tar file cannot be downloaded. Therefore, we provide the download links for the various environments here.
How to setup SERVING_BIN offline?
- Download the serving server whl package and the binary (bin) package, and make sure they are built for the same environment.
- Download the serving client whl and serving app whl, paying attention to the Python version.
- `pip install` the wheels and `tar xf` the binary package, then `export SERVING_BIN=$PWD/serving-gpu-cuda11-0.0.0/serving` (taking CUDA 11 as the example).
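Putting the steps above together, a minimal offline-setup session might look like the following sketch. The wheel and tarball filenames are illustrative; substitute the exact files you downloaded for your environment.

```shell
# Assuming the wheels and the matching binary tarball were already copied
# to this machine, the install steps would be (filenames are illustrative):
#
#   pip install paddle_serving_server_gpu-*.whl \
#               paddle_serving_client-*.whl \
#               paddle_serving_app-*.whl
#   tar xf serving-gpu-cuda11-0.0.0.tar.gz
#
# Point Serving at the unpacked executable (CUDA 11 as the example):
export SERVING_BIN=$PWD/serving-gpu-cuda11-0.0.0/serving
echo "$SERVING_BIN"
```

Setting `SERVING_BIN` tells the Python server wrapper to use the local executable instead of trying to download one.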
paddle-serving-client
paddle-serving-app
| | develop whl | stable whl |
|---|---|---|
| Python3 | paddle_serving_app-0.0.0-py3-none-any.whl | paddle_serving_app-0.7.0-py3-none-any.whl |
Baidu Kunlun Users
Kunlun users on ARM-XPU or x86-XPU can download the wheel packages below. Users should use the xpu-beta docker image (see DOCKER IMAGES). Only Python 3.6 is supported for Kunlun users.
Wheel Package Links
For ARM Kunlun users:
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_server_xpu-0.7.0.post2-cp36-cp36m-linux_aarch64.whl
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_client-0.7.0-cp36-cp36m-linux_aarch64.whl
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_app-0.7.0-cp36-cp36m-linux_aarch64.whl
For x86 Kunlun users:
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_server_xpu-0.7.0.post2-cp36-cp36m-linux_x86_64.whl
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_client-0.7.0-cp36-cp36m-linux_x86_64.whl
https://paddle-serving.bj.bcebos.com/whl/xpu/0.7.0/paddle_serving_app-0.7.0-cp36-cp36m-linux_x86_64.whl
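As a quick sanity check before installing, the host architecture tells you which set of links above applies. This is an illustrative sketch, not part of the official instructions:

```shell
# Pick the wheel set matching the host architecture: `uname -m` prints
# aarch64 on ARM Kunlun hosts and x86_64 on x86 hosts, matching the
# linux_aarch64 / linux_x86_64 suffixes in the wheel filenames above.
# Remember the wheels are built for Python 3.6 only (the cp36-cp36m tag).
ARCH=$(uname -m)
echo "install the *-linux_${ARCH}.whl packages listed above with pip3.6"
```

pip refuses to install a wheel whose platform or Python tag does not match the interpreter, so checking the architecture and Python version first avoids a confusing "not a supported wheel on this platform" error.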