docs(build): add more example (#2378)
* docs(build): add more example

* Update linux-x86_64.md

* Update linux-x86_64.md

* docs(format): update install doc

* Update linux-x86_64.md

* Update linux-x86_64.md

* Update lint.yml

* fix lint

---------

Co-authored-by: RunningLeon <mnsheng@yeah.net>
tpoisonooo and RunningLeon authored Sep 5, 2023
1 parent 58db0ad commit 468c423
Showing 3 changed files with 58 additions and 2 deletions.
4 changes: 3 additions & 1 deletion .github/workflows/lint.yml
```diff
@@ -16,7 +16,9 @@ jobs:
           python -m pip install pre-commit
           pre-commit install
       - name: Linting
-        run: pre-commit run --all-files
+        run: |
+          pre-commit run --all-files
+          git diff
       - name: Format c/cuda codes with clang-format
         uses: DoozyX/clang-format-lint-action@v0.11
         with:
```
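
To reproduce this Linting step locally before pushing, the sketch below simply mirrors the commands from the updated workflow; it assumes a working Python environment and the repository's existing pre-commit configuration.

```Bash
# Local sketch of the updated Linting step (assumes Python and the repo's
# pre-commit configuration are available).
python -m pip install pre-commit
pre-commit install

# Run every hook on all files, then print whatever the hooks modified,
# which is the same information the new `git diff` step surfaces in CI.
pre-commit run --all-files
git diff
```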
27 changes: 27 additions & 0 deletions docs/en/01-how-to-build/linux-x86_64.md
@@ -395,3 +395,30 @@ You can also activate other engines after the model.

make -j$(nproc) && make install
```

- cuda + TensorRT + onnxruntime + openvino + ncnn

If the [ncnn auto-install script](../../../tools/scripts/build_ubuntu_x64_ncnn.py) is used, protobuf is installed to mmdeploy-dep/pbinstall, located in the same parent directory as mmdeploy.

```Bash
export PROTO_DIR=/path/to/mmdeploy-dep/pbinstall
cmake .. \
-DCMAKE_CXX_COMPILER=g++-7 \
-DMMDEPLOY_BUILD_SDK=ON \
-DMMDEPLOY_BUILD_EXAMPLES=ON \
-DMMDEPLOY_BUILD_SDK_PYTHON_API=ON \
-DMMDEPLOY_TARGET_DEVICES="cuda;cpu" \
-DMMDEPLOY_TARGET_BACKENDS="trt;ort;ncnn;openvino" \
-Dpplcv_DIR=${PPLCV_DIR}/cuda-build/install/lib/cmake/ppl \
-DTENSORRT_DIR=${TENSORRT_DIR} \
-DCUDNN_DIR=${CUDNN_DIR} \
-DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR} \
-DInferenceEngine_DIR=${OPENVINO_DIR}/runtime/cmake \
-Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn \
-DProtobuf_LIBRARIES=${PROTO_DIR}/lib/libprotobuf.so \
-DProtobuf_PROTOC_EXECUTABLE=${PROTO_DIR}/bin/protoc \
    -DProtobuf_INCLUDE_DIR=${PROTO_DIR}/include
```

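
The configure command above assumes that PPLCV_DIR, TENSORRT_DIR, CUDNN_DIR, ONNXRUNTIME_DIR, OPENVINO_DIR and NCNN_DIR have already been exported. A minimal sketch of the surrounding steps is shown below; every path is a placeholder to be replaced with your own install locations.

```Bash
# Placeholder paths only; point them at your actual install prefixes.
export PPLCV_DIR=/path/to/ppl.cv
export TENSORRT_DIR=/path/to/TensorRT
export CUDNN_DIR=/path/to/cudnn
export ONNXRUNTIME_DIR=/path/to/onnxruntime
export OPENVINO_DIR=/path/to/openvino
export NCNN_DIR=/path/to/ncnn
export PROTO_DIR=/path/to/mmdeploy-dep/pbinstall

# Configure from a clean build directory inside the mmdeploy checkout,
# then build and install the SDK as in the other examples.
cd /path/to/mmdeploy
mkdir -p build && cd build
# ... run the cmake command shown above from this directory ...
make -j$(nproc) && make install
```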
29 changes: 28 additions & 1 deletion docs/zh_cn/01-how-to-build/linux-x86_64.md
```diff
@@ -335,7 +335,7 @@ mim install -e .

 #### Build SDK and Demos

-The following shows two examples of building the SDK, using ONNXRuntime and TensorRT as the inference engine respectively. You can refer to them and enable other inference engines.
+The following shows several examples of building the SDK. You can refer to them and enable other inference engines.

 - cpu + ONNXRuntime

```
@@ -390,3 +390,30 @@ mim install -e .

make -j$(nproc) && make install
```

- cuda + TensorRT + onnxruntime + openvino + ncnn

If the [ncnn auto-install script](../../../tools/scripts/build_ubuntu_x64_ncnn.py) is used, protobuf is installed to mmdeploy-dep/pbinstall, located in the same parent directory as mmdeploy.

```Bash
export PROTO_DIR=/path/to/mmdeploy-dep/pbinstall
cmake .. \
-DCMAKE_CXX_COMPILER=g++-7 \
-DMMDEPLOY_BUILD_SDK=ON \
-DMMDEPLOY_BUILD_EXAMPLES=ON \
-DMMDEPLOY_BUILD_SDK_PYTHON_API=ON \
-DMMDEPLOY_TARGET_DEVICES="cuda;cpu" \
-DMMDEPLOY_TARGET_BACKENDS="trt;ort;ncnn;openvino" \
-Dpplcv_DIR=${PPLCV_DIR}/cuda-build/install/lib/cmake/ppl \
-DTENSORRT_DIR=${TENSORRT_DIR} \
-DCUDNN_DIR=${CUDNN_DIR} \
-DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR} \
-DInferenceEngine_DIR=${OPENVINO_DIR}/runtime/cmake \
-Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn \
-DProtobuf_LIBRARIES=${PROTO_DIR}/lib/libprotobuf.so \
-DProtobuf_PROTOC_EXECUTABLE=${PROTO_DIR}/bin/protoc \
    -DProtobuf_INCLUDE_DIR=${PROTO_DIR}/include
```

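
Before configuring, an optional sanity check (a sketch, assuming protobuf was installed by the auto-install script into the path below) is to confirm that the files referenced by the -DProtobuf_* options actually exist:

```Bash
# Optional sanity check for the protobuf artifacts the cmake flags point at.
export PROTO_DIR=/path/to/mmdeploy-dep/pbinstall

${PROTO_DIR}/bin/protoc --version     # protoc executable runs
ls ${PROTO_DIR}/lib/libprotobuf.so    # shared library is present
ls ${PROTO_DIR}/include               # headers are installed
```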
