diff --git a/README.md b/README.md
index e982aec..615ed49 100644
--- a/README.md
+++ b/README.md
@@ -106,7 +106,6 @@
- [Multimodal Live Streaming](#multimodal-live-streaming)
- [Inference on Multiple GPUs](#inference-on-multiple-gpus)
- [Inference on Mac](#inference-on-mac)
- - [Deployment on Mobile Phone](#deployment-on-mobile-phone)
- [Efficient Inference with llama.cpp, ollama, vLLM](#efficient-inference-with-llamacpp-ollama-vllm)
- [Fine-tuning](#fine-tuning)
- [FAQs](#faqs)
@@ -2372,8 +2371,6 @@ PYTORCH_ENABLE_MPS_FALLBACK=1 python test.py
```
-### Deployment on Mobile Phone
-MiniCPM-V 2.0 can be deployed on mobile phones with Android operating systems. 🚀 Click [MiniCPM-V 2.0](https://github.com/OpenBMB/mlc-MiniCPM) to install apk.
### Efficient Inference with llama.cpp, ollama, vLLM
diff --git a/README_zh.md b/README_zh.md
index 6440de1..78b6d23 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -90,7 +90,6 @@
- [多模态流式交互](#多模态流式交互)
- [多卡推理](#多卡推理)
- [Mac 推理](#mac-推理)
- - [手机端部署](#手机端部署)
- [基于 llama.cpp、ollama、vLLM 的高效推理](#基于-llamacppollamavllm-的高效推理)
- [微调](#微调)
- [FAQs](#faqs)
@@ -2353,10 +2352,6 @@ PYTORCH_ENABLE_MPS_FALLBACK=1 python test.py
-### 手机端部署
-MiniCPM-V 2.0 可运行在Android手机上,点击[MiniCPM-V 2.0](https://github.com/OpenBMB/mlc-MiniCPM)安装apk使用;
-
-
### 基于 llama.cpp、ollama、vLLM 的高效推理
llama.cpp 用法请参考[我们的fork llama.cpp](https://github.com/OpenBMB/llama.cpp/tree/minicpmv-main/examples/llava/README-minicpmv2.6.md), 在iPad上可以支持 16~18 token/s 的流畅推理(测试环境:iPad Pro + M4)。
diff --git a/docs/minicpm_v2.md b/docs/minicpm_v2.md
index 9dcb5a0..777b7bb 100644
--- a/docs/minicpm_v2.md
+++ b/docs/minicpm_v2.md
@@ -292,3 +292,8 @@ We deploy MiniCPM-V 2.0 on end devices. The demo video is the raw screen recordi
|:-----------|:--:|:-----------:|:-------------------|:---------------:|
| MiniCPM-V 2.0 | GPU | 8 GB | Light version, balancing performance and computation cost. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2) [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2) |
| MiniCPM-V 1.0 | GPU | 7 GB | Lightest version, achieving the fastest inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-V) [](https://modelscope.cn/models/OpenBMB/MiniCPM-V) |
+
+
+### Deployment on Mobile Phone
+
+MiniCPM-V 2.0 can be deployed on mobile phones running Android. 🚀 Click [MiniCPM-V 2.0](https://github.com/OpenBMB/mlc-MiniCPM) to install the APK.
\ No newline at end of file