```shell
git clone https://github.com/rednote-hilab/dots.ocr.git
cd dots.ocr

# Install pytorch, see https://pytorch.org/get-started/previous-versions/ for your cuda version
# cuda 12.8
pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu128
# cuda 11.8
# pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu118

# Install vllm (default wheels are built for cuda 12.8); for cuda 11.8, use the extra index below
# pip install vllm==0.9.1 --extra-index-url https://download.pytorch.org/whl/cu118

# Install vllm==0.10.0 and flash_attn==2.8.2, both built for cuda 12.8; verified on an NVIDIA RTX 5090
# see https://github.com/Dao-AILab/flash-attention/releases for flash_attn and https://github.com/vllm-project/vllm/releases for vllm
pip install torch==2.7.1 torchvision==0.22.0 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cu128
pip install vllm==0.10.0 --extra-index-url https://download.pytorch.org/whl/cu128
pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.2/flash_attn-2.8.2+cu12torch2.7cxx11abiFALSE-cp312-cp312-linux_x86_64.whl

# Install the remaining packages; first comment out the torch/vllm/flash_attn entries already installed above so they are not overwritten
pip install -e .
```
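The flash-attn wheel pinned above encodes its build constraints directly in the filename: CUDA generation, torch version, C++ ABI flag, Python tag, and platform all have to match your environment. As a minimal sketch (the helper name is hypothetical, not part of dots.ocr; stdlib only), this shows which tags the pinned wheel expects:

```python
def parse_flash_attn_wheel(filename: str) -> dict:
    """Split a flash-attn wheel filename into its compatibility tags.

    Example input (the wheel pinned in the install commands above):
    flash_attn-2.8.2+cu12torch2.7cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
    """
    stem = filename[: -len(".whl")]
    # Wheel filenames are dash-separated: name-version-pythontag-abitag-platform
    name, version_local, python_tag, abi_tag, platform_tag = stem.split("-")
    version, _, local = version_local.partition("+")
    return {
        "package": name,
        "version": version,
        "local": local,            # encodes cuda 12.x + torch 2.7 + cxx11 ABI
        "python_tag": python_tag,  # must match your interpreter (cp312 = Python 3.12)
        "abi_tag": abi_tag,
        "platform": platform_tag,  # linux x86_64 only
    }


info = parse_flash_attn_wheel(
    "flash_attn-2.8.2+cu12torch2.7cxx11abiFALSE-cp312-cp312-linux_x86_64.whl"
)
print(info["python_tag"], info["local"])  # cp312 cu12torch2.7cxx11abiFALSE
```

If your Python, torch, or CUDA version differs, pick a matching wheel from the flash-attention releases page linked above rather than this exact URL.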

If you have trouble with the installation, try our [Docker Image](https://hub.docker.com/r/rednotehilab/dots.ocr) for an easier setup, and follow these steps: