
<img src="assets/demo.gif" />

+## News
+- (2020.11.28) Docker container is now supported on Ubuntu 18.04!
+
## Description
Fast MOT is a multiple object tracker that implements:
- YOLO detector
@@ -36,37 +39,28 @@ This means even though the tracker runs much faster, it is still highly accurate |
- PyCuda
- Numpy >= 1.15
- Scipy >= 1.5
-- TensorFlow <= 1.15.2 (for SSD support)
-- Numba >= 0.48
+- TensorFlow < 2.0 (for SSD support)
+- Numba == 0.48
- cython-bbox

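If you prefer to set up the Python dependencies by hand rather than through the Jetson script or the Docker image below, a rough sketch with pip follows; the pins mirror the list above, the Python 3.6/3.7 assumption comes from the TensorFlow 1.x wheels, and `cython-bbox` is installed second only because it needs Cython and NumPy present at build time. Treat this as an assumption, not the project's supported install path.
```
# Manual install sketch (assumes Python 3.6/3.7 and the CUDA toolkit already set up)
$ pip3 install 'numpy>=1.15' 'scipy>=1.5' 'tensorflow<2.0' numba==0.48 pycuda cython
# cython-bbox builds against NumPy/Cython, so install it afterwards
$ pip3 install cython-bbox
```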
### Install for Jetson (TX2/Xavier NX/Xavier)
-Make sure to have [JetPack 4.4](https://developer.nvidia.com/embedded/jetpack) installed and run the script
+Make sure to have [JetPack 4.4](https://developer.nvidia.com/embedded/jetpack) installed and run the script:
```
$ scripts/install_jetson.sh
```
### Install for Ubuntu 18.04
-Make sure to have [CUDA](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html), [cuDNN](https://docs.nvidia.com/deeplearning/cudnn/install-guide/index.html), and [TensorRT](https://docs.nvidia.com/deeplearning/tensorrt/install-guide/index.html#downloading) (including Python API) installed. You can optionally use my script to install from scratch
- ```
- $ scripts/install_tensorrt.sh
- ```
-Install UFF and Graph Surgeon for SSD support: https://github.com/GeekAlexis/FastMOT/issues/15#issuecomment-717045972
-
-Build OpenCV from source with GStreamer (optional). GStreamer is recommended for performance. Modify `ARCH_BIN` [here](https://github.com/GeekAlexis/FastMOT/blob/0e9cb21cef5e36b1b9b0c41ae22adeeb110166bc/scripts/install_opencv.sh#L4) to match your [GPU compute capability](https://developer.nvidia.com/cuda-gpus#compute)
- ```
- $ scripts/install_opencv.sh
- ```
-
-Install Python dependencies
+Make sure to have [nvidia-docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#docker) installed. The image requires an NVIDIA Driver version >= 450. Build and run the docker image:
```
- $ pip3 install -r requirements.txt
+ $ docker build -t fastmot:latest .
+ $ docker run --rm --gpus all -it -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=unix$DISPLAY fastmot:latest
```
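The X11 socket mount and `DISPLAY` variable in the run command above are only needed if you want the visualization window to show on the host display; on a typical X11 desktop you may also have to let local clients such as the container access the X server first. This is a hedged example, not part of the official setup:
```
# Allow local (non-network) clients, including the container, to use the X server
$ xhost +local:
```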
### Download models
This includes the pretrained OSNet and SSD models, as well as my custom YOLOv4 ONNX model
```
$ scripts/download_models.sh
```
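To sanity-check the download, you can list the model files afterwards; the `fastmot/models` destination is an assumption based on the repo layout, so confirm the actual path in `scripts/download_models.sh`:
```
# Expect the OSNet, SSD, and YOLOv4 weight files here (path assumed, see the download script)
$ ls fastmot/models
```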
### Build YOLOv4 TensorRT plugin
+Modify `compute` [here](https://github.com/GeekAlexis/FastMOT/blob/2296fe414ca6a9515accb02ff88e8aa563ed2a05/fastmot/plugins/Makefile#L21) to match your [GPU compute capability](https://developer.nvidia.com/cuda-gpus#compute)
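If you are unsure of the capability value for your device, one way to query it is through PyCuda, which is already in the dependency list; this is just a convenience sketch:
```
# Prints e.g. (7, 5) for compute capability 7.5
$ python3 -c "import pycuda.autoinit, pycuda.driver as cuda; print(cuda.Device(0).compute_capability())"
```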
```
$ cd fastmot/plugins
$ make
```