by Akshay Bhat
Deep Video Analytics is a platform for indexing and extracting information from videos and images. With the latest version of Docker installed correctly, you can run Deep Video Analytics locally in minutes (even without a GPU) using a single command.
For installation instructions and an overview, please visit https://www.deepvideoanalytics.com and go through the presentation.
The standalone OCR example has been moved to the /docs/experiments/ocr directory.
Deep Video Analytics implements a client-server architecture, where clients can read the state of the server via a REST API. To mutate that state (uploading and processing data, training models, and performing queries), clients send DVAPQL (Deep Video Analytics Processing and Query Language) scripts formatted as JSON. Each query represents a directed acyclic graph of operations.
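For illustration, the sketch below shows how a client might submit a DVAPQL script over the REST API using Python. The endpoint path, token authentication, operation names, and argument keys are assumptions made for this example only; consult the DVAPQL documentation at https://www.deepvideoanalytics.com for the actual schema.

```python
import json

import requests

# A minimal sketch of submitting a DVAPQL script to a running DVA server.
# The endpoint path, authentication scheme, operation names, and argument
# keys below are illustrative assumptions, not the authoritative schema.
SERVER = "http://localhost:8000"   # assumed local deployment address
TOKEN = "your-api-token"           # assumed token-based authentication

# A DVAPQL script is a JSON document describing a DAG of operations;
# here: import a video, then index its frames with a (hypothetical) model.
script = {
    "process_type": "PROCESS",
    "tasks": [
        {
            "operation": "perform_import",      # hypothetical operation name
            "arguments": {"path": "/ingest/video.mp4"},
        },
        {
            "operation": "perform_indexing",    # hypothetical operation name
            "arguments": {"index": "inception"},
        },
    ],
}

response = requests.post(
    SERVER + "/api/queries/",                   # assumed REST endpoint
    data={"script": json.dumps(script)},
    headers={"Authorization": "Token " + TOKEN},
)
response.raise_for_status()
print(response.json())
```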
Library | License |
---|---|
YAD2K | MIT License |
AdminLTE2 | MIT License |
FabricJS | MIT License |
Facenet | MIT License |
JSFeat | MIT License |
MTCNN | MIT License |
Insight Face | MIT License |
CRNN.pytorch | MIT License |
Original CRNN code by Baoguang Shi | MIT License |
Object Detector App using TF Object detection API | MIT License |
Plotly.js | MIT License |
Text Detection CTPN | MIT License |
SphereFace | MIT License |
Segment annotator | BSD 3-clause |
Youtube 8M feature extractor weights | Apache 2.0 |
LOPQ | Apache 2.0 |
Open Images Pre-trained network | Apache 2.0 |
Interval Tree | Apache 2.0 |
Library | License |
---|---|
faiss | BSD + PATENTS License |
dlib | Boost Software License |
- FFmpeg (not linked, called via a subprocess)
- Tensorflow
- OpenCV
- Numpy
- Pytorch
- Docker
- LMDB
- Nvidia-docker
- Docker-compose
- All packages in requirements.txt
- All dependencies installed in the CPU Dockerfile & GPU Dockerfile
Copyright 2016-2018, Akshay Bhat, All rights reserved.
Deep Video Analytics is nearing a stable 1.0 release, which we expect in Summer 2018. The license will be relaxed once a stable version is released. Please contact me for more information.