PyTorch implementation of SELF-ATTENTIVE VAD | Paper | Dataset
Yong Rae Jo, Youngki Moon, Won Ik Cho, and Geun Sik Jo
Voithru Inc., Inha University, Seoul National University.
2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Recent voice activity detection (VAD) schemes have aimed at leveraging decent neural architectures, but few have succeeded in applying the attention network due to its high reliance on the encoder-decoder framework. This has often left the resulting systems highly dependent on recurrent neural networks, which are costly and sometimes less context-sensitive considering the scale and properties of acoustic frames. To cope with this issue with the self-attention mechanism and achieve a simple, powerful, and environment-robust VAD, we first adopt the self-attention architecture in building up the modules for voice detection and boosted prediction. Our model surpasses the previous neural architectures under low signal-to-noise ratio and noisy real-world scenarios, while also displaying robustness to the noise types. We make the test labels on movie data publicly available for fair comparison and future progress.
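The core idea is a self-attention (Transformer-style) encoder that classifies each acoustic frame as speech or non-speech without relying on a recurrent encoder-decoder. The snippet below is a minimal sketch of that setup using a stock PyTorch nn.TransformerEncoder; the class name, feature dimension, and layer sizes are illustrative assumptions, not the repository's actual model (see the source code for the real architecture, including the boosted-prediction module).

import torch
import torch.nn as nn

class SelfAttentiveVAD(nn.Module):
    """Sketch of a frame-level self-attentive VAD classifier (illustrative only)."""

    def __init__(self, feature_dim=40, model_dim=128, num_heads=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(feature_dim, model_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=model_dim, nhead=num_heads, dim_feedforward=4 * model_dim,
            dropout=0.1, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.classifier = nn.Linear(model_dim, 1)  # per-frame speech/non-speech logit

    def forward(self, features):
        # features: (batch, frames, feature_dim) acoustic features, e.g. MFCCs
        x = self.input_proj(features)
        x = self.encoder(x)                    # self-attention over acoustic frames
        return self.classifier(x).squeeze(-1)  # (batch, frames) logits

if __name__ == "__main__":
    model = SelfAttentiveVAD()
    dummy = torch.randn(2, 100, 40)            # 2 utterances, 100 frames, 40-dim features
    probs = torch.sigmoid(model(dummy))        # frame-level speech probabilities
    print(probs.shape)                         # torch.Size([2, 100])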
$ git clone https://github.com/voithru/voice-activity-detection.git
$ cd voice-activity-detection
$ pip install -r requirements.txt
$ python main.py --help
$ python main.py train --help
Usage: main.py train [OPTIONS] CONFIG_PATH
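For example (the config path below is a placeholder; point it at a training configuration from the repository):
$ python main.py train path/to/train_config.yaml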
$ python main.py evaluate --help
Usage: main.py evaluate [OPTIONS] EVAL_PATH CHECKPOINT_PATH
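For example (both paths are placeholders for your evaluation data and a trained checkpoint):
$ python main.py evaluate path/to/eval_data path/to/checkpoint.pth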
$ python main.py predict --help
Usage: main.py predict [OPTIONS] AUDIO_PATH CHECKPOINT_PATH
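For example (placeholder paths; pass an audio file and a trained checkpoint):
$ python main.py predict path/to/audio.wav path/to/checkpoint.pth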
Figure. Test result - NOISEX-92
Figure. Test result - Real-world audio dataset
@INPROCEEDINGS{9413961,
author={Jo, Yong Rae and Moon, Youngki and Cho, Won Ik and Jo, Geun Sik},
booktitle={ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
title={Self-Attentive VAD: Context-Aware Detection of Voice from Noise},
year={2021},
volume={},
number={},
pages={6808-6812},
doi={10.1109/ICASSP39728.2021.9413961}}