SAR_TED
- Add a self-attention layer after the RNN; the model is accordingly renamed "Self-Attention RNN" (SAR)
- Modify the negative-cut (neg-cut) strategy: station-date pairs with high association counts are down-weighted, making the strategy more suitable for intense sequences
- Add a glitch-removal algorithm for higher stability
- Tune default parameters; the detection count should be roughly 2.75x that of PAL
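
The first change (self-attention applied over RNN hidden states) can be sketched as below. This is a minimal illustration of scaled dot-product self-attention, not the repository's actual implementation: the function names are hypothetical, and using the hidden states directly as queries, keys, and values (no learned projections) is an assumption made to keep the sketch small.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(h):
    """Scaled dot-product self-attention over RNN hidden states.

    h: list of T hidden-state vectors (each a list of d floats).
    Returns T context vectors of the same shape. Queries, keys, and
    values are the hidden states themselves (assumption for brevity;
    a real layer would use learned projections).
    """
    d = len(h[0])
    scale = math.sqrt(d)
    out = []
    for q in h:
        # Attention scores of this time step against all time steps.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / scale for k in h]
        w = softmax(scores)
        # Context vector: attention-weighted sum of all hidden states.
        ctx = [sum(w[j] * h[j][i] for j in range(len(h))) for i in range(d)]
        out.append(ctx)
    return out
```

In the described architecture, these context vectors would feed the detection head in place of (or concatenated with) the raw RNN outputs, letting each time step weigh evidence from the whole sequence.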