Simple from-scratch implementations of transformer-based models that match the state of the art.
Codebase for Transformer-XL on the Finnish language.
Implementation of Transformer-XL in TensorFlow 2.0.
A Julia-based implementation of XLNet: A Generalized Autoregressive Pretraining for Language Understanding. < Flux | JuliaText >
A search engine based on features extracted with a pretrained Transformer-XL.
Scripts for training a language model and for generating nonsense text (in Finnish).
A Google Colab notebook based on a fork of the REMI repo, for trying the REMI Pop Music Transformer quickly and easily.
Challenging Memory-based Deep Reinforcement Learning Agents
A fully working SOTA Transformer-XL music AI implementation: MahlerNet by Elias Lousseief.
A refactored version of kimiyoung/transformer-xl/tf.
A lightweight PyTorch implementation of the Transformer-XL architecture proposed by Dai et al. (2019)
Fair quantitative comparison of NLP embeddings from GloVe to RoBERTa with Sequential Bayesian Optimization fine-tuning using Flair and SentEval. Extension of HyperOpt library to log_b priors.
Custom CUDA kernels for 2D and 3D relative attention, with a PyTorch wrapper.
YAI 11 x @POZAlabs: Improving & Evaluating Music Generation with ComMU
[ACL '20] Highway Transformer: A Gated Transformer.
Symbolic music generation taking inspiration from NLP and human composition process
Music and text generation with Transformer-XL.
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge.
Google Colab (Jupyter) notebooks for creating and training SOTA music AI models and for generating music with Transformer technology (Google XLNet/Transformer-XL).