typo: encoder-encoder -> encoder-decoder
#1 by dleve123 - opened
BART is an encoder-decoder, not an encoder-encoder.
It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes.
(https://arxiv.org/abs/1910.13461)
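For anyone double-checking, this is also visible from the model itself; a minimal sketch, assuming the transformers library and the facebook/bart-base checkpoint:

```python
from transformers import BartConfig, BartModel

# BART is an encoder-decoder: a bidirectional encoder (BERT-like)
# feeding a left-to-right autoregressive decoder (GPT-like).
config = BartConfig.from_pretrained("facebook/bart-base")
print(config.is_encoder_decoder)  # True

model = BartModel.from_pretrained("facebook/bart-base")
print(type(model.get_encoder()).__name__)  # BartEncoder
print(type(model.get_decoder()).__name__)  # BartDecoder
```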
Great, thanks!
patrickvonplaten changed pull request status to merged