typo: encoder-encoder -> encoder-decoder
#1
by dleve123 - opened
BART is an encoder-decoder, not an encoder-encoder.
It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes.
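For reference, the encoder-decoder split is visible directly in `transformers`. A minimal sketch, assuming the `facebook/bart-large` checkpoint is available:

```python
from transformers import BartModel

model = BartModel.from_pretrained("facebook/bart-large")

# BART pairs a bidirectional encoder (BERT-like) with a
# left-to-right autoregressive decoder (GPT-like).
print(type(model.get_encoder()).__name__)  # BartEncoder
print(type(model.get_decoder()).__name__)  # BartDecoder
```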
for @patrickvonplaten probably?
Same doc fix for base model is here: https://huggingface.co./facebook/bart-base/discussions/1
Thanks!
patrickvonplaten changed pull request status to merged