[InCoder](https://huggingface.co./facebook/incoder-6B) uses a decoder-only Transformer trained with the [causal masking objective](https://arxiv.org/abs/2201.07520): a left-to-right language model learns to fill in masked token segments.
| Model | # parameters |
| - | - |
| Decoder | 6.7B |
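
Under causal masking, a span is cut out of the document and moved to the end, with sentinel tokens marking where it came from, so an ordinary left-to-right model can be trained to regenerate it. A minimal sketch of that sequence transformation (hedged: the sentinel spellings `<|mask:0|>` and `<|endofmask|>` follow the paper's description and may differ from the released tokenizer):

```python
def apply_causal_mask(tokens, start, end):
    """Move tokens[start:end] to the end of the sequence, leaving a
    sentinel at the original position, so a left-to-right LM can be
    trained to fill in the masked span."""
    span = tokens[start:end]
    return (
        tokens[:start]
        + ["<|mask:0|>"]     # placeholder marking where the span was
        + tokens[end:]
        + ["<|mask:0|>"]     # prompts the model to generate mask 0
        + span               # training target: the original span
        + ["<|endofmask|>"]  # span terminator
    )

doc = "def add(a, b): return a + b".split()
print(" ".join(apply_causal_mask(doc, 3, 5)))
# → def add(a, b): <|mask:0|> + b <|mask:0|> return a <|endofmask|>
```

At inference time, the same format enables infilling: the model is given the prefix, sentinel, and suffix, then generates the missing span after the second sentinel.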