---
library_name: transformers
license: mit
language:
- ar
base_model:
- openai-community/gpt2
---
|
|
|
# Model Card for Tunisian-Dialect GPT-2
|
|
|
<!-- Provide a quick summary of what the model is/does. -->
|
|
|
|
|
|
|
## Model Details
|
|
|
### Model Description
|
|
|
<!-- Provide a longer summary of what this model is. -->
|
|
|
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
|
|
|
- **Developed by:** Bacem ETTEIB
- **Funded by:** University of Luxembourg
- **Model type:** Decoder-only (causal) language model
- **Language(s) (NLP):** Arabic (Tunisian dialect)
- **License:** MIT
- **Finetuned from model:** openai-community/gpt2
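Since this is a causal language model in the 🤗 transformers format, it can be loaded with the standard Auto classes. The sketch below uses the declared base checkpoint (`openai-community/gpt2`) as a stand-in, because the card does not state this model's Hub repository id; swap in the actual path to use the fine-tuned weights.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "openai-community/gpt2" is the declared base model; replace it with this
# model's actual Hub repository id (not stated on the card) to load the
# Tunisian-dialect weights.
checkpoint = "openai-community/gpt2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Encode a short (Tunisian Arabic) prompt and sample a continuation.
inputs = tokenizer("أهلا", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=True, top_p=0.95)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```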
|
|
|
|
|
|
|
### Recommendations
|
|
|
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
|
|
|
Fine-tune this model on downstream tasks such as sentiment analysis or dialect identification.
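For a classification task like sentiment analysis, the pretrained checkpoint can be wrapped in a sequence-classification head. This is a minimal sketch, again using the base `openai-community/gpt2` path as a placeholder for this model's repository id; the two-label setup and example text are assumptions for illustration.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder checkpoint: substitute this model's actual Hub repository id.
checkpoint = "openai-community/gpt2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# num_labels=2 assumes binary sentiment (negative/positive); the new
# classification head is randomly initialized and must be fine-tuned.
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
model.config.pad_token_id = tokenizer.pad_token_id

batch = tokenizer(["example text"], padding=True, return_tensors="pt")
logits = model(**batch).logits  # one row of logits per input, one column per label
```

From here, a standard `Trainer` loop over a labeled Tunisian-dialect dataset completes the fine-tuning.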
|
|
|
|
|
### Training Procedure
|
|
|
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
|
Continual pretraining: the model resumes causal language modeling from the GPT-2 checkpoint on Tunisian-dialect text.
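Continual pretraining of a causal LM reduces to the same next-token objective as the original pretraining, just on new in-domain text. A minimal sketch of one gradient step, assuming the declared base checkpoint and a hypothetical Tunisian-dialect sentence (the card does not describe the corpus):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Continual pretraining starts from the declared base model.
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

# Hypothetical in-domain text; in practice this comes from the Tunisian corpus.
batch = tokenizer("أهلا بيك في تونس", return_tensors="pt")

# For causal LM training, labels are the input ids themselves; the model
# shifts them internally to compute the next-token cross-entropy loss.
loss = model(**batch, labels=batch["input_ids"]).loss
loss.backward()  # one continual-pretraining gradient step (optimizer omitted)
```

In practice this loop is usually driven by `Trainer` with a causal-LM data collator rather than written by hand.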