
Jasmine-350M

JASMINE: Arabic GPT Models for Few-Shot Learning

This is the repository accompanying our EMNLP 2023 paper JASMINE: Arabic GPT Models for Few-Shot Learning. JASMINE is a suite of powerful Arabic autoregressive Transformer language models, ranging in size from 300 million to 6.7 billion parameters, pretrained on a large and diverse dataset (235 GB of text).
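
Example usage

A minimal sketch of loading this checkpoint with the Hugging Face transformers library, assuming the model exposes the standard causal-LM interface. The repository id "UBC-NLP/Jasmine-350M", the prompt, and the sampling settings are illustrative assumptions, not details taken from the paper.

# Minimal sketch: load the model and generate a short continuation.
# "UBC-NLP/Jasmine-350M" is an assumed repository id; replace it with this card's id if it differs.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "UBC-NLP/Jasmine-350M"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "اللغة العربية"  # example Arabic prompt
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings are illustrative; adjust to your use case.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))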

BibTeX

If you use JASMINE models in your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows:

@inproceedings{billah-nagoudi-etal-2023-jasmine,
    title = "{JASMINE}: {A}rabic {GPT} Models for Few-Shot Learning",
    author = "Billah Nagoudi, El Moatez  and
      Abdul-Mageed, Muhammad  and
      Elmadany, AbdelRahim  and
      Inciarte, Alcides  and
      Islam Khondaker, Md Tawkat",
    editor = "Bouamor, Houda  and
      Pino, Juan  and
      Bali, Kalika",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.emnlp-main.1040",
    doi = "10.18653/v1/2023.emnlp-main.1040",
    pages = "16721--16744",
}

Acknowledgments

We gratefully acknowledge support from the Natural Sciences and Engineering Research Council of Canada, the Social Sciences and Humanities Research Council of Canada, the Canada Foundation for Innovation, Compute Canada, and UBC ARC-Sockeye. We also thank the Google TensorFlow Research Cloud (TFRC) program for providing us with free TPU access.
