Model Card for dappyx/QazDistilbertFast-tokenizer

A DistilBERT fast tokenizer trained on KazQAD, a Kazakh question-answering dataset.

Model Details

Model Description

  • Model type: DistilBERT
  • Language(s) (NLP): Kazakh
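
The tokenizer can be loaded directly from the Hugging Face Hub with the transformers library. A minimal usage sketch; the Kazakh example sentence is purely illustrative:

```python
from transformers import AutoTokenizer

# Load the fast tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("dappyx/QazDistilbertFast-tokenizer")

# Tokenize an illustrative Kazakh sentence
encoding = tokenizer("Алматы Қазақстандағы ең үлкен қала.")
print(encoding["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoding["input_ids"]))
```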

Training Details

Training Data

The tokenizer was trained on KazQAD, an open-source Kazakh open-domain question-answering dataset: https://github.com/IS2AI/KazQAD/
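
The card does not include the training script. Below is a minimal sketch of how such a tokenizer is typically retrained, assuming the multilingual DistilBERT checkpoint as the base, a vocabulary size of 30,522, and KazQAD text collected into a plain-text file; all three are assumptions, not details confirmed by this card:

```python
from transformers import AutoTokenizer

# Base fast tokenizer to retrain; the exact base checkpoint used
# for this model is an assumption.
base = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")

def corpus_iterator(path, batch_size=1000):
    """Yield batches of lines from a text dump of KazQAD passages
    (hypothetical file path)."""
    with open(path, encoding="utf-8") as f:
        batch = []
        for line in f:
            batch.append(line.strip())
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:
            yield batch

# Train a new vocabulary on the Kazakh corpus while keeping the base
# tokenizer's configuration (vocab_size is an assumption).
tokenizer = base.train_new_from_iterator(
    corpus_iterator("kazqad_passages.txt"), vocab_size=30522
)
tokenizer.save_pretrained("QazDistilbertFast-tokenizer")
```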

Environmental Impact

  • Hardware Type: TPUv2
  • Hours used: under one minute of compute time
  • Cloud Provider: Google Colab
