
Model Card for mansoorhamidzadeh/parsbert-persian-QA

ParsBERT for Persian Question Answering

Model Description

This is a fine-tuned version of the ParsBERT model, adapted for question answering in Persian. ParsBERT is a BERT-based model pre-trained on a large Persian text corpus. This checkpoint has been further fine-tuned on a Persian QA dataset to provide accurate and contextually relevant answers to questions posed in Persian.
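
For context on how such a checkpoint is typically produced: fine-tuning starts from the base ParsBERT encoder with a freshly initialized span-prediction (start/end) head, which is then trained on question-answer pairs. The sketch below assumes the publicly available ParsBERT checkpoint HooshvareLab/bert-fa-base-uncased as the base; the exact base checkpoint used for this model is not stated in this card.

from transformers import AutoModelForQuestionAnswering, AutoTokenizer

# Assumed base checkpoint (the public ParsBERT release); not confirmed by this card.
base_name = "HooshvareLab/bert-fa-base-uncased"

# Loading a plain BERT checkpoint with a QA architecture attaches a randomly
# initialized start/end span-prediction head on top of the pre-trained encoder;
# fine-tuning on a Persian QA dataset then trains that head together with the encoder.
tokenizer = AutoTokenizer.from_pretrained(base_name)
model = AutoModelForQuestionAnswering.from_pretrained(base_name)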

Model Architecture

  • Base Model: ParsBERT
  • Task: Question Answering
  • Language: Persian
  • Number of Parameters: ~118M (per the uploaded Safetensors weights; a quick local check is sketched below)
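
If you want to verify the size of the downloaded checkpoint yourself, here is a minimal sketch; the exact figure depends on the tokenizer vocabulary and the QA head weights.

from transformers import AutoModelForQuestionAnswering

# Load the fine-tuned checkpoint and count its parameters.
model = AutoModelForQuestionAnswering.from_pretrained("mansoorhamidzadeh/parsbert-persian-QA")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")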

Intended Use

This model is intended for use in applications requiring natural language understanding and question answering in Persian, such as:

  • Persian language chatbots
  • Persian information retrieval systems
  • Educational tools for Persian language learners

Dataset

The model was fine-tuned on a Persian QA dataset. The dataset consists of question-answer pairs extracted from various Persian text sources, ensuring a diverse range of topics and contexts.
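
The exact schema of the dataset is not published in this card, so the record below is only an illustrative, SQuAD-style layout commonly used for extractive QA; the field names and values are hypothetical.

# Hypothetical SQuAD-style record; not the actual schema of the fine-tuning data.
example = {
    "context": "متن زمینه که پاسخ در آن آمده است.",  # passage that contains the answer
    "question": "پاسخ کجا آمده است؟",                # question about the passage
    "answers": {
        "text": ["متن زمینه"],        # answer span copied verbatim from the context
        "answer_start": [0],          # character offset of the span within the context
    },
}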

Usage

To use this model for question answering in Persian, you can load it using the Hugging Face Transformers library. Here’s a quick example:

from transformers import AutoTokenizer, AutoModelForQuestionAnswering, pipeline

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("mansoorhamidzadeh/parsbert-persian-QA")
model = AutoModelForQuestionAnswering.from_pretrained("mansoorhamidzadeh/parsbert-persian-QA")

# Create a QA pipeline
qa_pipeline = pipeline("question-answering", model=model, tokenizer=tokenizer)

# Example usage
context = "متن زمینه که شامل اطلاعات مرتبط با سوال شما است."  # placeholder: "The background text that contains information relevant to your question."
question = "سوال شما چیست؟"  # placeholder: "What is your question?"
result = qa_pipeline(question=question, context=context)

print(f"Answer: {result['answer']}")