My DialoGPT Model

This model is a fine-tuned version of microsoft/DialoGPT-small, trained on a custom dataset about Dominica.

Model Details

  • Base Model: microsoft/DialoGPT-small
  • Training Data: Custom dataset about Dominica
  • Model Size: 124M parameters (F32, Safetensors)
  • Evaluation: Achieved an eval_loss of 12.85
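
The card does not document the training setup. The sketch below shows one plausible way such a fine-tune could be run with the transformers Trainer API; the file names (dominica.txt, dominica_val.txt), hyperparameters, and preprocessing are assumptions, not the actual configuration used for this model.

# Hypothetical fine-tuning sketch; the real dataset, hyperparameters,
# and preprocessing for this model are not documented in the card.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

base_model = "microsoft/DialoGPT-small"
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # DialoGPT has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# "dominica.txt" / "dominica_val.txt" are placeholders for the custom Dominica data
dataset = load_dataset(
    "text",
    data_files={"train": "dominica.txt", "validation": "dominica_val.txt"},
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="IslandBoyRepo",
        num_train_epochs=3,
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
print(trainer.evaluate())  # reports eval_loss, as cited in Model Details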

Usage

You can load the model with the transformers library and generate a response as follows:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer
model_name = "unknownCode/IslandBoyRepo"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# DialoGPT expects each turn to end with the EOS token
input_text = "What is the capital of Dominica?" + tokenizer.eos_token
inputs = tokenizer(input_text, return_tensors="pt")

# Generate a response (DialoGPT has no pad token, so reuse EOS for padding)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens, skipping the prompt
response = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)

print(response)
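
DialoGPT is a multi-turn dialogue model, so for a conversation the previous turns are concatenated, each terminated by the EOS token. The sketch below continues from the snippet above (it reuses the tokenizer and model loaded there); the example user turns are illustrative only.

# Multi-turn chat: append each turn, terminated by EOS, to the running history
import torch

chat_history_ids = None
for user_input in ["What is the capital of Dominica?", "What language is spoken there?"]:
    new_ids = tokenizer(user_input + tokenizer.eos_token, return_tensors="pt").input_ids
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(
        bot_input_ids,
        max_new_tokens=50,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the tokens generated for this turn
    reply = tokenizer.decode(chat_history_ids[0, bot_input_ids.shape[-1]:], skip_special_tokens=True)
    print(f"Bot: {reply}")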