
# Shakespeare Language Model

A transformer decoder model trained from scratch on Shakespeare's works.

## Model Details

- **Architecture:** Custom Transformer Decoder
- **Parameters:** 188,344,038
- **Context Length:** 256 tokens
- **Hidden Size:** 1024
- **Layers:** 12
- **Attention Heads:** 16
- **Best Loss:** 0.0911
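The hyperparameters above can be gathered into a small configuration object. This is an illustrative sketch only; the field names are not taken from the repository's actual code:

```python
from dataclasses import dataclass

@dataclass
class DecoderConfig:
    # Values from the model card; field names are illustrative.
    context_length: int = 256  # maximum sequence length in tokens
    hidden_size: int = 1024    # model / embedding dimension
    num_layers: int = 12       # transformer decoder blocks
    num_heads: int = 16        # attention heads per layer

    @property
    def head_dim(self) -> int:
        # Each head attends over hidden_size / num_heads dimensions.
        return self.hidden_size // self.num_heads

cfg = DecoderConfig()
print(cfg.head_dim)  # 1024 / 16 = 64
```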

## Usage

```python
from transformers import AutoTokenizer
from model import TransformerDecoder

# Load model and tokenizer
model = TransformerDecoder.from_pretrained("ninagala/shakespeare-decoder")
tokenizer = AutoTokenizer.from_pretrained("ninagala/shakespeare-decoder")

# Generate text
text = model.generate("To be or not to be", max_length=100)
print(text)
```
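Because the context length is 256 tokens, generating long continuations requires feeding the model only the most recent tokens at each step. The following is a minimal, self-contained sketch of sliding-window greedy decoding; `next_token_fn` stands in for a forward pass of the model (the stub below is purely for illustration and is not the repository's implementation):

```python
def generate_ids(next_token_fn, prompt_ids, max_new_tokens=100, context_length=256):
    """Greedy decoding with a sliding context window.

    next_token_fn: callable that takes a list of token ids (at most
    context_length long) and returns the id of the next token.
    """
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        window = ids[-context_length:]  # keep only the most recent tokens
        ids.append(next_token_fn(window))
    return ids

# Demo with a stub "model" that returns the last token id plus one.
out = generate_ids(lambda window: window[-1] + 1, [10, 11], max_new_tokens=3)
print(out)  # [10, 11, 12, 13, 14]
```

In practice the prompt would first be encoded with `tokenizer`, and the resulting ids decoded back to text after generation.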
