
This repo contains a fine-tuned version of ai-forever/rugpt3large_based_on_gpt2, which can generate poetry from keywords in the style of Pushkin, Mayakovsky, Esenin, Blok and Tyutchev.

You can use this model as follows:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda" if torch.cuda.is_available() else "cpu"

def generate_poetry(input: str, model, num_beams=3):
    # Fall back to the BOS token if the prompt is empty
    input = input if len(input) > 0 else tokenizer.bos_token
    input_ids = tokenizer.encode(input, return_tensors="pt").to(device)

    # Use the EOS token for padding and build the attention mask
    tokenizer.pad_token_id = tokenizer.eos_token_id
    attention_mask = (input_ids != tokenizer.pad_token_id).float()

    with torch.no_grad():
        out = model.generate(input_ids,
                             do_sample=True,
                             num_beams=num_beams,
                             temperature=2.0,
                             top_p=0.9,
                             max_length=200,
                             eos_token_id=tokenizer.eos_token_id,
                             bos_token_id=tokenizer.bos_token_id,
                             attention_mask=attention_mask)
    return tokenizer.batch_decode(out, skip_special_tokens=True)[0]

path = 'AnyaSchen/rugpt3-large-keywords2poetry'
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForCausalLM.from_pretrained(path).to(device)

inp = 'Автор: Маяковский\nКлючевые слова:<write your keywords>'
print(generate_poetry(inp, model))
```
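For instance, a prompt in the same format might look like the sketch below. The author name is one of those listed above; the keywords are purely illustrative placeholders, not taken from the original card.

```python
# Hypothetical prompt: Pushkin-style poem from sample keywords (substitute your own)
inp = 'Автор: Пушкин\nКлючевые слова: любовь, море, ночь'
print(generate_poetry(inp, model))
```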