loubnabnl (HF staff) committed
Commit cad6f98
1 Parent(s): 132eb6b

add note on fim tokens

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -243,6 +243,7 @@ inputs = tokenizer.encode(input_text, return_tensors="pt").to(device)
 outputs = model.generate(inputs)
 print(tokenizer.decode(outputs[0]))
 ```
+Make sure to use `<fim-prefix>, <fim-suffix>, <fim-middle>` and not `<fim_prefix>, <fim_suffix>, <fim_middle>` as in StarCoder models.
 
 ### Load other checkpoints
 We upload the checkpoint of each experiment to a separate branch as well as the intermediate checkpoints as commits on the branches. You can load them with the `revision` flag:
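
For context, here is a minimal sketch of how the dash-style FIM tokens from the added note could be used at inference time. The checkpoint name, the `trust_remote_code` flag, and the generation arguments are assumptions for illustration, not taken from this diff:

```python
# Sketch of fill-in-the-middle prompting with the dash-style tokens
# (<fim-prefix>, <fim-suffix>, <fim-middle>) noted in the diff above.
# The checkpoint name and flags below are assumptions, not from this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/santacoder"  # assumed; use the checkpoint named in this README
device = "cuda"  # or "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True).to(device)

# Dash-style FIM tokens, not the underscore-style ones used by the StarCoder models.
input_text = "<fim-prefix>def print_hello():\n    <fim-suffix>\n    print(message)<fim-middle>"
inputs = tokenizer.encode(input_text, return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```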
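
Likewise, a hedged sketch of the `revision` flag mentioned under "Load other checkpoints"; the branch name here is a hypothetical placeholder, not an actual branch from the repository:

```python
# Loading a specific experiment branch or intermediate checkpoint via `revision`.
# "some-experiment-branch" is a hypothetical name; substitute a real branch name
# or commit hash from the repository.
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    revision="some-experiment-branch",
    trust_remote_code=True,
).to(device)
```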