BME-TMIT / foszt2oszt
BME Department of Telecommunications and Media Informatics
Tags: Text2Text Generation · Transformers · PyTorch · Hungarian · encoder-decoder · Inference Endpoints
main · foszt2oszt / special_tokens_map.json
makrai: add tokenizer (commit 2893476, almost 3 years ago)
112 Bytes
{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}
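
For reference, a minimal sketch of how these values surface once the tokenizer is loaded with the Transformers library (assuming the repository hosts a standard Transformers tokenizer, as the "add tokenizer" commit suggests):

from transformers import AutoTokenizer

# Download the tokenizer from the Hub; special_tokens_map.json supplies the
# values exposed as tokenizer.unk_token, tokenizer.sep_token, and so on.
tokenizer = AutoTokenizer.from_pretrained("BME-TMIT/foszt2oszt")

# Print each special token alongside its vocabulary id.
for name, token in tokenizer.special_tokens_map.items():
    print(f"{name}: {token} -> id {tokenizer.convert_tokens_to_ids(token)}")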