# GermanT5/t5-efficient-gc4-all-german-large-nl36
**Task:** Text2Text Generation · **Libraries:** Transformers, PyTorch, TensorBoard, Safetensors · **Language:** German · **Tags:** t5, german, deutsch, text-generation-inference, Inference Endpoints · **License:** MIT
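The tags above mark this as a T5-style text2text checkpoint served through Transformers. Assuming the standard seq2seq auto-classes apply to this repo (the file listing below contains the usual `config.json`, weights, and tokenizer assets), a minimal loading sketch:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "GermanT5/t5-efficient-gc4-all-german-large-nl36"

# Load tokenizer and model. The repo ships both a pickle checkpoint
# (pytorch_model.bin) and a safetensors variant; prefer the latter,
# since safetensors files cannot execute arbitrary code on load.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id, use_safetensors=True)

# Hypothetical German input; the model card, not this file listing,
# defines which downstream tasks the checkpoint is suited for.
inputs = tokenizer("Ein Beispieltext auf Deutsch.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```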
## Files and versions
3 contributors · History: 6 commits. Latest commit `d816ec9` by SFconvertbot ("Adding `safetensors` variant of this model"), over 1 year ago.
| File | Size | Last commit message | Last modified |
|---|---|---|---|
| .gitattributes | 1.76 kB | model: add original and converted checkpoints | almost 2 years ago |
| README.md | 1.95 kB | readme: add initial version | almost 2 years ago |
| config.json | 731 Bytes | model: add original and converted checkpoints | almost 2 years ago |
| events.out.tfevents.1670714423.t5 | 153 MB (LFS) | metrics: add TensorBoard logs | almost 2 years ago |
| model.ckpt-524288.data-00000-of-00004 | 8 Bytes | model: add original and converted checkpoints | almost 2 years ago |
| model.ckpt-524288.data-00001-of-00004 | 729 MB (LFS) | model: add original and converted checkpoints | almost 2 years ago |
| model.ckpt-524288.data-00002-of-00004 | 729 MB (LFS) | model: add original and converted checkpoints | almost 2 years ago |
| model.ckpt-524288.data-00003-of-00004 | 730 MB (LFS) | model: add original and converted checkpoints | almost 2 years ago |
| model.ckpt-524288.index | 30.9 kB | model: add original and converted checkpoints | almost 2 years ago |
| model.ckpt-524288.meta | 39.3 MB (LFS) | model: add original and converted checkpoints | almost 2 years ago |
| model.safetensors | 4.36 GB (LFS) | Adding `safetensors` variant of this model | over 1 year ago |
| pytorch_model.bin | 4.36 GB (LFS) | model: add original and converted checkpoints | almost 2 years ago |
| special_tokens_map.json | 1.79 kB | tokenizer: add initial version | almost 2 years ago |
| spiece.model | 823 kB (LFS) | tokenizer: add initial version | almost 2 years ago |
| tokenizer.json | 2.46 MB | tokenizer: add initial version | almost 2 years ago |
| tokenizer_config.json | 1.92 kB | tokenizer: add initial version | almost 2 years ago |

Note: `pytorch_model.bin` is a pickle file. Detected pickle imports (3): `torch.FloatStorage`, `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`.
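Several of the files above (the TensorFlow checkpoint shards, `model.safetensors`, the TensorBoard event log) are stored via Git LFS. A sketch of fetching a single file without cloning every LFS object, using the `huggingface_hub` client:

```python
from huggingface_hub import hf_hub_download

# Download only the safetensors weights (4.36 GB) rather than the
# whole repo; the file is cached locally and its resolved path returned.
path = hf_hub_download(
    repo_id="GermanT5/t5-efficient-gc4-all-german-large-nl36",
    filename="model.safetensors",
)
print(path)
```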