---
license: apache-2.0
language:
- en
metrics:
- rouge
tags:
- nanoT5
datasets:
- allenai/c4
---

[Google's T5-v1.1-base](https://huggingface.co./google/t5-v1_1-base) pre-trained for 24 hours (80k steps with batch size 256) on a single GPU using the [nanoT5](https://github.com/PiotrNawrot/nanoT5) library for efficient pre-training.
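A minimal sketch of loading the checkpoint with the `transformers` library; the repo id below is a placeholder for wherever these weights are hosted, and the span-corruption prompt is only illustrative:

```python
# Minimal usage sketch, assuming the checkpoint loads via Hugging Face
# `transformers`; the repo id is a placeholder, not a confirmed hub path.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "your-username/nanoT5-base"  # placeholder: replace with this repo's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 v1.1 is pre-trained only on the span-corruption objective (no supervised
# task mixing), so the raw checkpoint is meant for fine-tuning rather than
# direct use; this just demonstrates that the weights load and generate.
inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```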

For more details about the model, refer to the original [paper](https://arxiv.org/pdf/2002.05202.pdf) and the original [model weights](https://huggingface.co./google/t5-v1_1-base).

It can be further fine-tuned on the Super-Natural-Instructions dataset to achieve performance comparable to the same model pre-trained on 150x more data through "a combination of model and data parallelism [...] on slices of Cloud TPU Pods", each with 1024 TPUs.
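A hedged sketch of such a fine-tuning run using the `transformers` `Seq2SeqTrainer`; the dataset file, column names, and hyperparameters below are illustrative assumptions, not the exact nanoT5 recipe (see the nanoT5 repository for that):

```python
# Illustrative fine-tuning sketch, NOT the exact nanoT5 recipe: the data file,
# column names, and hyperparameters are assumptions for demonstration only.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    T5ForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_id = "your-username/nanoT5-base"  # placeholder for this repo's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Placeholder data: any seq2seq JSON with "input" / "target" text fields works.
dataset = load_dataset("json", data_files="super_natural_instructions.json")["train"]

def preprocess(batch):
    # Tokenize inputs and targets; lengths are arbitrary illustrative choices.
    model_inputs = tokenizer(batch["input"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=dataset.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="nanoT5-base-sni",
        per_device_train_batch_size=8,
        learning_rate=1e-4,
        num_train_epochs=2,
        predict_with_generate=True,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```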