
A way to inference and fine-tune BLOOM-176B from Google Colab or locally

#152 · by borzunov (BigScience Workshop org) · opened, edited Dec 9, 2022

Hi everyone!

We've built a proof of concept for decentralized BLOOM-176B inference and fine-tuning that runs in Google Colab. If you'd like to use the model without a GPU cluster, please take a look and tell us what you think: https://colab.research.google.com/drive/1Ervk6HPNS6AYVr3xVdQnY5a-TjjmLCdQ?usp=sharing

borzunov changed discussion title from A way to run BLOOM-176B inference and fine-tuning from Google Colab to A way to inference and fine-tune BLOOM-176B from Google Colab
quite cool!

Whoa! Nice!

borzunov changed discussion title from A way to inference and fine-tune BLOOM-176B from Google Colab to A way to inference and fine-tune BLOOM-176B from Google Colab or locally
