---
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language: 
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
tags:
- llm-rs
- ggml
pipeline_tag: text-generation
---

# GGML Converted Models of [BigScience](https://huggingface.co./bigscience)'s BLOOM models

## Description

BLOOM is an autoregressive Large Language Model (LLM), trained to continue text from a prompt on vast amounts of text data using industrial-scale computational resources. As such, it is able to output coherent text in 46 languages and 13 programming languages that is hardly distinguishable from text written by humans. BLOOM can also be instructed to perform text tasks it hasn't been explicitly trained for, by casting them as text generation tasks.


## Converted Models
| Name                                                                                                         | Based on                                                              | Type   | Container   | GGML Version   |
|:-------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------|:-------|:------------|:---------------|
| [bloom-1b7-f16.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-1b7-f16.bin)               | [bigscience/bloom-1b7](https://huggingface.co./bigscience/bloom-1b7)   | F16    | GGML        | V3             |
| [bloom-1b7-q4_0.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-1b7-q4_0.bin)             | [bigscience/bloom-1b7](https://huggingface.co./bigscience/bloom-1b7)   | Q4_0   | GGML        | V3             |
| [bloom-1b7-q4_0-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-1b7-q4_0-ggjt.bin)   | [bigscience/bloom-1b7](https://huggingface.co./bigscience/bloom-1b7)   | Q4_0   | GGJT        | V3             |
| [bloom-1b7-q5_1-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-1b7-q5_1-ggjt.bin)   | [bigscience/bloom-1b7](https://huggingface.co./bigscience/bloom-1b7)   | Q5_1   | GGJT        | V3             |
| [bloom-3b-f16.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-3b-f16.bin)                 | [bigscience/bloom-3b](https://huggingface.co./bigscience/bloom-3b)     | F16    | GGML        | V3             |
| [bloom-3b-q4_0.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-3b-q4_0.bin)               | [bigscience/bloom-3b](https://huggingface.co./bigscience/bloom-3b)     | Q4_0   | GGML        | V3             |
| [bloom-3b-q4_0-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-3b-q4_0-ggjt.bin)     | [bigscience/bloom-3b](https://huggingface.co./bigscience/bloom-3b)     | Q4_0   | GGJT        | V3             |
| [bloom-3b-q5_1-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-3b-q5_1-ggjt.bin)     | [bigscience/bloom-3b](https://huggingface.co./bigscience/bloom-3b)     | Q5_1   | GGJT        | V3             |
| [bloom-560m-f16.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-560m-f16.bin)             | [bigscience/bloom-560m](https://huggingface.co./bigscience/bloom-560m) | F16    | GGML        | V3             |
| [bloom-560m-q4_0.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-560m-q4_0.bin)           | [bigscience/bloom-560m](https://huggingface.co./bigscience/bloom-560m) | Q4_0   | GGML        | V3             |
| [bloom-560m-q4_0-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-560m-q4_0-ggjt.bin) | [bigscience/bloom-560m](https://huggingface.co./bigscience/bloom-560m) | Q4_0   | GGJT        | V3             |
| [bloom-560m-q5_1-ggjt.bin](https://huggingface.co./rustformers/bloom-ggml/blob/main/bloom-560m-q5_1-ggjt.bin) | [bigscience/bloom-560m](https://huggingface.co./bigscience/bloom-560m) | Q5_1   | GGJT        | V3             |

## Usage

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):

#### Installation
Via pip: `pip install llm-rs`

#### Run inference
```python
from llm_rs import AutoModel

# Load the model; pass any file name from the table above as `model_file`
model = AutoModel.from_pretrained("rustformers/bloom-ggml", model_file="bloom-3b-q4_0-ggjt.bin")

# Generate text from a prompt
print(model.generate("The meaning of life is"))
```
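As the description notes, BLOOM can perform tasks it was not explicitly trained for by casting them as text generation. A minimal sketch of such prompt casting (the helper name is ours, not part of llm-rs; feed the resulting string to `model.generate` from the example above):

```python
def cast_task_as_generation(task: str, text: str) -> str:
    """Frame an arbitrary text task as a plain continuation prompt.

    BLOOM has no task-specific heads; it simply continues the prompt,
    so the task instruction and an answer cue are encoded as text.
    """
    return f"{task}: {text}\nAnswer:"

# Example: cast a translation task as text generation
prompt = cast_task_as_generation("Translate to French",
                                 "'The weather is nice today.'")
print(prompt)
```

The model then completes the line after `Answer:`, which is why a clear answer cue at the end of the prompt tends to matter more than elaborate instructions for a base (non-instruction-tuned) model.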

### Rust via [Rustformers/llm](https://github.com/rustformers/llm)

#### Installation
```shell
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference
```shell
cargo run --release -- bloom infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```