Generated README.md
README.md (changed)

…similar and same-sized models, such as those in the OPT and GPT-Neo suites.

## Converted Models:

| Name | Based on | Type | Container | GGML Version |
|:-----|:---------|:-----|:----------|:-------------|
| [pythia-1.4b-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1.4b-f16.bin) | [EleutherAI/pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) | F16 | GGML | V3 |
| [pythia-1.4b-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1.4b-q4_0.bin) | [EleutherAI/pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) | Q4_0 | GGML | V3 |
| [pythia-1.4b-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1.4b-q4_0-ggjt.bin) | [EleutherAI/pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) | Q4_0 | GGJT | V3 |
| [pythia-1.4b-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1.4b-q5_1-ggjt.bin) | [EleutherAI/pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.4b) | Q5_1 | GGJT | V3 |
| [pythia-160m-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-160m-f16.bin) | [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) | F16 | GGML | V3 |
| [pythia-160m-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-160m-q4_0.bin) | [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) | Q4_0 | GGML | V3 |
| [pythia-160m-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-160m-q4_0-ggjt.bin) | [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) | Q4_0 | GGJT | V3 |
| [pythia-160m-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-160m-q5_1-ggjt.bin) | [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) | Q5_1 | GGJT | V3 |
| [pythia-1b-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1b-f16.bin) | [EleutherAI/pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) | F16 | GGML | V3 |
| [pythia-1b-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1b-q4_0.bin) | [EleutherAI/pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) | Q4_0 | GGML | V3 |
| [pythia-1b-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1b-q4_0-ggjt.bin) | [EleutherAI/pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) | Q4_0 | GGJT | V3 |
| [pythia-1b-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-1b-q5_1-ggjt.bin) | [EleutherAI/pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) | Q5_1 | GGJT | V3 |
| [pythia-2.8b-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-2.8b-f16.bin) | [EleutherAI/pythia-2.8b](https://huggingface.co/EleutherAI/pythia-2.8b) | F16 | GGML | V3 |
| [pythia-2.8b-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-2.8b-q4_0.bin) | [EleutherAI/pythia-2.8b](https://huggingface.co/EleutherAI/pythia-2.8b) | Q4_0 | GGML | V3 |
| [pythia-2.8b-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-2.8b-q4_0-ggjt.bin) | [EleutherAI/pythia-2.8b](https://huggingface.co/EleutherAI/pythia-2.8b) | Q4_0 | GGJT | V3 |
| [pythia-2.8b-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-2.8b-q5_1-ggjt.bin) | [EleutherAI/pythia-2.8b](https://huggingface.co/EleutherAI/pythia-2.8b) | Q5_1 | GGJT | V3 |
| [pythia-410m-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-410m-f16.bin) | [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) | F16 | GGML | V3 |
| [pythia-410m-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-410m-q4_0.bin) | [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) | Q4_0 | GGML | V3 |
| [pythia-410m-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-410m-q4_0-ggjt.bin) | [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) | Q4_0 | GGJT | V3 |
| [pythia-410m-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-410m-q5_1-ggjt.bin) | [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) | Q5_1 | GGJT | V3 |
| [pythia-70m-f16.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-70m-f16.bin) | [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) | F16 | GGML | V3 |
| [pythia-70m-q4_0.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-70m-q4_0.bin) | [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) | Q4_0 | GGML | V3 |
| [pythia-70m-q4_0-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-70m-q4_0-ggjt.bin) | [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) | Q4_0 | GGJT | V3 |
| [pythia-70m-q5_1-ggjt.bin](https://huggingface.co/rustformers/pythia-ggml/blob/main/pythia-70m-q5_1-ggjt.bin) | [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) | Q5_1 | GGJT | V3 |
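
If you only need one of these files locally, a minimal sketch along the following lines should work. It assumes the `huggingface_hub` Python package, which is not part of this repo's tooling, and the filename is simply an example picked from the table above.

```
from huggingface_hub import hf_hub_download

# Fetch a single converted model file into the local Hugging Face cache.
# The filename is just an example taken from the table above.
local_path = hf_hub_download(
    repo_id="rustformers/pythia-ggml",
    filename="pythia-70m-q4_0-ggjt.bin",
)
print(local_path)  # path to the downloaded .bin file
```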
## Usage

Via pip: `pip install llm-rs`

from llm_rs import AutoModel

# Load the model; any file from the list above can be passed as `model_file`
model = AutoModel.from_pretrained("rustformers/pythia-ggml", model_file="pythia-70m-q4_0-ggjt.bin")

# Generate
print(model.generate("The meaning of life is"))
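
The same call pattern should work for any other file in the table above; for example, an untested sketch that switches to the larger Q5_1 GGJT conversion (filename taken from the table):

```
from llm_rs import AutoModel

# Same repo, different converted file from the table above
model = AutoModel.from_pretrained(
    "rustformers/pythia-ggml",
    model_file="pythia-1.4b-q5_1-ggjt.bin",
)
print(model.generate("Rust is a systems programming language that"))
```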
#### Installation
```
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference
```
cargo run --release -- gptneox infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```