This is Gemma_2b_en fine-tuned on 3.5k QA pairs generated from Cosmology and Nongalactic Astrophysics articles (arXiv astro-ph.CO) published in 2018-2022, and tested on 1k QA pairs generated from 2023 articles, scoring over 75% accuracy.

To generate an answer for a given question using this model, please use:

```python
import keras
import keras_nlp

# Load the fine-tuned model from the Hugging Face Hub.
gemma_lm = keras_nlp.models.CausalLM.from_preset("hf://sultan-hassan/CosmoGemma_2b_en")

# Prompt template used during fine-tuning.
template = "Instruction:\n{instruction}\n\nResponse:\n{response}"

question = "write your question here"

prompt = template.format(
    instruction=question,
    response="",
)

# generate() returns the prompt followed by the model's answer,
# so keep only the text after "Response:".
out = gemma_lm.generate(prompt, max_length=1024)
answer = out[out.index("Response:") + len("Response:"):].strip()

print("Question:", question)
print("Answer:", answer)
```

This is a [`Gemma` model](https://keras.io/api/keras_nlp/models/gemma) uploaded using the KerasNLP library and can be used with the JAX, TensorFlow, and PyTorch backends. It is intended for the `CausalLM` task.
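
Because the model was uploaded with KerasNLP (Keras 3), the backend can be selected before Keras is imported. A minimal sketch, assuming the JAX backend is desired (replace `"jax"` with `"tensorflow"` or `"torch"` for the other backends):

```python
import os

# Choose the Keras 3 backend before importing keras or keras_nlp.
# "jax" is one option; "tensorflow" and "torch" are the others.
os.environ["KERAS_BACKEND"] = "jax"

import keras_nlp

# The model loads the same way regardless of the chosen backend.
gemma_lm = keras_nlp.models.CausalLM.from_preset("hf://sultan-hassan/CosmoGemma_2b_en")
```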