MarinaPlius committed
Commit
e46f1d4
1 Parent(s): 376fbaa

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -16,7 +16,7 @@ tags:
 # Salamandra 7B aligned EADOP Model Card
 Salamandra 7B aligned EADOP is a full-finetuning version of
 [BSC Language Technologies Unit](https://huggingface.co/BSC-LT)'s
-[Salamndra Instruct 7B](https://huggingface.co/BSC-LT/salamandra-7b-instruct)
+[Salamandra Instruct 7B](https://huggingface.co/BSC-LT/salamandra-7b-instruct)
 model by the at the Barcelona Supercomputing Center focused on improving
 the handling of out-of-domain Questions in a RAG instruction-following setting.
 
@@ -36,7 +36,7 @@ capability to politely and informatively refuse to answer questions that are out
 ---
 
 ## Model Details
-Please refer to the [Salamndra Instruct 7B model details](https://huggingface.co/BSC-LT/salamandra-7b-instruct#model-details)
+Please refer to the [Salamandra Instruct 7B model details](https://huggingface.co/BSC-LT/salamandra-7b-instruct#model-details)
 for the specific details about the model architecture and pretraining.
 
 ## Intended Use
@@ -55,9 +55,9 @@ from transformers import AutoTokenizer, AutoModelForCausalLM
 import transformers
 import torch
 
-model_id = "BSC-LT/salamandra-7b-instruct"
+model_id = "projecte-aina/salamandra-7b-aligned-EADOP"
 
-text = "At what temperature does water boil?"
+text = "Quina és la finalitat del Servei Meterològic de Catalunya ?"
 
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 model = AutoModelForCausalLM.from_pretrained(
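For context, the snippet touched by this commit is truncated in the diff. Below is a minimal sketch of how such a snippet is typically completed with the standard `transformers` workflow; only the `model_id`, `text`, tokenizer, and model-loading lines come from the commit, while the dtype/device settings, chat-template wrapping, and generation call are illustrative assumptions, not code from the repository.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Values updated by this commit.
model_id = "projecte-aina/salamandra-7b-aligned-EADOP"
text = "Quina és la finalitat del Servei Meterològic de Catalunya ?"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # assumption: place the model automatically
    torch_dtype=torch.bfloat16, # assumption: half-precision load
)

# Assumption: wrap the question in the chat template expected by
# instruction-tuned models before generating.
messages = [{"role": "user", "content": text}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```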