Update README.md
The language model Phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source that consists of various synthetic NLP texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.

We've trained Microsoft Research's phi-1.5, a 1.3B-parameter model, on multi-turn conversation datasets and extended its context length to 32k.

## How to Use
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_default_device("cuda")

# trust_remote_code is required because the model ships custom modeling code.
model = AutoModelForCausalLM.from_pretrained("voidful/phi-1_5_chat_32k", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("voidful/phi-1_5_chat_32k", trust_remote_code=True)

from fastchat.conversation import get_conv_template

conv = get_conv_template("qwen-7b-chat")
conv.append_message(conv.roles[0], "how to make a keyboard?")
conv.append_message(conv.roles[1], """To make a keyboard, you will need the following materials:

...

Here is a basic outline of the steps involved in making a keyboard:

...

Note that this is just a basic outline, and there are many additional steps and considerations that will depend on the specific design and requirements of your keyboard.</s>""")
conv.append_message(conv.roles[0], "where to buy the circuit?")
prompt = conv.get_prompt()

inputs = tokenizer(prompt, return_tensors="pt", return_attention_mask=True)
outputs = model.generate(**inputs, max_length=1024)

# Decode only the newly generated tokens, skipping the echoed prompt.
text = tokenizer.batch_decode(outputs[:, inputs.input_ids.shape[-1]:])[0]
print(text)
```
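For intuition, the `qwen-7b-chat` conversation template produces a ChatML-style prompt, wrapping each turn in `<|im_start|>`/`<|im_end|>` markers. The sketch below is an illustration of that layout, not fastchat's exact template — the real `get_conv_template("qwen-7b-chat")` may differ in its default system text and separators:

```python
# Illustrative ChatML-style prompt builder (an assumption about the layout,
# inferred from the "<|im_end|>" marker visible in the sample output below;
# fastchat's actual template may differ in details).
def chatml_prompt(turns, system="You are a helpful assistant."):
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, message in turns:
        parts.append(f"<|im_start|>{role}\n{message}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # the model continues from here
    return "\n".join(parts)

print(chatml_prompt([("user", "how to make a keyboard?")]))
```

Ending the prompt with an open `<|im_start|>assistant` turn is what cues the model to generate the assistant's reply rather than continue the user's message.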

### Result
```
You can buy the circuit board from a variety of sources, including:

1. Online retailers: There are many online retailers that sell circuit boards, including Amazon, Best Buy, and Walmart.
2. Electronics stores: Many electronics stores, such as Best Buy, Walmart, and Target, carry circuit boards.
3. Specialty stores: There are also specialty stores that sell circuit boards, such as Circuit City, Best Buy, and Walmart.

When buying a circuit board, it is important to consider factors such as the size and type of circuit board, the quality and reliability of the circuit board, and the cost and availability of the circuit board.<|im_end|>
```
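As the sample shows, the template's end-of-turn marker can survive decoding and appear at the end of the text. A minimal post-processing sketch to trim it (the `<|im_end|>` string is taken from the sample output; adjust if your template uses a different stop token):

```python
# Trim everything from the first end-of-turn marker onward.
def strip_end_marker(text: str, marker: str = "<|im_end|>") -> str:
    return text.split(marker, 1)[0].rstrip()

print(strip_end_marker("...the cost and availability of the circuit board.<|im_end|>"))
# -> ...the cost and availability of the circuit board.
```

Alternatively, if the marker is registered as a special token in the tokenizer, passing `skip_special_tokens=True` to `batch_decode` drops it during decoding.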