---
license: apache-2.0
---
# FINGU-AI/Q-Small-3B

## Overview
`FINGU-AI/Q-Small-3B` is a causal language model for a range of natural language processing (NLP) tasks, including machine translation, text generation, and chat-based applications. Its flexible chat-style input makes it suitable both for translating between languages and for custom NLP tasks.

## Example Usage

### Installation
Make sure to install the required packages:

```bash
pip install torch transformers
```
### Loading the Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Model and tokenizer
model_id = 'FINGU-AI/Q-Small-3B'
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="sdpa",
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.to('cuda')

# Chat-style input messages
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is a large language model?"},
]

# Apply the chat template and tokenize
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to('cuda')

# Generate a response
outputs = model.generate(
    input_ids,
    max_new_tokens=500,
    do_sample=True,
)

# Decode and print only the newly generated tokens
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
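
### Translation Prompts
Since the overview highlights machine translation, the same chat format can be repurposed for it by changing the messages. Below is a minimal sketch of one way to structure such a prompt; the instruction wording and the `translation_messages` helper are illustrative assumptions, not something prescribed by the model card:

```python
# Build a chat-format message list that asks the model to translate.
# The system-prompt wording here is an assumption; adjust it to taste.
def translation_messages(text, source_lang, target_lang):
    return [
        {
            "role": "system",
            "content": f"You are a helpful assistant that translates "
                       f"{source_lang} to {target_lang}.",
        },
        {"role": "user", "content": text},
    ]

messages = translation_messages(
    "Large language models are trained on vast text corpora.",
    "English",
    "Korean",
)
```

Pass the resulting `messages` to `tokenizer.apply_chat_template(...)` and `model.generate(...)` exactly as in the example above.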