erfanzar committed on
Commit 7614311
1 Parent(s): d340a2d

Create README.md

Files changed (1)
  1. README.md +48 -0
README.md ADDED
---
license: mit
datasets:
- erfanzar/UltraChat-Mixin
language:
- en
pipeline_tag: text-generation
tags:
- code
---

# LinguaMatic

LinguaMatic is an advanced AI model designed to handle a wide range of Natural Language Processing (NLP) tasks. It can assist with tasks such as text classification, sentiment analysis, language translation, question answering, and much more.

## Last Update

* Nov 19 - The model now works better with system prompting; you can achieve better results by giving it more detailed system prompts.
* Nov 20 - Fixed responses to questions.

## EasyDel

The model was fine-tuned on a custom version of UltraChat on a TPU-v4 Pod using [EasyDel](https://github.com/erfanzar/EasyDeL).
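
Assuming the weights are published in a Hugging Face Transformers-compatible format (this README does not state it), a minimal loading sketch could look like the following. The repository id and generation settings are placeholders, not details taken from this model card.

```python
# Minimal usage sketch; assumes a Transformers-compatible causal LM export.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "erfanzar/LinguaMatic"  # hypothetical repository id, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# Build a llama2-style prompt (see the Prompting Method section below).
prompt = "<s>[INST] <<SYS>>\nYou are a helpful assistant.\n<</SYS>>\n\nHello! [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```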

## Prompting Method

LinguaMatic uses the llama2 prompting format to generate responses. This format keeps the system prompt and the previous conversation turns in a single template, which helps the model hold coherent multi-turn conversations. The `prompt_model` function below demonstrates how the format is assembled:

```python
def prompt_model(message: str, chat_history, system_prompt: str) -> str:
    # chat_history is a list of (user_input, response) pairs from earlier turns.
    do_strip = False
    # Start with the system prompt wrapped in llama2 <<SYS>> tags.
    texts = [f'<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n']
    for user_input, response in chat_history:
        # Keep the very first user turn verbatim; strip whitespace from later turns.
        user_input = user_input.strip() if do_strip else user_input
        do_strip = True
        texts.append(f'{user_input} [/INST] {response.strip()} </s><s>[INST] ')
    message = message.strip() if do_strip else message
    texts.append(f'{message} [/INST]')
    return ''.join(texts)
```

The `prompt_model` function takes a `message` as input, along with the `chat_history` and `system_prompt`. It builds a single formatted string containing the system prompt, the previous user/assistant turns, and the current message, which allows LinguaMatic to maintain context and give more coherent, context-aware responses.
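
For example, a call with one previous exchange (the strings below are purely illustrative) produces a prompt like this:

```python
history = [("Hi, who are you?", "I am LinguaMatic, an AI assistant.")]
prompt = prompt_model(
    message="Can you summarize this article for me?",
    chat_history=history,
    system_prompt="You are a helpful, concise assistant.",
)
print(prompt)
# <s>[INST] <<SYS>>
# You are a helpful, concise assistant.
# <</SYS>>
#
# Hi, who are you? [/INST] I am LinguaMatic, an AI assistant. </s><s>[INST] Can you summarize this article for me? [/INST]
```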

## Contributing

We welcome contributions that enhance LinguaMatic's capabilities and improve its performance. If you encounter any issues or have suggestions for improvement, please feel free to submit a pull request or open an issue on the [EasyDel](https://github.com/erfanzar/EasyDeL) GitHub repository.