VishnuPJ committed
Commit: b570717
1 Parent(s): 715074a

Update README.md

Files changed (1): README.md (+4 -1)
README.md CHANGED
@@ -12,6 +12,7 @@ pipeline_tag: text-generation
 # **MalayaLLM_7B_Base**
 
 This is an attempt to construct a Language Model (LLM) focused on **generative AI for Malayalam language**. While several LLMs are proficient in supporting multiple languages, including Malayalam, enhancing their performance for specific tasks such as content generation and question answering specifically in Malayalam can be achieved through dedicated training on a Malayalam dataset. In pursuit of this, I've undertaken the **continuous pre-training of the LLAMA2 model using a comprehensive Malayalam dataset**.
+
 The model is currently in its early stages, and ongoing training and fine-tuning with a more comprehensive dataset are necessary to enhance its performance. I will consistently provide updated revisions to the model.
 # **Github Repo**:
 For comprehensive insights into model training, fine-tuning, and other advanced techniques, refer to the MalayaLLM GitHub repository at the following link:
@@ -43,4 +44,6 @@ The MalayaLLM models have been improved and customized to incorporate a comprehe
 | Model | Format | Bits | Download Links |
 |--------------------------|--------|----------------------|------------------------------------------------------------------------------|
 | Malayalam LLaMA 7B Instruct #v0.1 | GGUF | Q8_0 | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.1_GGUF) |
-| Malayalam LLaMA 7B Instruct #v0.2 | GGUF | Q8_0 | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.2_GGUF) |
+| Malayalam LLaMA 7B Instruct #v0.2 | GGUF | Q8_0 | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.2_GGUF) |
+
+# 🌟Happy coding💻🌟
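
For reference, the GGUF checkpoints listed in the table above can be run locally with llama.cpp-compatible tooling; Q8_0 is an 8-bit llama.cpp quantization, so CPU-only inference works. Below is a minimal sketch assuming the `huggingface_hub` and `llama-cpp-python` packages; the `.gguf` filename inside the repository is a hypothetical placeholder, so check the repo's file listing for the actual name.

```python
# Minimal sketch: download one of the GGUF checkpoints from the table above
# and run a short Malayalam generation with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

repo_id = "VishnuPJ/MalayaLLM_7B_Instruct_v0.2_GGUF"
gguf_file = "malayallm_7b_instruct_v0.2.Q8_0.gguf"  # hypothetical filename; verify in the repo

# Fetch the quantized model file into the local Hugging Face cache.
model_path = hf_hub_download(repo_id=repo_id, filename=gguf_file)

# Load the Q8_0 model on CPU; n_ctx sets the context window.
llm = Llama(model_path=model_path, n_ctx=2048)

# Generate a short completion for a Malayalam prompt
# ("What is the capital of Kerala?").
output = llm("കേരളത്തിന്റെ തലസ്ഥാനം ഏതാണ്?", max_tokens=64)
print(output["choices"][0]["text"])
```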