---
language: en
license: apache-2.0
tags:
- llama
- transformer
- 8b
- 4bit
- instruction-tuning
- conversational
pipeline_tag: text-generation
inference: false
model_creator: 0xroyce
model_type: LLaMA
---

# 0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit

0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit is a fine-tuned version of Meta-Llama-3.1-8B-Instruct, optimized for tasks related to finance, economics, trading, psychology, and social engineering. The model retains the LLaMA transformer architecture and uses 4-bit quantization (bitsandbytes) so that it can run in resource-constrained environments while remaining accurate and relevant for natural language processing tasks in these domains.

## Model Details

- **Model Type**: LLaMA
- **Model Size**: 8 billion parameters
- **Quantization**: 4-bit (bnb / bitsandbytes); a configuration sketch follows this list
- **Architecture**: Transformer-based
- **Creator**: [0xroyce](https://huggingface.co/0xroyce)
- **License**: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
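
The 4-bit quantization above corresponds to the `BitsAndBytesConfig` interface that transformers uses for bitsandbytes checkpoints. The snippet below is a minimal sketch of how such a configuration is typically expressed; the specific values (NF4 quant type, bfloat16 compute dtype, double quantization) are common defaults and assumptions rather than a statement of how this checkpoint was produced, and the pre-quantized repository can usually be loaded without passing an explicit config at all.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Illustrative 4-bit settings; the exact parameters baked into this checkpoint may differ.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

model = AutoModelForCausalLM.from_pretrained(
    "0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit",
    quantization_config=bnb_config,
    device_map="auto",  # requires the accelerate package and a CUDA-capable GPU
)
```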

## Training

0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit was fine-tuned on the [**"Financial, Economic, and Psychological Analysis Texts"** dataset](https://huggingface.co/datasets/0xroyce/Plutus), a collection of 85 influential books out of a planned 398. This dataset covers key areas such as:

- **Finance and Investment**: Including stock market analysis, value investing, and exchange-traded funds (ETFs).
- **Trading Strategies**: Focused on technical analysis, options trading, and algorithmic trading methods.
- **Risk Management**: Featuring quantitative approaches to financial risk management and volatility analysis.
- **Behavioral Finance and Psychology**: Exploring the psychological aspects of trading, persuasion, and psychological operations.
- **Social Engineering and Security**: Highlighting manipulation techniques and cybersecurity threats.

Because the dataset contained only 21.36% of its planned content at the time of training, this version of the model is sometimes referred to as the "21% version". Fine-tuning on this corpus improves the model's ability to generate coherent, contextually relevant text in domains such as financial analysis, economic theory, and trading strategies, while the 4-bit quantization lets the model run in environments with limited computational resources with only a modest impact on output quality.
41
+
42
+ ## Intended Use
43
+
44
+ This model is well-suited for a variety of natural language processing tasks within the finance, economics, psychology, and cybersecurity domains, including but not limited to:
45
+
46
+ - **Financial Analysis**: Extracting insights and performing sentiment analysis on financial texts.
47
+ - **Economic Modeling**: Generating contextually relevant economic theories and market predictions.
48
+ - **Behavioral Finance Research**: Analyzing and generating text related to trading psychology and investor behavior.
49
+ - **Cybersecurity and Social Engineering**: Studying manipulation techniques and generating security-related content.

## Performance

Specific benchmark scores for 0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit have not been published, but the model is designed to offer competitive performance within its parameter range, particularly for tasks involving financial, economic, and security-related data. The 4-bit quantization balances model size against computational efficiency, making the model well suited to deployment in resource-limited settings.

## Limitations

Despite its strengths, the 0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit model has some limitations:

- **Domain-Specific Biases**: The model may generate biased content depending on the input, especially within specialized financial, psychological, or cybersecurity domains.
- **Inference Speed**: Although optimized with 4-bit quantization, latency may still be an issue for real-time applications depending on the deployment environment.
- **Context Length**: The model has a limited context window, which can affect its ability to process long-form documents or complex multi-turn conversations effectively (a token-count sketch follows this list).
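
When long documents are involved, it can help to check how many tokens an input will consume before sending it to the model. The helper below is a hypothetical convenience function, not part of this repository; it reads the configured position limit (`max_position_embeddings`) from the model config, which should be verified against this checkpoint.

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
config = AutoConfig.from_pretrained(model_id)

def fits_in_context(text: str, reserve_for_output: int = 512) -> bool:
    """Hypothetical helper: True if `text` plus a reserved output budget
    fits inside the model's configured position limit."""
    n_tokens = len(tokenizer(text).input_ids)
    return n_tokens + reserve_for_output <= config.max_position_embeddings

print(fits_in_context("Your long financial report here"))
```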

## How to Use

You can load and use the model with the following code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The checkpoint is already quantized with bitsandbytes, so loading it requires
# the bitsandbytes and accelerate packages plus a CUDA-capable GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

input_text = "Your text here"
# Move the prompt tokens to the same device the model was placed on.
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to(model.device)

# max_new_tokens bounds only the generated continuation, not prompt + output.
output = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
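
Because this is an Instruct checkpoint, prompts generally work better when wrapped in the chat template shipped with the tokenizer. The sketch below reuses the `model` and `tokenizer` objects from the snippet above and assumes the tokenizer defines a chat template, as Llama 3.1 Instruct tokenizers normally do; the finance question is purely illustrative.

```python
# Wrap a finance question in the chat template before generating.
messages = [
    {"role": "system", "content": "You are a financial analysis assistant."},
    {"role": "user", "content": "Summarise the main risks of a leveraged ETF position."},
]

input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```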

## Ethical Considerations

The 0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit model, like other large language models, can generate biased or potentially harmful content. Users are advised to implement content filtering and moderation when deploying this model in public-facing applications. Further fine-tuning is also encouraged to align the model with specific ethical guidelines or domain-specific requirements.

## Citation

If you use this model in your research or applications, please cite it as follows:

```bibtex
@misc{0xroyce2024plutus,
  author = {0xroyce},
  title = {Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit},
  year = {2024},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit}},
}
```

## Acknowledgements

Special thanks to the open-source community and contributors who made this model possible.