Update README.md
README.md CHANGED
@@ -9,10 +9,11 @@ base_model: sentence-transformers/all-MiniLM-L6-v2
 metrics:
 - accuracy
 widget:
-- text:
+- text: >-
+    Xác suất để trúng giải thưởng khi bạn mua một tờ vé số là 0.05%. Giả sử mỗi
     ngày bạn mua 1 tờ vé số, vậy
 
-    chúng ta cần bao nhiêu ngày (trung bình) để có 98% cơ hội trúng?
+    chúng ta cần bao nhiêu ngày (trung bình) để có 98% cơ hội trúng?
 - text: Briefly describe the concept of photosynthesis.
 - text: What are the benefits of using cloud storage?
 - text: Write a Python function that checks if a given number is prime.
@@ -32,10 +33,22 @@ model-index:
 - type: accuracy
   value: 0.25
   name: Accuracy
+license: mit
+datasets:
+- chibao24/gpt_routing
+language:
+- vi
+- en
 ---
 
 # SetFit with sentence-transformers/all-MiniLM-L6-v2
 
+This model routes prompts between gpt.5 and gpt-4o to reduce cost. Take a look at the dataset for more information.
+I got the idea from this [LLM classifier](https://github.com/lamini-ai/llm-classifier).
+
+The model uses few-shot learning via SetFit, requiring only 8 examples per class, and trains in under a minute on an RTX 3060 GPU.
+This makes it an efficient way to build lightweight models for real-world applications.
+
 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
 
 The model has been trained using an efficient few-shot learning technique that involves:
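The card above describes an embed-then-classify router: a sentence-embedding body plus a LogisticRegression head, trained from a handful of labeled prompts per route. As a rough, dependency-free sketch of that idea — not the actual SetFit/sentence-transformers pipeline, with a toy bag-of-words "embedding" and a nearest-centroid rule standing in for the real embedding model and classifier head — routing could look like:

```python
# Toy sketch of embed-then-classify prompt routing. A real deployment would
# embed with sentence-transformers and classify with a trained
# LogisticRegression head, as the model card describes; everything below is
# a simplified stand-in. Route names and example prompts are illustrative.
from collections import Counter
import math


def embed(text: str) -> Counter:
    """Very crude stand-in for a sentence embedding: bag of lowercased words."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def centroid(examples):
    """Sum the embeddings of a few labeled examples into one class vector."""
    total = Counter()
    for e in examples:
        total.update(embed(e))
    return total


# A handful of labeled prompts per class, in the spirit of SetFit's
# few-shot setup (the card mentions 8 examples per class).
ROUTES = {
    "cheap_model": centroid([
        "What are the benefits of using cloud storage?",
        "Briefly describe the concept of photosynthesis.",
    ]),
    "expensive_model": centroid([
        "Write a Python function that checks if a given number is prime.",
        "Prove that the sum of two odd numbers is even.",
    ]),
}


def route(prompt: str) -> str:
    """Send the prompt to whichever class centroid it is most similar to."""
    emb = embed(prompt)
    return max(ROUTES, key=lambda label: cosine(emb, ROUTES[label]))
```

The point of the sketch is the shape of the decision, not the classifier: cheap embeddings plus a small head make routing fast enough to sit in front of every request, which is what lets it cut cost.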