Ngit committed
Commit 2e2ce7e
1 Parent(s): 4c401bf

Update README.md

Files changed (1): README.md +3 -3

README.md CHANGED
@@ -6,14 +6,14 @@ language:
 # Text Classification GoEmotions
 
 This model is a fine-tuned version of [nreimers/MiniLMv2-L6-H384-distilled-from-BERT-Large](https://huggingface.co/nreimers/MiniLMv2-L6-H384-distilled-from-BERT-Large) on the [Jigsaw 1st Kaggle competition](https://www.kaggle.com/competitions/jigsaw-toxic-comment-classification-challenge) dataset, using [unitary/toxic-bert](https://huggingface.co/unitary/toxic-bert) as the teacher model.
-The quantized version in ONNX format can be found [here](minuva/MiniLMv2-toxic-jijgsaw-onnx). The model with two labels only (toxicity and severe toxicity) is [here](minuva/MiniLMv2-toxic-jijgsaw-lite-onnx).
+The quantized version in ONNX format can be found [here](minuva/MiniLMv2-toxic-jigsaw-onnx). The model with two labels only (toxicity and severe toxicity) is [here](minuva/MiniLMv2-toxic-jigsaw-lite-onnx).
 
 # Load the Model
 
 ```py
 from transformers import pipeline
 
-pipe = pipeline(model='minuva/MiniLMv2-toxic-jijgsaw', task='text-classification')
+pipe = pipeline(model='minuva/MiniLMv2-toxic-jigsaw', task='text-classification')
 pipe("This is pure trash")
 # [{'label': 'toxic', 'score': 0.9383478164672852}]
 ```
@@ -34,7 +34,7 @@ The following hyperparameters were used during training:
 
 | Teacher (params)          | Student (params)             | Set (metric)   | Score (teacher) | Score (student) |
 |---------------------------|------------------------------|----------------|-----------------|-----------------|
-| unitary/toxic-bert (110M) | MiniLMv2-toxic-jijgsaw (23M) | Test (ROC_AUC) | 0.98636         | 0.98600         |
+| unitary/toxic-bert (110M) | MiniLMv2-toxic-jigsaw (23M)  | Test (ROC_AUC) | 0.98636         | 0.98600         |
 
 # Deployment
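The ROC_AUC column in the table above is a ranking metric: the probability that the model scores a randomly chosen toxic example above a randomly chosen non-toxic one. As a minimal sketch of how such a score could be computed from labels and model outputs (this is not the repository's evaluation code, and the labels/scores below are made up for illustration):

```python
def roc_auc(labels, scores):
    """Rank-based ROC AUC: probability that a random positive example
    is scored above a random negative one, counting ties as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example with invented labels and scores (not Jigsaw data):
labels = [1, 1, 0, 0, 1]
scores = [0.9, 0.8, 0.3, 0.4, 0.35]
print(roc_auc(labels, scores))
```

A perfect ranker (every toxic comment scored above every non-toxic one) yields 1.0, which is why the teacher's 0.98636 and the student's 0.98600 are so close in practical terms despite the ~5x parameter gap.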