---
language: 
- en
thumbnail: https://avatars3.githubusercontent.com/u/32437151?s=460&u=4ec59abc8d21d5feea3dab323d23a5860e6996a4&v=4
tags:
- text-classification
- emotion
- pytorch
- en
- distil-bert
license: apache-2.0
datasets:
- emotion
metrics:
- accuracy
- f1
---
# Distilbert-base-uncased-emotion

## Model description:
`distilbert-base-uncased` fine-tuned on the emotion dataset using the Hugging Face `Trainer` with the following hyperparameters:
```
learning_rate = 2e-5
batch_size = 64
num_train_epochs = 8
```

## How to Use the model:
```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis", model="bhadresh-savani/distilbert-base-uncased-emotion")
prediction = classifier("I love using transformers. The best part is the wide range of support and its ease of use.")
print(prediction)
```
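The pipeline returns a list of `{'label', 'score'}` dicts. A small sketch of picking the top emotion from such an output; the scores below are made-up illustration values, not real model output:

```python
# Illustrative output shape only; actual scores come from the model.
# When the pipeline is created with return_all_scores=True, it yields one
# list of label/score dicts per input text instead of just the top label.
all_scores = [[
    {"label": "sadness", "score": 0.001},
    {"label": "joy", "score": 0.990},
    {"label": "love", "score": 0.004},
    {"label": "anger", "score": 0.002},
    {"label": "fear", "score": 0.002},
    {"label": "surprise", "score": 0.001},
]]

# Pick the highest-scoring emotion for the first (and only) input.
top = max(all_scores[0], key=lambda s: s["score"])
print(f"{top['label']}: {top['score']:.3f}")  # → joy: 0.990
```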

## Dataset:
The [emotion dataset](https://huggingface.co./nlp/viewer/?dataset=emotion) of English Twitter messages labeled with six basic emotions: sadness, joy, love, anger, fear, and surprise.

## Training procedure
[Colab Notebook](https://github.com/bhadreshpsavani/ExploringSentimentalAnalysis/blob/main/SentimentalAnalysisWithDistilbert.ipynb)

## Eval results
```
{
 'test_accuracy': 0.938,
 'test_f1': 0.937932884041714,
 'test_loss': 0.1472451239824295,
 'test_mem_cpu_alloc_delta': 0,
 'test_mem_cpu_peaked_delta': 0,
 'test_mem_gpu_alloc_delta': 0,
 'test_mem_gpu_peaked_delta': 163454464,
 'test_runtime': 5.0164,
 'test_samples_per_second': 398.69
}
```