Commit 5745e4b by ikrysinska (parent 06a229c): Update README.md
encoded_input = tokenizer(tweet, conspiracy_theory, return_tensors="pt")
logits = model(encoded_input.input_ids, encoded_input.attention_mask).logits
support_likelihood = logits.softmax(dim=1)[0].tolist()[0]  # 0.93198
```
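The `softmax` call above maps the model's two raw logits to class probabilities, with index 0 playing the role of the "support" score. A minimal, self-contained sketch of just that step (the logit values below are made up for illustration and are not outputs of this model):

```python
import torch

# Made-up logits for a two-class head; index 0 stands in for "support"
logits = torch.tensor([[2.6, 0.0]])

# Same post-processing as in the snippet above
probs = logits.softmax(dim=1)[0].tolist()
support_likelihood = probs[0]

# The two probabilities always sum to 1; the larger logit gets the
# larger probability
```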

Loading the model shouldn't take more than 10 minutes, depending on your Internet connection.

## Training Details

The model was finetuned with the webimmunization/COVID-19-CT-tweets-classification dataset.

### Training Procedure

The adapter was trained for 5 epochs with a batch size of 16.

### System requirements

We used Python 3.10, PyTorch 2.0.1, and transformers 4.27.0.
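A quick way to compare an environment against those pins (a sketch; the version strings are taken from the line above, and the check itself is optional):

```python
from importlib.metadata import version, PackageNotFoundError

# Version pins stated in this README
pins = {"torch": "2.0.1", "transformers": "4.27.0"}

# Report what the current environment actually has installed
for pkg, want in pins.items():
    try:
        have = version(pkg)
    except PackageNotFoundError:
        have = "not installed"
    print(f"{pkg}: want {want}, have {have}")
```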

#### Preprocessing