It was trained on 15K Tweets that mentioned at least one of 699 brands.

Because this is a multi-label classification problem, we use binary cross-entropy (BCE) with logits loss for the fine-tuning. We basically combine a sigmoid layer with BCELoss in a single class.
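For illustration, here is a minimal plain-Python sketch of what BCE-with-logits computes (PyTorch's `torch.nn.BCEWithLogitsLoss` fuses the two steps in a numerically stable way). The logits and targets below are made-up values, not outputs of this model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce_with_logits(logits, targets):
    """Mean binary cross-entropy over all labels, computed from raw logits."""
    total = 0.0
    for z, y in zip(logits, targets):
        p = sigmoid(z)  # the sigmoid layer
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))  # BCELoss term
    return total / len(logits)

# One tweet, three labels: hypothetical raw model outputs vs. ground truth
logits = [2.0, -1.5, 0.3]
targets = [1.0, 0.0, 1.0]
print(round(bce_with_logits(logits, targets), 4))  # → 0.2942
```

Fusing the sigmoid into the loss avoids the overflow/underflow that computing `log(sigmoid(z))` in two separate steps can cause for large-magnitude logits.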
To obtain the probabilities for each label (i.e., marketing mix variable), you need to "push" the predictions through a sigmoid function. This is already done in the accompanying Python notebook.
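As a sketch of that post-processing step (the label names and logits here are purely illustrative assumptions, not this model's actual outputs): apply a sigmoid per label, then keep every label whose probability clears an independent 0.5 threshold:

```python
import math

# Hypothetical logits for one tweet over four illustrative marketing-mix labels
labels = ["product", "price", "place", "promotion"]
logits = [1.2, -0.4, -2.1, 0.9]

# Sigmoid turns each raw logit into an independent probability in (0, 1)
probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]

# Multi-label: each label is decided on its own, so several can fire at once
predicted = [lab for lab, p in zip(labels, probs) if p >= 0.5]
print(predicted)  # → ['product', 'promotion']
```

Unlike a softmax, the sigmoid probabilities do not sum to one, which is exactly what a multi-label problem needs: one tweet can mention both product and promotion.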
***IMPORTANT*** At the time of writing this description, Hugging Face's `pipeline` did not support multi-label classifiers.

### Working Paper

Download the working paper from SSRN: ["Creating Synthetic Experts with Generative AI"](https://papers.ssrn.com/abstract_id=4542949)