gabrielmotablima committed
Commit • 3309256
1 Parent(s): 396189d
update readme
README.md
CHANGED
@@ -63,7 +63,7 @@ print(generated_text)
 The evaluation metrics Cider-D, BLEU@4, ROUGE-L, METEOR and BERTScore are abbreviated as C, B@4, RL, M and BS, respectively.
 
 |Model|Training|Evaluation|C|B@4|RL|M|BS|
-
+|-----|-------:|---------:|------:|-----:|------:|-----:|--------:|
 |Swin-DistilBERTimbau|Flickr30K Portuguese|Flickr30K Portuguese|66.73|24.65|39.98|44.71|72.30|
 |Swin-GPorTuguese|Flickr30K Portuguese|Flickr30K Portuguese|64.71|23.15|39.39|44.36|71.70|
 
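The added line is the header separator row that Markdown tables require between the header and the data rows; without it the rows above render as plain text, and the trailing colons right-align the numeric metric columns. Reconstructed from the context lines of the diff above, the updated section of README.md should read roughly as follows:

```markdown
The evaluation metrics Cider-D, BLEU@4, ROUGE-L, METEOR and BERTScore are abbreviated as C, B@4, RL, M and BS, respectively.

|Model|Training|Evaluation|C|B@4|RL|M|BS|
|-----|-------:|---------:|------:|-----:|------:|-----:|--------:|
|Swin-DistilBERTimbau|Flickr30K Portuguese|Flickr30K Portuguese|66.73|24.65|39.98|44.71|72.30|
|Swin-GPorTuguese|Flickr30K Portuguese|Flickr30K Portuguese|64.71|23.15|39.39|44.36|71.70|
```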