Huertas97 committed on
Commit e102e15
1 Parent(s): 5b95be3

README header table results

Files changed (1):
  1. README.md +3 -2
README.md CHANGED
@@ -90,8 +90,9 @@ Here we compare the average multilingual semantic textual similairty capabilitie
 | mstsb-paraphrase-multilingual-mpnet-base-v2 | 0.835890 |
 | paraphrase-multilingual-mpnet-base-v2 | 0.818896 |
 
+\
 
-For the sake of readability tasks have been splitted into monolingual and cross-lingual tasks.
+The following tables breakdown the performance of `mstsb-paraphrase-multilingual-mpnet-base-v2` according to the different tasks. For the sake of readability tasks have been splitted into monolingual and cross-lingual tasks.
 
 | Monolingual Task | Pearson Cosine test | Spearman Cosine test |
 |------------------|---------------------|-----------------------|
@@ -112,7 +113,7 @@ For the sake of readability tasks have been splitted into monolingual and cross-
 | zh-CN;zh-CN | 0.826233678946644 | 0.8248515460782744 |
 | zh-TW;zh-TW | 0.8242683809675422 | 0.8235506799952028 |
 
-\\
+\
 
 | Cross-lingual Task | Pearson Cosine test | Spearman Cosine test |
 |--------------------|---------------------|-----------------------|
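The tables in this change report "Pearson Cosine" and "Spearman Cosine" scores, i.e. the correlation between the model's cosine similarities for sentence pairs and the gold STS labels. As a minimal, dependency-free sketch of how such metrics are computed (the sample numbers below are made up for illustration, not values from the tables):

```python
def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def pearson(x, y):
    """Pearson correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def ranks(x):
    """1-based ranks, averaging over ties."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Illustrative usage: cosine scores from a model vs. gold STS labels.
model_scores = [0.91, 0.42, 0.75, 0.10]   # hypothetical cosine similarities
gold_labels = [4.8, 2.1, 3.9, 0.5]        # hypothetical gold scores
print(pearson(model_scores, gold_labels))
print(spearman(model_scores, gold_labels))
```

In practice these metrics are computed by evaluation utilities (e.g. the `EmbeddingSimilarityEvaluator` in sentence-transformers) over embeddings produced by the model, but the underlying arithmetic is the above.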