Using hard negatives vs. (query, positive) pairs to train embedding models

by rasyosef

Does using (query, positive, negative) triplets to train embedding models lead to better performance compared to just using (query, positive) pairs with MultipleNegativesRankingLoss? If so, how significant is the improvement?

Hello!

Yes, using (query, positive, negative) triplets generally improves performance over just (query, positive) pairs with MultipleNegativesRankingLoss. The idea is that if you also include a negative, it becomes more difficult for the model to find the correct positive (i.e. the "answer") for a given query among all of the positives and negatives in the batch that serve as potential answers. The more difficult this task is (up to a point), the stronger the model becomes.
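
For concreteness, here is a minimal sketch (my own illustration, not part of the original reply) of how both formats plug into MultipleNegativesRankingLoss with the sentence-transformers trainer. The base model and the toy rows are placeholders; with pairs, only the other in-batch examples act as negatives, while triplets additionally supply an explicit negative per row.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder base model

# (anchor, positive) pairs: only the other in-batch examples act as negatives.
pairs = Dataset.from_dict({
    "anchor": ["How do I reset my password?", "What is the refund policy?"],
    "positive": [
        "Open Settings > Account > Reset password.",
        "Refunds are issued within 14 days of purchase.",
    ],
})

# (anchor, positive, negative) triplets: each row also carries an explicit
# (ideally hard) negative, on top of the in-batch negatives.
triplets = Dataset.from_dict({
    "anchor": ["How do I reset my password?", "What is the refund policy?"],
    "positive": [
        "Open Settings > Account > Reset password.",
        "Refunds are issued within 14 days of purchase.",
    ],
    "negative": [
        "Passwords must contain at least eight characters.",
        "Our office is closed on public holidays.",
    ],
})

loss = MultipleNegativesRankingLoss(model)

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=triplets,  # or `pairs`; the same loss handles both formats
    loss=loss,
)
trainer.train()
```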
Overall, the relative performance typically improves by about 1-4%, so the gain is not huge, but the best models do use "hard negatives", i.e. negatives that have been mined specifically to act as difficult potential answers.
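
As a rough sketch of what such mining looks like (again my own illustration, with made-up queries and corpus): embed the corpus, and for each query keep the highest-scoring document that is not its positive. In practice you would usually also filter out candidates that score too close to the positive, since they may be unlabeled true positives rather than hard negatives; recent sentence-transformers versions also ship a mine_hard_negatives utility that automates this.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder base model

queries = ["How do I reset my password?"]
positives = ["Open Settings > Account > Reset password."]
corpus = positives + [
    "Passwords must contain at least eight characters.",
    "Contact support if your account is locked.",
    "Our office is closed on public holidays.",
]

query_emb = model.encode(queries, convert_to_tensor=True)
corpus_emb = model.encode(corpus, convert_to_tensor=True)
scores = util.cos_sim(query_emb, corpus_emb)  # shape: (num_queries, corpus size)

triplets = []
for i, query in enumerate(queries):
    # Rank corpus documents by similarity to the query, skip the true
    # positive, and take the top remaining document as the hard negative.
    ranked = scores[i].argsort(descending=True).tolist()
    hard_negative = next(corpus[j] for j in ranked if corpus[j] != positives[i])
    triplets.append((query, positives[i], hard_negative))

print(triplets)
```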

  • Tom Aarsen
