
# Dense Encoder - DistilBERT - Frozen Token Embeddings

This model is a distilbert-base-uncased model trained for 30 epochs (235k steps) with a batch size of 64, using MarginMSE loss on the MS MARCO dataset.
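For reference, a minimal sketch of the MarginMSE objective as it is commonly used for MS MARCO distillation: the mean squared error between the student's score margin and a teacher's score margin over (query, positive, negative) triples. The exact teacher model and scoring function are not specified in this card, so treat this as illustrative only.

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(student_pos: torch.Tensor,
                    student_neg: torch.Tensor,
                    teacher_pos: torch.Tensor,
                    teacher_neg: torch.Tensor) -> torch.Tensor:
    # MarginMSE: match the student's margin (pos - neg) to the teacher's margin.
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return F.mse_loss(student_margin, teacher_margin)
```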

The token embeddings were frozen.
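A minimal sketch of how the token embeddings of a distilbert-base-uncased model could be frozen with the transformers library; whether position embeddings or other layers were also frozen is not stated in this card, so this only freezes the word-embedding matrix.

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("distilbert-base-uncased")

# Freeze the token (word) embedding matrix so it receives no gradient updates.
model.embeddings.word_embeddings.weight.requires_grad = False

# All other parameters remain trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable}")
```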

| Dataset | Model with updated token embeddings | Model with frozen embeddings |
|---|---|---|
| TREC-DL 19 | 70.68 | 68.60 |
| TREC-DL 20 | 67.69 | 70.21 |
| FiQA | 28.89 | 28.60 |
| Robust04 | 39.56 | 39.08 |
| TREC-COVID v2 | 69.80 | 69.84 |
| TREC-NEWS | 37.97 | 38.27 |
| Avg. 4 BEIR tasks | 44.06 | 43.95 |
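Below is a minimal sketch of how a dense encoder like this can be used for retrieval. The repository ID (`model_name`), the mean-pooling strategy, and dot-product scoring are assumptions, since the card does not specify how embeddings should be pooled or scored.

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "distilbert-base-uncased"  # replace with this repository's model ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

def encode(texts):
    # Mean pooling over token representations (pooling strategy is an assumption).
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state
    mask = batch["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

query_emb = encode(["what is a dense retriever?"])
doc_emb = encode(["Dense retrievers encode queries and documents into vectors."])
scores = query_emb @ doc_emb.T  # dot-product relevance scores
print(scores)
```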