Sentence Transformers - Cross-Encoders

AI & ML interests

This organization hosts the cross-encoders from the Sentence Transformers package. More details at https://www.sbert.net/docs/pretrained_cross-encoders.html

Recent Activity


tomaarsen
posted an update 12 days ago
That didn't take long! Nomic AI has finetuned the new ModernBERT-base encoder model into a strong embedding model for search, classification, clustering and more!

Details:
🤖 Based on ModernBERT-base with 149M parameters.
📊 Outperforms both nomic-embed-text-v1 and nomic-embed-text-v1.5 on MTEB!
🏎️ Immediate FA2 and unpacking support for super efficient inference.
🪆 Trained with Matryoshka support, i.e. 2 valid output dimensionalities: 768 and 256.
➡️ Maximum sequence length of 8192 tokens!
2️⃣ Trained in 2 stages: unsupervised contrastive data -> high quality labeled datasets.
➕ Integrated in Sentence Transformers, Transformers, LangChain, LlamaIndex, Haystack, etc.
🏛️ Apache 2.0 licensed: fully commercially permissible

Try it out here: nomic-ai/modernbert-embed-base

Very nice work by Zach Nussbaum and colleagues at Nomic AI.