
albert-small-kor-cross-encoder-v1

  • A cross-encoder fine-tuned from the albert-small-kor-v1 model.
  • This model was trained using the SentenceTransformers Cross-Encoder class.

Training

  • Trained as sts(10)-nli(3)-sts(10)-nli(3)-sts(10), alternating STS and NLI phases with the epoch counts in parentheses (no distillation training).

  • STS: seed=111, epoch=10, lr=1e-4, eps=1e-6, warm_step=10%, max_seq_len=128, train_batch=128 (32 for the small model) (albert 13m/7G). Training code (see the sketch after this list).

  • NLI training: seed=111, epoch=3, lr=3e-5, eps=1e-8, warm_step=10%, max_seq_len=128, train_batch=64, eval_batch=64 (albert 2h/7G). Training code.

  • Evaluation code, test code
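
A minimal sketch of one STS fine-tuning phase with the hyperparameters above, using toy data in place of the real KorSTS training split (the training code referenced above is authoritative):

import torch
from torch.utils.data import DataLoader
from sentence_transformers import CrossEncoder, InputExample

torch.manual_seed(111)  # seed=111

# Toy stand-in for the KorSTS training split (labels normalized to [0, 1])
train_samples = [
    InputExample(texts=['์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋‹ค', '์˜ค๋Š˜ ๋“ฑ์‚ฐ์„ ํ•œ๋‹ค'], label=0.4),
    InputExample(texts=['์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ํ๋ฆฌ๋‹ค', '์˜ค๋Š˜ ๋น„๊ฐ€ ๋‚ด๋ฆฐ๋‹ค'], label=0.7),
]
train_dataloader = DataLoader(train_samples, shuffle=True, batch_size=32)  # train_batch=32 for the small model

model = CrossEncoder('bongsoo/albert-small-kor-v1', num_labels=1, max_length=128)  # max_seq_len=128
num_epochs = 10  # epoch=10
warmup_steps = int(len(train_dataloader) * num_epochs * 0.1)  # warm_step=10%

model.fit(train_dataloader=train_dataloader,
          epochs=num_epochs,
          warmup_steps=warmup_steps,
          optimizer_params={'lr': 1e-4, 'eps': 1e-6},  # lr=1e-4, eps=1e-6
          output_path='albert-small-kor-cross-encoder-v1')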

  • ๋ชจ๋ธ korsts klue-sts glue(stsb) stsb_multi_mt(en)
    albert-small-kor-cross-encoder-v1 0.8455 0.8526 0.8513 0.7976
    klue-cross-encoder-v1 0.8262 0.8833 0.8512 0.7889
    kpf-cross-encoder-v1 0.8799 0.9133 0.8626 0.8027
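
The numbers above are correlations between model scores and gold similarity labels on each test set; a minimal sketch of that computation, assuming Spearman correlation (the card does not state which coefficient is used) and using toy data in place of the real test splits:

from scipy.stats import spearmanr
from sentence_transformers import CrossEncoder

# Toy stand-in for an STS test split (gold scores normalized to [0, 1])
pairs = [('์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋‹ค', '์˜ค๋Š˜ ๋“ฑ์‚ฐ์„ ํ•œ๋‹ค'),
         ('์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ํ๋ฆฌ๋‹ค', '์˜ค๋Š˜ ๋น„๊ฐ€ ๋‚ด๋ฆฐ๋‹ค'),
         ('๊ณ ์–‘์ด๊ฐ€ ์ž”๋‹ค', '๊ฐ•์•„์ง€๊ฐ€ ๋‹ฌ๋ฆฐ๋‹ค')]
gold = [0.4, 0.7, 0.2]

model = CrossEncoder('bongsoo/albert-small-kor-cross-encoder-v1')
pred = model.predict(pairs)

# Rank correlation between model scores and human judgments
print(spearmanr(pred, gold).correlation)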

Usage and Performance

Pre-trained models can be used like this:

from sentence_transformers import CrossEncoder

model = CrossEncoder('bongsoo/albert-small-kor-cross-encoder-v1')
scores = model.predict([('์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋‹ค', '์˜ค๋Š˜ ๋“ฑ์‚ฐ์„ ํ•œ๋‹ค'),
                        ('์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ํ๋ฆฌ๋‹ค', '์˜ค๋Š˜ ๋น„๊ฐ€ ๋‚ด๋ฆฐ๋‹ค')])
print(scores)
# [0.45417202 0.6294121 ]

The model predicts one similarity score for each sentence pair passed to predict; higher scores indicate more similar sentences.

You can also use this model without sentence_transformers, by loading it with the Transformers AutoModel class.
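
A minimal sketch of that usage; AutoModelForSequenceClassification stands in for the generic AutoModel here because the cross-encoder carries a single-logit classification head, and the final sigmoid mirrors CrossEncoder's default activation for single-logit models:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('bongsoo/albert-small-kor-cross-encoder-v1')
model = AutoModelForSequenceClassification.from_pretrained('bongsoo/albert-small-kor-cross-encoder-v1')
model.eval()

# Encode each sentence pair together, as the cross-encoder expects
features = tokenizer(['์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋‹ค', '์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ํ๋ฆฌ๋‹ค'],
                     ['์˜ค๋Š˜ ๋“ฑ์‚ฐ์„ ํ•œ๋‹ค', '์˜ค๋Š˜ ๋น„๊ฐ€ ๋‚ด๋ฆฐ๋‹ค'],
                     padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    logits = model(**features).logits
    scores = torch.sigmoid(logits).squeeze(-1)  # matches CrossEncoder.predict output
print(scores)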
