SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model for text classification. It uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model was trained with an efficient few-shot learning technique that involves two steps (a minimal training sketch follows the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
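
A minimal sketch of these two steps with the setfit library (the dataset contents here are illustrative placeholders, not the real training data):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Start from the same Sentence Transformer body this model uses
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

# Placeholder few-shot examples; the actual training set is described under Training Details
train_dataset = Dataset.from_dict({
    "text": ["example praise", "example complaint", "unrelated chatter"],
    "label": ["peak", "pit", "neither"],
})

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=32, num_epochs=1),
    train_dataset=train_dataset,
)
# Step 1 (contrastive fine-tuning) and step 2 (fitting the head) both run here
trainer.train()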

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 3 (pit, peak, neither)
  • Model size: 109M parameters (F32 safetensors)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Each label below is followed by example texts from the training data:

neither
  • 'i asked brand to write it and then let it translate back. so in reality i have no clue what i am sending...'
  • "i saw someone summarize brand the other day; it doesn't give answers, it gives answer-shaped responses."
  • 'thank you comrade i mean colleague. i will have brand summarize.'
peak
  • 'brand!! it helped me finish my resume. i just asked it if it could write my resume based on horribly written descriptions i came up with. and it made it all pretty:)'
  • 'been building products for a bit now and your product (audio pen) is simple, useful and just works (like the early magic when product came out). congratulations and keep the flag flying high. not surprised that india is producing apps like yours. high time:-)'
  • 'just got access to personalization in brand!! totally unexpected. very happy'
pit
  • 'brand recently i came across a very unwell patient in a psychiatric unit who was using product & this was reinforcing his delusional state & detrimentally impacting his mental health. anyone looking into this type of usage of product? what safe guards are being put in place?'
  • 'brand product is def better at extracting numbers from images, product failed (pro version) twice...'
  • "the stuff brand gives is entirely too scripted and impractical, which is what i'm trying to avoid:/"

Evaluation

Metrics

Label: all
  • Accuracy: 0.964
  • F1 (per label): [0.8837209302325582, 0.9130434782608696, 0.9781021897810218]
  • Precision (per label): [1.0, 1.0, 0.9571428571428572]
  • Recall (per label): [0.7916666666666666, 0.84, 1.0]
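
The bracketed lists report one value per label, following scikit-learn's per-class convention. Below is a sketch of how such metrics can be computed; the label order and the y_true/y_pred values are illustrative assumptions:

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Placeholder gold and predicted labels for an evaluation set
y_true = ["pit", "peak", "neither", "neither"]
y_pred = ["pit", "peak", "neither", "peak"]

accuracy = accuracy_score(y_true, y_pred)
# average=None yields one precision/recall/F1 value per label
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=["neither", "peak", "pit"], average=None
)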

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("jamiehudson/725_model_v5")
# Run inference
preds = model("product the way it shows the sources is so fucking cool, this new ai is amazing")
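
The model also accepts a batch of texts, and the LogisticRegression head exposes class probabilities; a usage sketch (the input strings are just examples):

# Batch inference: one predicted label per input text
preds = model.predict([
    "just got access to personalization in brand!! totally unexpected. very happy",
    "the stuff brand gives is entirely too scripted and impractical",
])
# Per-class probabilities from the LogisticRegression head
probs = model.predict_proba(["product the way it shows the sources is so cool"])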

Training Details

Training Set Metrics

Training set word count:
  • Min: 3
  • Median: 31.6606
  • Max: 98

Training sample count per label:
  • pit: 277
  • peak: 265
  • neither: 1105

Training Hyperparameters

  • batch_size: (32, 32)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
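
For reference, these settings map onto setfit's TrainingArguments roughly as follows; this is a sketch, not the exact training script (distance_metric and eval_max_steps are left at their defaults here):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(32, 32),              # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    max_steps=-1,                     # -1 means no step cap
    sampling_strategy="oversampling",
    body_learning_rate=(2e-5, 1e-5),  # (contrastive fine-tuning, end-to-end phase)
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,                 # keep the body frozen while fitting the head
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=False,
)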

Training Results

Losses logged during training; a dash means no validation loss was recorded at that step.

Epoch Step Training Loss Validation Loss
0.0000 1 0.3157 -
0.0012 50 0.2756 -
0.0023 100 0.2613 -
0.0035 150 0.278 -
0.0047 200 0.2617 -
0.0058 250 0.214 -
0.0070 300 0.2192 -
0.0082 350 0.1914 -
0.0093 400 0.1246 -
0.0105 450 0.1343 -
0.0117 500 0.0937 -
0.0129 550 0.075 -
0.0140 600 0.0479 -
0.0152 650 0.0976 -
0.0164 700 0.0505 -
0.0175 750 0.0149 -
0.0187 800 0.0227 -
0.0199 850 0.0276 -
0.0210 900 0.0033 -
0.0222 950 0.0015 -
0.0234 1000 0.0008 -
0.0245 1050 0.0005 -
0.0257 1100 0.001 -
0.0269 1150 0.0009 -
0.0280 1200 0.0004 -
0.0292 1250 0.0007 -
0.0304 1300 0.001 -
0.0315 1350 0.0004 -
0.0327 1400 0.0005 -
0.0339 1450 0.0003 -
0.0350 1500 0.0004 -
0.0362 1550 0.0002 -
0.0374 1600 0.0004 -
0.0386 1650 0.0003 -
0.0397 1700 0.0003 -
0.0409 1750 0.0005 -
0.0421 1800 0.0004 -
0.0432 1850 0.0003 -
0.0444 1900 0.0002 -
0.0456 1950 0.0002 -
0.0467 2000 0.0003 -
0.0479 2050 0.0002 -
0.0491 2100 0.0001 -
0.0502 2150 0.0002 -
0.0514 2200 0.0256 -
0.0526 2250 0.0001 -
0.0537 2300 0.0124 -
0.0549 2350 0.0004 -
0.0561 2400 0.0125 -
0.0572 2450 0.0001 -
0.0584 2500 0.0002 -
0.0596 2550 0.0002 -
0.0607 2600 0.0001 -
0.0619 2650 0.0002 -
0.0631 2700 0.0002 -
0.0643 2750 0.0243 -
0.0654 2800 0.0001 -
0.0666 2850 0.0001 -
0.0678 2900 0.0001 -
0.0689 2950 0.0002 -
0.0701 3000 0.006 -
0.0713 3050 0.0021 -
0.0724 3100 0.0003 -
0.0736 3150 0.0003 -
0.0748 3200 0.0001 -
0.0759 3250 0.0 -
0.0771 3300 0.0002 -
0.0783 3350 0.0001 -
0.0794 3400 0.0 -
0.0806 3450 0.0124 -
0.0818 3500 0.0001 -
0.0829 3550 0.0001 -
0.0841 3600 0.0001 -
0.0853 3650 0.0 -
0.0864 3700 0.0042 -
0.0876 3750 0.0001 -
0.0888 3800 0.0004 -
0.0900 3850 0.0001 -
0.0911 3900 0.0 -
0.0923 3950 0.004 -
0.0935 4000 0.0002 -
0.0946 4050 0.0001 -
0.0958 4100 0.0001 -
0.0970 4150 0.0 -
0.0981 4200 0.0 -
0.0993 4250 0.0008 -
0.1005 4300 0.0 -
0.1016 4350 0.0 -
0.1028 4400 0.0 -
0.1040 4450 0.0 -
0.1051 4500 0.0 -
0.1063 4550 0.0 -
0.1075 4600 0.0 -
0.1086 4650 0.0 -
0.1098 4700 0.0 -
0.1110 4750 0.0 -
0.1121 4800 0.0 -
0.1133 4850 0.0 -
0.1145 4900 0.0 -
0.1157 4950 0.0 -
0.1168 5000 0.0 -
0.1180 5050 0.0 -
0.1192 5100 0.0 -
0.1203 5150 0.0008 -
0.1215 5200 0.001 -
0.1227 5250 0.0 -
0.1238 5300 0.0 -
0.1250 5350 0.0057 -
0.1262 5400 0.0014 -
0.1273 5450 0.0001 -
0.1285 5500 0.0001 -
0.1297 5550 0.0001 -
0.1308 5600 0.0001 -
0.1320 5650 0.0001 -
0.1332 5700 0.0 -
0.1343 5750 0.0 -
0.1355 5800 0.0004 -
0.1367 5850 0.0 -
0.1378 5900 0.0001 -
0.1390 5950 0.0 -
0.1402 6000 0.0 -
0.1414 6050 0.0 -
0.1425 6100 0.0 -
0.1437 6150 0.0 -
0.1449 6200 0.0 -
0.1460 6250 0.0 -
0.1472 6300 0.0 -
0.1484 6350 0.0 -
0.1495 6400 0.0 -
0.1507 6450 0.0 -
0.1519 6500 0.0 -
0.1530 6550 0.0 -
0.1542 6600 0.0 -
0.1554 6650 0.0 -
0.1565 6700 0.0 -
0.1577 6750 0.0 -
0.1589 6800 0.0 -
0.1600 6850 0.0 -
0.1612 6900 0.0 -
0.1624 6950 0.0 -
0.1635 7000 0.0 -
0.1647 7050 0.0 -
0.1659 7100 0.0 -
0.1671 7150 0.0 -
0.1682 7200 0.0 -
0.1694 7250 0.0 -
0.1706 7300 0.0 -
0.1717 7350 0.0 -
0.1729 7400 0.0 -
0.1741 7450 0.0 -
0.1752 7500 0.0 -
0.1764 7550 0.0 -
0.1776 7600 0.0 -
0.1787 7650 0.0 -
0.1799 7700 0.0 -
0.1811 7750 0.0 -
0.1822 7800 0.0 -
0.1834 7850 0.0 -
0.1846 7900 0.0 -
0.1857 7950 0.0 -
0.1869 8000 0.0 -
0.1881 8050 0.0 -
0.1892 8100 0.0 -
0.1904 8150 0.0 -
0.1916 8200 0.0 -
0.1928 8250 0.0 -
0.1939 8300 0.0 -
0.1951 8350 0.0 -
0.1963 8400 0.0127 -
0.1974 8450 0.0001 -
0.1986 8500 0.0 -
0.1998 8550 0.0 -
0.2009 8600 0.0249 -
0.2021 8650 0.0003 -
0.2033 8700 0.0 -
0.2044 8750 0.0003 -
0.2056 8800 0.0003 -
0.2068 8850 0.0002 -
0.2079 8900 0.0 -
0.2091 8950 0.0 -
0.2103 9000 0.0001 -
0.2114 9050 0.0 -
0.2126 9100 0.0 -
0.2138 9150 0.0 -
0.2149 9200 0.0 -
0.2161 9250 0.0 -
0.2173 9300 0.0 -
0.2185 9350 0.0 -
0.2196 9400 0.0 -
0.2208 9450 0.0 -
0.2220 9500 0.0 -
0.2231 9550 0.0 -
0.2243 9600 0.0 -
0.2255 9650 0.0 -
0.2266 9700 0.0 -
0.2278 9750 0.0 -
0.2290 9800 0.0 -
0.2301 9850 0.0 -
0.2313 9900 0.0 -
0.2325 9950 0.0 -
0.2336 10000 0.0 -
0.2348 10050 0.0 -
0.2360 10100 0.0 -
0.2371 10150 0.0 -
0.2383 10200 0.0 -
0.2395 10250 0.0 -
0.2406 10300 0.0 -
0.2418 10350 0.0 -
0.2430 10400 0.0 -
0.2442 10450 0.0 -
0.2453 10500 0.0 -
0.2465 10550 0.0 -
0.2477 10600 0.0 -
0.2488 10650 0.0 -
0.2500 10700 0.0 -
0.2512 10750 0.0 -
0.2523 10800 0.0 -
0.2535 10850 0.0 -
0.2547 10900 0.0 -
0.2558 10950 0.0 -
0.2570 11000 0.0 -
0.2582 11050 0.0 -
0.2593 11100 0.0 -
0.2605 11150 0.0 -
0.2617 11200 0.0 -
0.2628 11250 0.0 -
0.2640 11300 0.0 -
0.2652 11350 0.0 -
0.2663 11400 0.0 -
0.2675 11450 0.0 -
0.2687 11500 0.0 -
0.2699 11550 0.0 -
0.2710 11600 0.0 -
0.2722 11650 0.0 -
0.2734 11700 0.0 -
0.2745 11750 0.0 -
0.2757 11800 0.0 -
0.2769 11850 0.0 -
0.2780 11900 0.0 -
0.2792 11950 0.0 -
0.2804 12000 0.0 -
0.2815 12050 0.0 -
0.2827 12100 0.0 -
0.2839 12150 0.0 -
0.2850 12200 0.0 -
0.2862 12250 0.0 -
0.2874 12300 0.0 -
0.2885 12350 0.0 -
0.2897 12400 0.0 -
0.2909 12450 0.0 -
0.2920 12500 0.0 -
0.2932 12550 0.0 -
0.2944 12600 0.0 -
0.2956 12650 0.0 -
0.2967 12700 0.0 -
0.2979 12750 0.0 -
0.2991 12800 0.0 -
0.3002 12850 0.0 -
0.3014 12900 0.0 -
0.3026 12950 0.0 -
0.3037 13000 0.0 -
0.3049 13050 0.0 -
0.3061 13100 0.0 -
0.3072 13150 0.0 -
0.3084 13200 0.0 -
0.3096 13250 0.0 -
0.3107 13300 0.0 -
0.3119 13350 0.0 -
0.3131 13400 0.0 -
0.3142 13450 0.0 -
0.3154 13500 0.0 -
0.3166 13550 0.0 -
0.3177 13600 0.0 -
0.3189 13650 0.0 -
0.3201 13700 0.0 -
0.3213 13750 0.0 -
0.3224 13800 0.0 -
0.3236 13850 0.0 -
0.3248 13900 0.0 -
0.3259 13950 0.0 -
0.3271 14000 0.0 -
0.3283 14050 0.0 -
0.3294 14100 0.0 -
0.3306 14150 0.0 -
0.3318 14200 0.0 -
0.3329 14250 0.0 -
0.3341 14300 0.0 -
0.3353 14350 0.0 -
0.3364 14400 0.0 -
0.3376 14450 0.0 -
0.3388 14500 0.0 -
0.3399 14550 0.0 -
0.3411 14600 0.0 -
0.3423 14650 0.0 -
0.3434 14700 0.0 -
0.3446 14750 0.0 -
0.3458 14800 0.0 -
0.3470 14850 0.0 -
0.3481 14900 0.0 -
0.3493 14950 0.0 -
0.3505 15000 0.0 -
0.3516 15050 0.0 -
0.3528 15100 0.0 -
0.3540 15150 0.0 -
0.3551 15200 0.0 -
0.3563 15250 0.0 -
0.3575 15300 0.0 -
0.3586 15350 0.0 -
0.3598 15400 0.0 -
0.3610 15450 0.0 -
0.3621 15500 0.0 -
0.3633 15550 0.0 -
0.3645 15600 0.0 -
0.3656 15650 0.0 -
0.3668 15700 0.0 -
0.3680 15750 0.0 -
0.3692 15800 0.0 -
0.3703 15850 0.0 -
0.3715 15900 0.0 -
0.3727 15950 0.0 -
0.3738 16000 0.0 -
0.3750 16050 0.0 -
0.3762 16100 0.0 -
0.3773 16150 0.0 -
0.3785 16200 0.0 -
0.3797 16250 0.0 -
0.3808 16300 0.0 -
0.3820 16350 0.0 -
0.3832 16400 0.0 -
0.3843 16450 0.0 -
0.3855 16500 0.0 -
0.3867 16550 0.0 -
0.3878 16600 0.0 -
0.3890 16650 0.0 -
0.3902 16700 0.0 -
0.3913 16750 0.0 -
0.3925 16800 0.0 -
0.3937 16850 0.0 -
0.3949 16900 0.0 -
0.3960 16950 0.0 -
0.3972 17000 0.0 -
0.3984 17050 0.0 -
0.3995 17100 0.0 -
0.4007 17150 0.0 -
0.4019 17200 0.0 -
0.4030 17250 0.0 -
0.4042 17300 0.0 -
0.4054 17350 0.0 -
0.4065 17400 0.0 -
0.4077 17450 0.031 -
0.4089 17500 0.1234 -
0.4100 17550 0.0569 -
0.4112 17600 0.0006 -
0.4124 17650 0.0003 -
0.4135 17700 0.0007 -
0.4147 17750 0.0002 -
0.4159 17800 0.025 -
0.4170 17850 0.0032 -
0.4182 17900 0.0 -
0.4194 17950 0.0 -
0.4206 18000 0.0 -
0.4217 18050 0.0 -
0.4229 18100 0.0002 -
0.4241 18150 0.0 -
0.4252 18200 0.0 -
0.4264 18250 0.0 -
0.4276 18300 0.0002 -
0.4287 18350 0.0001 -
0.4299 18400 0.0 -
0.4311 18450 0.0002 -
0.4322 18500 0.0001 -
0.4334 18550 0.0 -
0.4346 18600 0.0098 -
0.4357 18650 0.0 -
0.4369 18700 0.0001 -
0.4381 18750 0.0 -
0.4392 18800 0.0001 -
0.4404 18850 0.0 -
0.4416 18900 0.0 -
0.4427 18950 0.0001 -
0.4439 19000 0.0 -
0.4451 19050 0.0 -
0.4463 19100 0.0 -
0.4474 19150 0.0 -
0.4486 19200 0.0 -
0.4498 19250 0.0 -
0.4509 19300 0.0 -
0.4521 19350 0.0 -
0.4533 19400 0.0 -
0.4544 19450 0.0 -
0.4556 19500 0.0 -
0.4568 19550 0.0 -
0.4579 19600 0.0 -
0.4591 19650 0.0001 -
0.4603 19700 0.0284 -
0.4614 19750 0.0 -
0.4626 19800 0.0 -
0.4638 19850 0.0 -
0.4649 19900 0.0 -
0.4661 19950 0.0 -
0.4673 20000 0.0 -
0.4684 20050 0.0 -
0.4696 20100 0.0 -
0.4708 20150 0.0 -
0.4720 20200 0.0 -
0.4731 20250 0.0 -
0.4743 20300 0.0 -
0.4755 20350 0.0 -
0.4766 20400 0.0 -
0.4778 20450 0.0 -
0.4790 20500 0.0 -
0.4801 20550 0.0 -
0.4813 20600 0.0 -
0.4825 20650 0.0 -
0.4836 20700 0.0317 -
0.4848 20750 0.0002 -
0.4860 20800 0.0002 -
0.4871 20850 0.0 -
0.4883 20900 0.0 -
0.4895 20950 0.0 -
0.4906 21000 0.0 -
0.4918 21050 0.0 -
0.4930 21100 0.0002 -
0.4941 21150 0.0002 -
0.4953 21200 0.0 -
0.4965 21250 0.0 -
0.4977 21300 0.0 -
0.4988 21350 0.0 -
0.5000 21400 0.0 -
0.5012 21450 0.0 -
0.5023 21500 0.0 -
0.5035 21550 0.0 -
0.5047 21600 0.0 -
0.5058 21650 0.0001 -
0.5070 21700 0.0 -
0.5082 21750 0.0 -
0.5093 21800 0.0 -
0.5105 21850 0.0 -
0.5117 21900 0.0 -
0.5128 21950 0.0 -
0.5140 22000 0.0 -
0.5152 22050 0.0 -
0.5163 22100 0.0 -
0.5175 22150 0.0 -
0.5187 22200 0.0 -
0.5198 22250 0.0 -
0.5210 22300 0.0 -
0.5222 22350 0.0 -
0.5234 22400 0.0 -
0.5245 22450 0.0 -
0.5257 22500 0.0 -
0.5269 22550 0.0 -
0.5280 22600 0.0 -
0.5292 22650 0.0 -
0.5304 22700 0.0 -
0.5315 22750 0.0 -
0.5327 22800 0.0 -
0.5339 22850 0.0 -
0.5350 22900 0.0 -
0.5362 22950 0.0 -
0.5374 23000 0.0 -
0.5385 23050 0.0 -
0.5397 23100 0.0 -
0.5409 23150 0.0 -
0.5420 23200 0.0 -
0.5432 23250 0.0 -
0.5444 23300 0.0 -
0.5455 23350 0.0 -
0.5467 23400 0.0 -
0.5479 23450 0.0 -
0.5491 23500 0.0 -
0.5502 23550 0.0 -
0.5514 23600 0.0 -
0.5526 23650 0.0 -
0.5537 23700 0.0 -
0.5549 23750 0.0 -
0.5561 23800 0.0 -
0.5572 23850 0.0 -
0.5584 23900 0.0 -
0.5596 23950 0.0 -
0.5607 24000 0.0 -
0.5619 24050 0.0 -
0.5631 24100 0.0 -
0.5642 24150 0.0 -
0.5654 24200 0.0 -
0.5666 24250 0.0 -
0.5677 24300 0.0 -
0.5689 24350 0.0 -
0.5701 24400 0.0 -
0.5712 24450 0.0 -
0.5724 24500 0.0 -
0.5736 24550 0.0 -
0.5748 24600 0.0 -
0.5759 24650 0.0 -
0.5771 24700 0.0 -
0.5783 24750 0.0 -
0.5794 24800 0.0 -
0.5806 24850 0.0 -
0.5818 24900 0.0 -
0.5829 24950 0.0 -
0.5841 25000 0.0 -
0.5853 25050 0.0 -
0.5864 25100 0.0 -
0.5876 25150 0.0 -
0.5888 25200 0.0 -
0.5899 25250 0.0 -
0.5911 25300 0.0 -
0.5923 25350 0.0 -
0.5934 25400 0.0 -
0.5946 25450 0.0 -
0.5958 25500 0.0 -
0.5969 25550 0.0 -
0.5981 25600 0.0 -
0.5993 25650 0.0 -
0.6005 25700 0.0 -
0.6016 25750 0.0 -
0.6028 25800 0.0 -
0.6040 25850 0.0 -
0.6051 25900 0.0 -
0.6063 25950 0.0 -
0.6075 26000 0.0 -
0.6086 26050 0.0 -
0.6098 26100 0.0 -
0.6110 26150 0.0 -
0.6121 26200 0.0 -
0.6133 26250 0.0 -
0.6145 26300 0.0 -
0.6156 26350 0.0 -
0.6168 26400 0.0 -
0.6180 26450 0.0 -
0.6191 26500 0.0 -
0.6203 26550 0.0 -
0.6215 26600 0.0 -
0.6226 26650 0.0 -
0.6238 26700 0.0 -
0.6250 26750 0.0 -
0.6262 26800 0.0 -
0.6273 26850 0.0 -
0.6285 26900 0.0 -
0.6297 26950 0.0 -
0.6308 27000 0.0 -
0.6320 27050 0.0 -
0.6332 27100 0.0 -
0.6343 27150 0.0 -
0.6355 27200 0.0 -
0.6367 27250 0.0 -
0.6378 27300 0.0 -
0.6390 27350 0.0 -
0.6402 27400 0.0 -
0.6413 27450 0.0 -
0.6425 27500 0.0 -
0.6437 27550 0.0 -
0.6448 27600 0.0 -
0.6460 27650 0.0 -
0.6472 27700 0.0 -
0.6483 27750 0.0 -
0.6495 27800 0.0 -
0.6507 27850 0.0 -
0.6519 27900 0.0 -
0.6530 27950 0.0 -
0.6542 28000 0.0 -
0.6554 28050 0.0 -
0.6565 28100 0.0 -
0.6577 28150 0.0 -
0.6589 28200 0.0 -
0.6600 28250 0.0 -
0.6612 28300 0.0 -
0.6624 28350 0.0 -
0.6635 28400 0.0 -
0.6647 28450 0.0 -
0.6659 28500 0.0 -
0.6670 28550 0.0 -
0.6682 28600 0.0 -
0.6694 28650 0.0 -
0.6705 28700 0.0 -
0.6717 28750 0.0 -
0.6729 28800 0.0 -
0.6740 28850 0.0 -
0.6752 28900 0.0 -
0.6764 28950 0.0 -
0.6776 29000 0.0 -
0.6787 29050 0.0 -
0.6799 29100 0.0 -
0.6811 29150 0.0 -
0.6822 29200 0.0 -
0.6834 29250 0.0 -
0.6846 29300 0.0 -
0.6857 29350 0.0 -
0.6869 29400 0.0 -
0.6881 29450 0.0 -
0.6892 29500 0.0 -
0.6904 29550 0.0 -
0.6916 29600 0.0 -
0.6927 29650 0.0 -
0.6939 29700 0.0 -
0.6951 29750 0.0 -
0.6962 29800 0.0 -
0.6974 29850 0.0 -
0.6986 29900 0.0 -
0.6998 29950 0.0 -
0.7009 30000 0.0 -
0.7021 30050 0.0 -
0.7033 30100 0.0 -
0.7044 30150 0.0 -
0.7056 30200 0.0 -
0.7068 30250 0.0 -
0.7079 30300 0.0 -
0.7091 30350 0.0 -
0.7103 30400 0.0 -
0.7114 30450 0.0 -
0.7126 30500 0.0 -
0.7138 30550 0.0 -
0.7149 30600 0.0 -
0.7161 30650 0.0 -
0.7173 30700 0.0 -
0.7184 30750 0.0 -
0.7196 30800 0.0 -
0.7208 30850 0.0 -
0.7219 30900 0.0 -
0.7231 30950 0.0 -
0.7243 31000 0.0 -
0.7255 31050 0.0 -
0.7266 31100 0.0 -
0.7278 31150 0.0 -
0.7290 31200 0.0 -
0.7301 31250 0.0 -
0.7313 31300 0.0 -
0.7325 31350 0.0 -
0.7336 31400 0.0 -
0.7348 31450 0.0 -
0.7360 31500 0.0 -
0.7371 31550 0.0 -
0.7383 31600 0.0 -
0.7395 31650 0.0 -
0.7406 31700 0.0 -
0.7418 31750 0.0316 -
0.7430 31800 0.0 -
0.7441 31850 0.0 -
0.7453 31900 0.0 -
0.7465 31950 0.0 -
0.7476 32000 0.0 -
0.7488 32050 0.0 -
0.7500 32100 0.0 -
0.7512 32150 0.0 -
0.7523 32200 0.0 -
0.7535 32250 0.0 -
0.7547 32300 0.0 -
0.7558 32350 0.0 -
0.7570 32400 0.0 -
0.7582 32450 0.0 -
0.7593 32500 0.0 -
0.7605 32550 0.0 -
0.7617 32600 0.0 -
0.7628 32650 0.0 -
0.7640 32700 0.0 -
0.7652 32750 0.0 -
0.7663 32800 0.0 -
0.7675 32850 0.0 -
0.7687 32900 0.0 -
0.7698 32950 0.0 -
0.7710 33000 0.0 -
0.7722 33050 0.0 -
0.7733 33100 0.0 -
0.7745 33150 0.0 -
0.7757 33200 0.0 -
0.7769 33250 0.0 -
0.7780 33300 0.0 -
0.7792 33350 0.0 -
0.7804 33400 0.0 -
0.7815 33450 0.0 -
0.7827 33500 0.0 -
0.7839 33550 0.0 -
0.7850 33600 0.0 -
0.7862 33650 0.0 -
0.7874 33700 0.0 -
0.7885 33750 0.0 -
0.7897 33800 0.0 -
0.7909 33850 0.0 -
0.7920 33900 0.0 -
0.7932 33950 0.0 -
0.7944 34000 0.0 -
0.7955 34050 0.0 -
0.7967 34100 0.0 -
0.7979 34150 0.0 -
0.7990 34200 0.0 -
0.8002 34250 0.0 -
0.8014 34300 0.0 -
0.8026 34350 0.0 -
0.8037 34400 0.0 -
0.8049 34450 0.0 -
0.8061 34500 0.0 -
0.8072 34550 0.0 -
0.8084 34600 0.0 -
0.8096 34650 0.0 -
0.8107 34700 0.0 -
0.8119 34750 0.0 -
0.8131 34800 0.0 -
0.8142 34850 0.0 -
0.8154 34900 0.0 -
0.8166 34950 0.0 -
0.8177 35000 0.0 -
0.8189 35050 0.0 -
0.8201 35100 0.0 -
0.8212 35150 0.0 -
0.8224 35200 0.0 -
0.8236 35250 0.0 -
0.8247 35300 0.0009 -
0.8259 35350 0.0 -
0.8271 35400 0.0 -
0.8283 35450 0.0 -
0.8294 35500 0.0 -
0.8306 35550 0.0 -
0.8318 35600 0.0 -
0.8329 35650 0.0 -
0.8341 35700 0.0 -
0.8353 35750 0.0001 -
0.8364 35800 0.0 -
0.8376 35850 0.0 -
0.8388 35900 0.0 -
0.8399 35950 0.0 -
0.8411 36000 0.0 -
0.8423 36050 0.0 -
0.8434 36100 0.0 -
0.8446 36150 0.0 -
0.8458 36200 0.0 -
0.8469 36250 0.0 -
0.8481 36300 0.0 -
0.8493 36350 0.0 -
0.8504 36400 0.0 -
0.8516 36450 0.0 -
0.8528 36500 0.0 -
0.8540 36550 0.0 -
0.8551 36600 0.0 -
0.8563 36650 0.0 -
0.8575 36700 0.0 -
0.8586 36750 0.0 -
0.8598 36800 0.0 -
0.8610 36850 0.0 -
0.8621 36900 0.0 -
0.8633 36950 0.0 -
0.8645 37000 0.0 -
0.8656 37050 0.0 -
0.8668 37100 0.0 -
0.8680 37150 0.0 -
0.8691 37200 0.0 -
0.8703 37250 0.0 -
0.8715 37300 0.0 -
0.8726 37350 0.0 -
0.8738 37400 0.0 -
0.8750 37450 0.0 -
0.8761 37500 0.0 -
0.8773 37550 0.0 -
0.8785 37600 0.0 -
0.8797 37650 0.0 -
0.8808 37700 0.0 -
0.8820 37750 0.0 -
0.8832 37800 0.0 -
0.8843 37850 0.0 -
0.8855 37900 0.0 -
0.8867 37950 0.0 -
0.8878 38000 0.0 -
0.8890 38050 0.0 -
0.8902 38100 0.0 -
0.8913 38150 0.0 -
0.8925 38200 0.0 -
0.8937 38250 0.0 -
0.8948 38300 0.0 -
0.8960 38350 0.0 -
0.8972 38400 0.0 -
0.8983 38450 0.0 -
0.8995 38500 0.0 -
0.9007 38550 0.0 -
0.9018 38600 0.0 -
0.9030 38650 0.0 -
0.9042 38700 0.0 -
0.9054 38750 0.0 -
0.9065 38800 0.0 -
0.9077 38850 0.0 -
0.9089 38900 0.0 -
0.9100 38950 0.0 -
0.9112 39000 0.0 -
0.9124 39050 0.0 -
0.9135 39100 0.0 -
0.9147 39150 0.0 -
0.9159 39200 0.0 -
0.9170 39250 0.0 -
0.9182 39300 0.0 -
0.9194 39350 0.0 -
0.9205 39400 0.0 -
0.9217 39450 0.0 -
0.9229 39500 0.0 -
0.9240 39550 0.0 -
0.9252 39600 0.0 -
0.9264 39650 0.0 -
0.9275 39700 0.0 -
0.9287 39750 0.0 -
0.9299 39800 0.0 -
0.9311 39850 0.0 -
0.9322 39900 0.0 -
0.9334 39950 0.0 -
0.9346 40000 0.0 -
0.9357 40050 0.0 -
0.9369 40100 0.0 -
0.9381 40150 0.0 -
0.9392 40200 0.0 -
0.9404 40250 0.0 -
0.9416 40300 0.0 -
0.9427 40350 0.0 -
0.9439 40400 0.0 -
0.9451 40450 0.0 -
0.9462 40500 0.0 -
0.9474 40550 0.0 -
0.9486 40600 0.0 -
0.9497 40650 0.0 -
0.9509 40700 0.0 -
0.9521 40750 0.0 -
0.9532 40800 0.0 -
0.9544 40850 0.0 -
0.9556 40900 0.0 -
0.9568 40950 0.0 -
0.9579 41000 0.0 -
0.9591 41050 0.0 -
0.9603 41100 0.0 -
0.9614 41150 0.0 -
0.9626 41200 0.0 -
0.9638 41250 0.0 -
0.9649 41300 0.0 -
0.9661 41350 0.0 -
0.9673 41400 0.0 -
0.9684 41450 0.0 -
0.9696 41500 0.0 -
0.9708 41550 0.0 -
0.9719 41600 0.0 -
0.9731 41650 0.0 -
0.9743 41700 0.0 -
0.9754 41750 0.0 -
0.9766 41800 0.0 -
0.9778 41850 0.0 -
0.9789 41900 0.0 -
0.9801 41950 0.0 -
0.9813 42000 0.0 -
0.9825 42050 0.0 -
0.9836 42100 0.0 -
0.9848 42150 0.0 -
0.9860 42200 0.0 -
0.9871 42250 0.0 -
0.9883 42300 0.0 -
0.9895 42350 0.0 -
0.9906 42400 0.0 -
0.9918 42450 0.0 -
0.9930 42500 0.0 -
0.9941 42550 0.0 -
0.9953 42600 0.0 -
0.9965 42650 0.0 -
0.9976 42700 0.0 -
0.9988 42750 0.0 -
1.0000 42800 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.5.1
  • Transformers: 4.38.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.18.0
  • Tokenizers: 0.15.2
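
To reproduce this environment, the library versions above can be pinned at install time (an illustrative command; other compatible versions should also work):

pip install setfit==1.0.3 sentence-transformers==2.5.1 transformers==4.38.2 datasets==2.18.0 tokenizers==0.15.2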

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}