Model size
#60
by
Juli784
How large is the model size (million parameters) of BAAI/bge-m3?
bge-m3 has a similar number of parameters to bert-large in its transformer layers, but a much larger token embedding table (about 250k tokens in the vocabulary). Overall, bge-m3 has 560M parameters.
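To see where that figure comes from, here is a rough back-of-the-envelope estimate. The dimensions below (hidden size 1024, 24 layers, FFN size 4096, ~250k vocab) are assumptions based on the bert-large-style backbone described above, not exact model internals, and the count ignores biases, LayerNorm, and position embeddings:

```python
# Rough parameter-count sketch for BAAI/bge-m3 (assumed bert-large-like backbone).
vocab_size = 250_002   # assumed multilingual vocabulary (~250k tokens)
hidden = 1024          # hidden size, as in bert-large
layers = 24            # transformer layers, as in bert-large
intermediate = 4096    # feed-forward inner dimension, as in bert-large

embedding = vocab_size * hidden        # token embedding table (~256M alone)
per_layer = (
    4 * hidden * hidden                # Q, K, V, and attention output projections
    + 2 * hidden * intermediate        # feed-forward up and down projections
)
total = embedding + layers * per_layer
print(f"~{total / 1e6:.0f}M parameters")  # lands near the reported 560M
```

The embedding table alone is roughly 256M parameters, which is why bge-m3 is so much larger than bert-large (about 340M) despite having the same transformer stack.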