Commit 2462f50 (parent: 91d7221), committed by soldni

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -28,7 +28,7 @@ It has state-of-the-art performance among multimodal models with a similar size
 You can find all models in the Molmo family [here](https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19).
 **Learn more** about the Molmo family [in our announcement blog post](https://molmo.allenai.org/blog).
 
-MolmoE-1B is a multimodal Mixture-of-Experts LLM with 1.5B active and 7.2B total parameters released in September 2024 (0924) based on [OLMoE-1B-7B-0924](https://huggingface.co/allenai/OLMoE-1B-7B-0924).
+MolmoE-1B is a multimodal Mixture-of-Experts LLM with 1.5B active and 7.2B total parameters based on [OLMoE-1B-7B-0924](https://huggingface.co/allenai/OLMoE-1B-7B-0924).
 It nearly matches the performance of GPT-4V on both academic benchmarks and human evaluation, and achieves state-of-the-art performance among similarly-sized open multimodal models.
 
 This checkpoint is a **preview** of the Molmo release. All artifacts used in creating Molmo (PixMo dataset, training code, evaluations, intermediate checkpoints) will be made available at a later date, furthering our commitment to open-source AI development and reproducibility.
@@ -100,9 +100,9 @@ print(generated_text)
 | Model | Average Score on 11 Academic Benchmarks | Human Preference Elo Rating |
 |-----------------------------|-----------------------------------------|-----------------------------|
 | Molmo 72B | 81.2 | 1077 |
-| **Molmo 7B-D** (this model) | **77.3** | **1056** |
+| Molmo 7B-D | 77.3 | 1056 |
 | Molmo 7B-O | 74.6 | 1051 |
-| MolmoE 1B | 68.6 | 1032 |
+| **MolmoE 1B (this model)** | **68.6** | **1032** |
 | GPT-4o | 78.5 | 1079 |
 | GPT-4V | 71.1 | 1041 |
 | Gemini 1.5 Pro | 78.3 | 1074 |
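
For context on the second hunk: its header line (`print(generated_text)`) shows the README carries a runnable generation example just above the benchmark table. A minimal sketch of that kind of usage follows, assuming the repo id `allenai/MolmoE-1B-0924` and the remote-code APIs (`processor.process`, `model.generate_from_batch`) that Molmo checkpoints expose on their Hugging Face model cards; none of this code appears in the diff itself.

```python
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

repo_id = "allenai/MolmoE-1B-0924"  # assumed repo id for this checkpoint

# Molmo ships its modeling code inside the repo, so trust_remote_code is required.
processor = AutoProcessor.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Build a single image+text input with the processor's custom `process` method,
# then move it to the model's device and add a batch dimension.
image = Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate with Molmo's custom batch API and decode only the newly produced tokens.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
generated_tokens = output[0, inputs["input_ids"].size(1):]
generated_text = processor.tokenizer.decode(generated_tokens, skip_special_tokens=True)
print(generated_text)
```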