Add metadata tags and link to code
#1, opened by nielsr

README.md CHANGED

````diff
@@ -1,6 +1,9 @@
 ---
 license: mit
+library_name: transformers
+pipeline_tag: image-text-to-text
 ---
+
 # mmMamba-linear Model Card

 ## Introduction
@@ -11,11 +14,13 @@ Distilled from the decoder-only HoVLE-2.6B, our pure Mamba-2-based mmMamba-linea
 <div align="center">
 <img src="teaser.png" />

-
 <b>Seeding strategy and three-stage distillation pipeline of mmMamba.</b>
 <img src="pipeline.png" />
 </div>

+Paper: [](https://hf.co/papers/2502.13145)
+Code: https://github.com/Hongyuan-Tao/mmMamba
+
 ## Quick Start Guide for mmMamba Inference

 We provide example code to run mmMamba inference using the Transformers library.
@@ -41,7 +46,6 @@ Below are the primary dependencies required for model inference:
 - decord
 - seaborn

-
 ### Inference with Transformers

 ```python
````