qazimbhat1 committed
Commit aa3ff79
Parent(s): 17f43ab

Update README.md

README.md CHANGED
@@ -24,7 +24,6 @@ CrystalChat-7B based multi-modal large language model (MLLM) mimics the training
 * Trained in 2 stages
 * License: MIT
 
-Crystal-based models were developed as a collaboration between [MBZUAI](https://mbzuai.ac.ae/institute-of-foundation-models/), [Petuum](https://www.petuum.com/), and [LLM360](https://www.llm360.ai/) TODO- check????.
 
 
 ## Evaluation
@@ -126,9 +125,10 @@ We believe in a future where artificial general intelligence (AGI) is created by
 **BibTeX:**
 
 ```bibtex
-@article{
-
-
-
+@article{yun2024web2code,
+  title={Web2Code: A Large-scale Webpage-to-Code Dataset and Evaluation Framework for Multimodal LLMs},
+  author={Yun, Sukmin and Lin, Haokun and Thushara, Rusiru and Bhat, Mohammad Qazim and Wang, Yongxin and Jiang, Zutao and Deng, Mingkai and Wang, Jinhong and Tao, Tianhua and Li, Junbo and others},
+  journal={arXiv preprint arXiv:2406.20098},
+  year={2024}
 }
 ```