
Meta Chameleon 7B

Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.

The Chameleon collection on Hugging Face contains 7-billion-parameter and 30-billion-parameter model checkpoints.

[more details and usage examples coming soon]

Citation

To cite the paper, model, or software, please use the following BibTeX entry:

@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author = {Chameleon Team},
  doi = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url = {https://github.com/facebookresearch/chameleon},
  year = {2024}
}

License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.
