
🦾 Heterogeneous Pre-trained Transformers

Lirui Wang, Xinlei Chen, Jialiang Zhao, Kaiming He

Neural Information Processing Systems (Spotlight), 2024

Paper: https://huggingface.co./papers/2409.20537

You can find more details on our project page. An alternative, clean implementation of HPT on Hugging Face can also be found here.

TL;DR: HPT aligns different embodiments to a shared latent space and investigates scaling behaviors in policy learning. Put a scalable transformer in the middle of your policy and don’t train from scratch!
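For intuition, below is a minimal, hypothetical PyTorch sketch of this idea: an embodiment-specific stem tokenizes observations into a shared latent space, a shared transformer trunk processes the tokens, and a task-specific head decodes actions. All module names, dimensions, and the simple linear stem/head are illustrative assumptions, not the released implementation.

```python
import torch
import torch.nn as nn

class HPTStylePolicy(nn.Module):
    """Illustrative sketch: embodiment-specific stem -> shared trunk -> task head.
    Sizes and layers are placeholders, not the pre-trained HPT configuration."""

    def __init__(self, proprio_dim=16, action_dim=7, embed_dim=256):
        super().__init__()
        # Stem: maps this embodiment's proprioceptive input into trunk tokens.
        self.stem = nn.Linear(proprio_dim, embed_dim)
        # Trunk: the scalable transformer shared across embodiments.
        self.trunk = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=embed_dim, nhead=8, batch_first=True),
            num_layers=4,
        )
        # Head: task/embodiment-specific action decoder.
        self.head = nn.Linear(embed_dim, action_dim)

    def forward(self, proprio):
        tokens = self.stem(proprio).unsqueeze(1)   # (B, 1, embed_dim)
        latent = self.trunk(tokens)                # shared latent space
        return self.head(latent.mean(dim=1))       # (B, action_dim)

policy = HPTStylePolicy()
actions = policy(torch.randn(2, 16))               # dummy proprioceptive batch
```

In the full model, the pre-trained trunk is reused across robots, while new stems and heads are trained for each embodiment and task rather than training the whole policy from scratch.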

If you find HPT useful in your research, please consider citing:

@inproceedings{wang2024hpt,
author    = {Lirui Wang and Xinlei Chen and Jialiang Zhao and Kaiming He},
title     = {Scaling Proprioceptive-Visual Learning with Heterogeneous Pre-trained Transformers},
booktitle = {Neurips},
year      = {2024}
}

Contact

If you have any questions, feel free to contact me via email ([email protected]). Enjoy!
