|
Website: [liruiw.github.io/hpt](https://liruiw.github.io/hpt)
|
See the [HPT](https://github.com/liruiw/HPT-Pretrain) GitHub README and the [LeRobot](https://github.com/huggingface/lerobot) implementation for instructions on fine-tuning with this checkpoint.
|
|
|
Citation |
|
BibTeX:

```bibtex
@inproceedings{wang2024hpt,
  author={Lirui Wang and Xinlei Chen and Jialiang Zhao and Kaiming He and Russ Tedrake},
  title={Scaling Proprioceptive-Visual Learning with Heterogeneous Pre-trained Transformers},
  year={2024},
  eprint={2407.16677},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2407.16677}
}
```