HSIMAE: A Unified Masked Autoencoder with Large-Scale Pretraining for Hyperspectral Image Classification
✨ Highlights
Large-Scale and Diverse Dataset for HSI Pretraining
A large and diverse HSI dataset named HSIHybrid was curated for large-scale HSI pre-training. It combines 15 HSI datasets captured by different hyperspectral sensors. After splitting into image patches, a total of 4 million HSI patches with a spatial size of 9×9 were obtained.
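As an illustration of the patch-splitting step, the sketch below cuts a hyperspectral cube of shape (H, W, bands) into non-overlapping 9×9 spatial patches with NumPy. This is a minimal assumption-based example (the actual HSIHybrid pipeline may use overlapping or per-pixel-centered windows); the function name `extract_patches` is hypothetical.

```python
import numpy as np

def extract_patches(cube, patch=9):
    """Split an HSI cube (H, W, bands) into non-overlapping
    patch x patch spatial patches, keeping all spectral bands.

    Returns an array of shape (num_patches, patch, patch, bands).
    """
    h, w, b = cube.shape
    # Crop so height/width are divisible by the patch size.
    h_cut, w_cut = h - h % patch, w - w % patch
    cube = cube[:h_cut, :w_cut]
    # Reshape into a grid of patches, then flatten the grid.
    grid = cube.reshape(h_cut // patch, patch, w_cut // patch, patch, b)
    return grid.transpose(0, 2, 1, 3, 4).reshape(-1, patch, patch, b)

cube = np.random.rand(90, 90, 30)          # toy 90x90 scene with 30 bands
print(extract_patches(cube).shape)         # (100, 9, 9, 30)
```

Applied to each of the 15 source scenes, this kind of splitting is what yields the ~4 million pretraining patches.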
New MAE Architecture for the HSI Domain
A modified MAE named HSIMAE was proposed; it uses separate spatial and spectral encoders followed by fusion blocks to learn the spatial and spectral correlations of HSI data.
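At the core of any MAE-style pretraining is random token masking: only a small visible subset of tokens is encoded, and the decoder reconstructs the rest. The NumPy sketch below shows this masking step in isolation; it is a generic MAE illustration under stated assumptions, not the HSIMAE implementation, and `random_masking` is a hypothetical name.

```python
import numpy as np

def random_masking(tokens, mask_ratio=0.75, rng=None):
    """MAE-style random masking.

    tokens: (N, L, D) batch of token sequences.
    Returns (kept_tokens, mask) where mask is 1 for masked positions.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n, l, d = tokens.shape
    len_keep = int(l * (1 - mask_ratio))
    # Random per-sample permutation of token indices via noise + argsort.
    noise = rng.random((n, l))
    ids_shuffle = np.argsort(noise, axis=1)
    ids_keep = ids_shuffle[:, :len_keep]
    # Gather the visible tokens that the encoder will actually see.
    kept = np.take_along_axis(tokens, ids_keep[:, :, None], axis=1)
    # Binary mask over all positions: 0 = visible, 1 = masked.
    mask = np.ones((n, l))
    np.put_along_axis(mask, ids_keep, 0, axis=1)
    return kept, mask
```

In a dual-encoder design like the one described above, such masking can be applied along the spatial and spectral token axes so that each encoder sees only its visible subset before the fusion blocks combine the two streams.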
Dual-Branch Fine-Tuning to Leverage Unlabeled Data of the Target Dataset
A dual-branch fine-tuning framework was introduced to leverage the unlabeled data of the downstream HSI dataset and suppress overfitting when labeled training samples are scarce.
🧑‍💻 Contact
Wang Yue
E-mail: [email protected]