---

# HSIMAE: A Unified Masked Autoencoder with large-scale pretraining for Hyperspectral Image Classification

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65979dfeb4b5c254cb8ed20e/YjbxlXg5el3nySkcQkmq_.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65979dfeb4b5c254cb8ed20e/dxLbojSBr4Kdt-cgA3su7.png)

## ✨ Highlights
### Large-Scale and Diverse Dataset for HSI Pretraining

A large and diverse HSI dataset named HSIHybrid was curated for large-scale HSI pretraining. It consists of 15 HSI datasets acquired by different hyperspectral sensors. After splitting into image patches, a total of **4 million** HSI patches with a spatial size of 9×9 were obtained.
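The patch-splitting step can be sketched as below. This is an illustrative reconstruction, not the released preprocessing code: only the 9×9 spatial patch size comes from the text, while the no-padding sliding window, the toy cube dimensions, and the function name `extract_patches` are assumptions.

```python
import numpy as np

def extract_patches(hsi: np.ndarray, size: int = 9) -> np.ndarray:
    """Split an HSI cube of shape (H, W, bands) into size x size spatial
    patches, one per pixel whose full neighborhood fits inside the image
    (no padding at the borders)."""
    h, w, bands = hsi.shape
    r = size // 2
    patches = [
        hsi[i - r:i + r + 1, j - r:j + r + 1, :]
        for i in range(r, h - r)
        for j in range(r, w - r)
    ]
    return np.stack(patches)  # (N, size, size, bands)

# Toy cube: a 12x12 image with 4 spectral bands
cube = np.random.rand(12, 12, 4)
patches = extract_patches(cube)
print(patches.shape)  # (16, 9, 9, 4): (12 - 8)^2 = 16 patches
```

Applied to the 15 source datasets, a window like this is what yields the millions of small patches used for pretraining.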
### New MAE Architecture for the HSI Domain

A modified MAE named HSIMAE is proposed. It uses separate spatial and spectral encoders, followed by fusion blocks, to learn the spatial and spectral correlations of HSI data.
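The data flow of the two branches can be sketched schematically as follows. This is not the HSIMAE implementation: the token layouts, embedding dimension, linear projections standing in for the encoders, and concatenation standing in for the fusion blocks are all placeholder assumptions chosen only to show how spatial and spectral tokens could be formed from one patch and then mixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HSI patch: 9x9 pixels, 32 spectral bands (band count is illustrative)
patch = rng.standard_normal((9, 9, 32))
d = 16  # placeholder embedding dimension

# Spatial branch: each pixel's spectrum becomes one token -> 81 tokens
spatial_tokens = patch.reshape(81, 32)
W_spa = rng.standard_normal((32, d))       # stand-in for the spatial encoder
spatial_emb = spatial_tokens @ W_spa       # (81, d)

# Spectral branch: each band's spatial map becomes one token -> 32 tokens
spectral_tokens = patch.transpose(2, 0, 1).reshape(32, 81)
W_spe = rng.standard_normal((81, d))       # stand-in for the spectral encoder
spectral_emb = spectral_tokens @ W_spe     # (32, d)

# Fusion-block stand-in: concatenate both token sequences so subsequent
# layers (e.g. attention) can mix spatial and spectral information
fused = np.concatenate([spatial_emb, spectral_emb], axis=0)
print(fused.shape)  # (113, 16)
```

The point of the separation is that each branch sees correlations along only one axis of the cube; the fusion stage is where the two views interact.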
### Dual-Branch Fine-Tuning to Leverage Unlabeled Data of the Target Dataset

A dual-branch fine-tuning framework is introduced that leverages the unlabeled data of the downstream HSI dataset and suppresses overfitting on small training sets.
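The idea of combining a supervised branch with an unsupervised branch on the same encoder can be sketched as below. The text does not specify the unsupervised objective, so the consistency loss between two noisy views, the loss weight, and the single linear "encoder" are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_entropy(logits, labels):
    # Softmax cross-entropy, averaged over the batch
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Shared-encoder stand-in: one linear layer
W = rng.standard_normal((32, 8))
def encode(x):
    return x @ W

labeled_x = rng.standard_normal((4, 32))     # few labeled samples
labels = np.array([0, 1, 2, 3])
unlabeled_x = rng.standard_normal((16, 32))  # plentiful unlabeled samples

# Branch 1: supervised loss on the labeled samples
sup_loss = cross_entropy(encode(labeled_x), labels)

# Branch 2: unsupervised loss on unlabeled data, here a simple
# consistency term between two noisy views of the same samples
view_a = encode(unlabeled_x + 0.1 * rng.standard_normal(unlabeled_x.shape))
view_b = encode(unlabeled_x + 0.1 * rng.standard_normal(unlabeled_x.shape))
unsup_loss = np.mean((view_a - view_b) ** 2)

total_loss = sup_loss + 0.5 * unsup_loss  # 0.5 is a placeholder weight
print(float(total_loss))
```

Because both branches update the same encoder, the unlabeled branch acts as a regularizer when the labeled set is small.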
## 🧑‍💻 Contact