README.md · 1.52 kB · Update README.md (817f629, verified)
best_model_maxvit_small_tf_512.in1k_6_epoch.pt · 275 MB · Upload best_model_maxvit_small_tf_512.in1k_6_epoch.pt

Detected Pickle imports (37):
- "torch.nn.modules.dropout.Dropout",
- "timm.models.maxxvit.MbConvBlock",
- "torch.nn.modules.linear.Identity",
- "torch.nn.parallel.data_parallel.DataParallel",
- "timm.layers.conv2d_same.Conv2dSame",
- "collections.OrderedDict",
- "torch._utils._rebuild_parameter",
- "timm.layers.norm.LayerNorm",
- "timm.models.maxxvit.Stem",
- "timm.layers.classifier.NormMlpClassifierHead",
- "timm.layers.adaptive_avgmax_pool.SelectAdaptivePool2d",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.modules.conv.Conv2d",
- "torch.nn.modules.linear.Linear",
- "torch.nn.modules.pooling.AdaptiveAvgPool2d",
- "torch.nn.modules.container.Sequential",
- "timm.layers.activations.GELUTanh",
- "timm.layers.pos_embed_rel.RelPosBiasTf",
- "torch.FloatStorage",
- "timm.layers.norm.LayerNorm2d",
- "timm.layers.activations.Tanh",
- "timm.models.maxxvit.MaxxVit",
- "timm.layers.mlp.Mlp",
- "timm.layers.pool2d_same.AvgPool2dSame",
- "torch.device",
- "timm.layers.squeeze_excite.SEModule",
- "torch.nn.modules.flatten.Flatten",
- "timm.models.maxxvit.Downsample2d",
- "timm.layers.norm_act.BatchNormAct2d",
- "timm.models.maxxvit.PartitionAttentionCl",
- "timm.layers.activations.Sigmoid",
- "timm.models.maxxvit.MaxxVitStage",
- "__builtin__.set",
- "timm.models.maxxvit.AttentionCl",
- "torch.LongStorage",
- "timm.models.maxxvit.MaxxVitBlock",
- "torch.nn.modules.activation.SiLU"
How to fix it?
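One common remedy for the pickle warning is to re-export only the tensors to a `.safetensors` file, which cannot carry executable code. The sketch below assumes the `safetensors` package is installed; the output file name is an illustrative choice, not a file in this repo.

```python
# Hedged sketch: convert the pickled checkpoint into a safetensors file.
import torch
from safetensors.torch import save_file

model = torch.load(
    "best_model_maxvit_small_tf_512.in1k_6_epoch.pt",
    map_location="cpu",
    weights_only=False,  # needed because the file pickles a full module
)
if isinstance(model, torch.nn.DataParallel):
    model = model.module

# save_file expects a flat name -> tensor mapping; .contiguous() avoids
# problems with non-contiguous views.
state = {k: v.detach().contiguous() for k, v in model.state_dict().items()}
save_file(state, "best_model_maxvit_small_tf_512.in1k_6_epoch.safetensors")
```

The resulting file can later be read back with `safetensors.torch.load_file` and passed to `load_state_dict` on a model created with `timm.create_model`, avoiding `torch.load` on untrusted pickles entirely.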