Latest commit: Upload 4 files (3092fc0)
best_swin_RoBERTa.pt
Detected Pickle imports (34):
- torch.nn.modules.normalization.LayerNorm
- torch.nn.parallel.data_parallel.DataParallel
- torch.nn.modules.activation.Softmax
- torch.LongStorage
- collections.defaultdict
- torch.cuda.amp.grad_scaler._refresh_per_optimizer_state
- torch.nn.modules.activation.GELU
- modules.SwinTransformer.Swin_Transformer.PatchMerging
- torch.float32
- torch.nn.modules.batchnorm.BatchNorm1d
- torch.nn.modules.linear.Linear
- modules.SwinTransformer.Swin_Transformer.PatchEmbed
- torch.nn.modules.linear.Identity
- modules.SwinTransformer.Swin_Transformer.Flatten
- torch.nn.modules.activation.ReLU
- torch.cuda.amp.grad_scaler.GradScaler
- timm.layers.drop.DropPath
- torch._utils._rebuild_tensor_v2
- modules.SwinTransformer.Swin_Transformer.SwinTransformerBlock
- src.models.SwinForAffwildClassification
- lightning_lite.wrappers._LiteModule
- modules.SwinTransformer.Swin_Transformer.WindowAttention
- torch.FloatStorage
- torch.nn.modules.container.ModuleList
- modules.SwinTransformer.Swin_Transformer.Mlp
- modules.SwinTransformer.Swin_Transformer.SwinTransformer
- torch._utils._rebuild_parameter
- torch.nn.modules.conv.Conv2d
- torch.nn.modules.container.Sequential
- torch.nn.modules.dropout.Dropout
- modules.SwinTransformer.Swin_Transformer.BasicLayer
- collections.OrderedDict
- torch.device
- lightning_lite.plugins.precision.native_amp.NativeMixedPrecision
File size: 188 MB
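The scanner flags these files because `torch.save` was given the whole wrapped model object (note the `DataParallel`, `lightning_lite.wrappers._LiteModule`, and custom `modules.*`/`src.models.*` classes in the list), so the pickle stream references arbitrary classes rather than just tensors. A minimal sketch of the usual remedy, using a stand-in model since the original `modules.*` source tree is not available here: load the checkpoint once in a trusted environment, unwrap it, and re-save only the `state_dict` so the file contains nothing but tensors.

```python
import torch
import torch.nn as nn

# Stand-in for the real checkpoint: pickling a whole nn.Module (as these
# .pt files do) embeds class references such as
# torch.nn.modules.linear.Linear, which is what the scanner reports.
model = nn.Sequential(nn.Linear(4, 2))
torch.save(model, "full_model.pt")

# In a trusted environment, load the full object once...
full = torch.load("full_model.pt", map_location="cpu", weights_only=False)
# ...unwrap DataParallel / _LiteModule-style wrappers if present...
full = getattr(full, "module", full)
# ...and re-save only the tensors. The resulting file loads under
# PyTorch's restricted unpickler (weights_only=True), which refuses
# arbitrary class imports.
torch.save(full.state_dict(), "state_dict_only.pt")

state = torch.load("state_dict_only.pt", map_location="cpu", weights_only=True)
print(sorted(state))  # ['0.bias', '0.weight']
```

Downstream code then rebuilds the architecture from source and calls `model.load_state_dict(state)`; re-exporting the same dict with the `safetensors` library is a further option if a pickle-free format is wanted.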
multimodal_T+A+V_RoBERTa.pt
Detected Pickle imports (47):
- modules.CrossmodalTransformer.CrossModalTransformerEncoder
- torch.nn.modules.normalization.LayerNorm
- torch.nn.parallel.data_parallel.DataParallel
- modules.Transformer.Output_Residual_Norm
- torch.LongStorage
- src.models.MultiModalTransformerForClassification
- torch.nn.modules.activation.Tanh
- collections.defaultdict
- torch.cuda.amp.grad_scaler._refresh_per_optimizer_state
- torch.nn.modules.sparse.Embedding
- modules.CrossmodalTransformer.TransformerEncoderLayer
- torch.float32
- transformers.models.roberta.modeling_roberta.RobertaModel
- torch.nn.modules.linear.Linear
- transformers.models.roberta.modeling_roberta.RobertaEncoder
- modules.Transformer.Residual_Norm
- modules.Transformer.MultiHeadSelfAttention
- transformers.models.roberta.modeling_roberta.RobertaSelfOutput
- torch.cuda.amp.grad_scaler.GradScaler
- modules.Transformer.gelu
- torch._utils._rebuild_tensor_v2
- modules.Transformer.AdditiveAttention
- transformers.models.roberta.modeling_roberta.RobertaEmbeddings
- transformers.models.roberta.modeling_roberta.RobertaLayer
- transformers.models.roberta.modeling_roberta.RobertaIntermediate
- lightning_lite.wrappers._LiteModule
- modules.Transformer.LayerNorm
- modules.multihead_attention.MultiheadAttention
- transformers.models.roberta.modeling_roberta.RobertaOutput
- torch.FloatStorage
- torch.nn.modules.container.ModuleList
- transformers.models.roberta.modeling_roberta.RobertaPooler
- modules.Transformer.SelfAttention
- modules.position_embedding.SinusoidalPositionalEmbedding
- transformers.activations.GELUActivation
- torch._utils._rebuild_parameter
- modules.Transformer.TransformerEnoderLayer
- transformers.models.roberta.modeling_roberta.RobertaAttention
- torch._C._nn.gelu
- torch.nn.modules.dropout.Dropout
- modules.Transformer.TransformerIntermediate
- collections.OrderedDict
- torch.device
- transformers.models.roberta.modeling_roberta.RobertaSelfAttention
- lightning_lite.plugins.precision.native_amp.NativeMixedPrecision
- transformers.models.roberta.configuration_roberta.RobertaConfig
- modules.Transformer.MELDTransEncoder
File size: 1.75 GB
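The "Detected Pickle imports" lists themselves can be reproduced offline. A `.pt` file in PyTorch's zipfile serialization format is a zip archive whose `data.pkl` member holds the pickle stream; disassembling that stream's `GLOBAL` opcodes with the standard-library `pickletools` enumerates every class the unpickler would import, without executing anything. A sketch (the helper name `pickle_imports` is ours; it assumes `torch.save`'s default pickle protocol 2 — protocol-4 files would additionally need `STACK_GLOBAL` handling):

```python
import pickletools
import zipfile

def pickle_imports(path):
    """List module.qualname references in a zip-format torch checkpoint
    without unpickling (and therefore without importing) anything."""
    with zipfile.ZipFile(path) as zf:
        # The pickle stream lives at "<archive_name>/data.pkl".
        member = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        data = zf.read(member)
    found = set()
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":  # protocol <= 3: arg is "module qualname"
            module, qualname = arg.split(" ", 1)
            found.add(f"{module}.{qualname}")
    return sorted(found)
```

Running this over a checkpoint that pickles a whole module reports entries like `torch.nn.modules.linear.Linear` and `collections.OrderedDict`, matching the hub's scan; a tensor-only `state_dict` file, by contrast, yields no custom classes at all.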