Latest commit: "Upload 2.pt" (a0e62de, verified)
2.pt
Detected Pickle imports (67):
- "__torch__.torch.nn.modules.conv.___torch_mangle_16.Conv1d",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_47.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_4.Linear",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_32.Dropout",
- "__torch__.torch.nn.modules.conv.___torch_mangle_41.Conv1d",
- "__torch__.torch.nn.modules.linear.___torch_mangle_13.Linear",
- "__torch__.___torch_mangle_27.AttentionLayer",
- "__torch__.torch.nn.modules.conv.___torch_mangle_29.Conv1d",
- "__torch__.torch.nn.modules.container.ModuleList",
- "__torch__.EncoderLayer",
- "__torch__.___torch_mangle_33.EncoderLayer",
- "__torch__.DataEmbedding_inverted",
- "__torch__.torch.nn.modules.linear.___torch_mangle_12.Linear",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_30.LayerNorm",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_45.Dropout",
- "torch.FloatStorage",
- "torch._utils._rebuild_tensor_v2",
- "__torch__.___torch_mangle_40.AttentionLayer",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_43.LayerNorm",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_21.Dropout",
- "__torch__.torch.nn.modules.linear.___torch_mangle_23.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_38.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_24.Linear",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_18.LayerNorm",
- "__torch__.AttentionLayer",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_7.Dropout",
- "__torch__.Model",
- "collections.OrderedDict",
- "__torch__.Encoder",
- "__torch__.___torch_mangle_20.EncoderLayer",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_19.Dropout",
- "__torch__.___torch_mangle_46.EncoderLayer",
- "__torch__.torch.nn.modules.conv.Conv1d",
- "__torch__.torch.nn.modules.linear.___torch_mangle_26.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_10.Linear",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_31.LayerNorm",
- "__torch__.___torch_mangle_22.FullAttention",
- "__torch__.___torch_mangle_9.FullAttention",
- "__torch__.torch.nn.modules.normalization.LayerNorm",
- "__torch__.torch.nn.modules.linear.Linear",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_0.Dropout",
- "__torch__.torch.nn.modules.linear.___torch_mangle_36.Linear",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_17.LayerNorm",
- "__torch__.torch.nn.modules.linear.___torch_mangle_37.Linear",
- "__torch__.torch.nn.modules.dropout.Dropout",
- "__torch__.torch.nn.modules.linear.___torch_mangle_2.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_48.Linear",
- "__torch__.___torch_mangle_14.AttentionLayer",
- "__torch__.torch.nn.modules.linear.___torch_mangle_1.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_25.Linear",
- "__torch__.torch.nn.modules.linear.___torch_mangle_3.Linear",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_34.Dropout",
- "__torch__.torch.nn.modules.linear.___torch_mangle_11.Linear",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_6.LayerNorm",
- "__torch__.torch.nn.modules.conv.___torch_mangle_42.Conv1d",
- "__torch__.torch.nn.modules.normalization.___torch_mangle_44.LayerNorm",
- "__torch__.torch.nn.modules.conv.___torch_mangle_5.Conv1d",
- "__torch__.FullAttention",
- "__torch__.torch.nn.modules.conv.___torch_mangle_28.Conv1d",
- "__torch__.torch.nn.modules.linear.___torch_mangle_39.Linear",
- "__torch__.torch.nn.modules.dropout.___torch_mangle_8.Dropout",
- "__torch__.___torch_mangle_35.FullAttention",
- "__torch__.torch.nn.modules.conv.___torch_mangle_15.Conv1d",
- "torch.jit._pickle.restore_type_tag",
- "collections.OrderedDict",
- "torch.FloatStorage",
- "torch._utils._rebuild_tensor_v2"
File size: 135 MB (commit: "Upload 2.pt")
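The mangled `__torch__.___torch_mangle_*` class names and the `torch.jit._pickle.restore_type_tag` entry indicate that 2.pt is a TorchScript archive. The scan shown above works by walking the file's pickle opcode stream and collecting every class or function the pickle would import on load. A minimal sketch of that scan using only the standard library (the helper name `detect_pickle_imports` is my own, not part of any scanner's API):

```python
import pickle
import pickletools
from collections import OrderedDict

def detect_pickle_imports(data: bytes) -> set[str]:
    """Collect every module.name referenced by a GLOBAL opcode,
    i.e. the objects the pickle will import when unpickled.
    Handles protocol <= 2 streams (PyTorch's default protocol)."""
    imports = set()
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # genops yields the GLOBAL argument as "module name"
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
    return imports

# An OrderedDict pickles with a reference to collections.OrderedDict,
# one of the imports flagged in the scan above.
data = pickle.dumps(OrderedDict(a=1), protocol=2)
print(detect_pickle_imports(data))  # {'collections.OrderedDict'}
```

Protocol 4+ streams push the module and name as separate string opcodes before a `STACK_GLOBAL`, so a full scanner needs extra bookkeeping; for `torch.save` output, which defaults to protocol 2, `GLOBAL` alone suffices.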
itransformer1.pt
Detected Pickle imports (17):
- "collections.OrderedDict",
- "torch.nn.modules.normalization.LayerNorm",
- "torch._utils._rebuild_parameter",
- "torch.FloatStorage",
- "torch._C._nn.gelu",
- "__main__.EncoderLayer",
- "__builtin__.set",
- "__main__.FullAttention",
- "torch.nn.modules.container.ModuleList",
- "__main__.Model",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.modules.dropout.Dropout",
- "__main__.DataEmbedding_inverted",
- "__main__.Encoder",
- "torch.nn.modules.linear.Linear",
- "torch.nn.modules.conv.Conv1d",
- "__main__.AttentionLayer"
File size: 101 MB (commit: "Upload itransformer1.pt")
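By contrast, itransformer1.pt references classes under `__main__` (`Model`, `Encoder`, `AttentionLayer`, ...), which means the entire `nn.Module` object was pickled with `torch.save(model)`: loading it requires those exact class definitions to be importable, and `torch.load(..., weights_only=True)` will reject it. The usual fix is to save only the `state_dict`. A hedged sketch, with a stand-in `Model` class (the repo's real architecture classes live in its training code and are not reproduced here):

```python
import io
import torch
import torch.nn as nn

# Stand-in for the repo's Model; the real classes (Model, Encoder,
# AttentionLayer, DataEmbedding_inverted, ...) are assumptions here.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(4, 2)

model = Model()

# torch.save(model) would pickle __main__.Model itself, producing an
# import list like the one above. Saving the state_dict instead
# stores only tensors keyed by parameter name:
buf = io.BytesIO()
torch.save(model.state_dict(), buf)

# Tensors plus an OrderedDict load cleanly under weights_only=True
# (available since torch 1.13), which refuses to execute arbitrary
# pickled code on load.
buf.seek(0)
state = torch.load(buf, weights_only=True)
fresh = Model()
fresh.load_state_dict(state)
```

Consumers of the checkpoint then rebuild the model from its source class and call `load_state_dict`, rather than trusting the pickle to reconstruct objects for them.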