Informer takes four inputs; here is how I build them and export the model:
batch_x = batch_x.float().to(self.device)
batch_y = batch_y.float().to(self.device)
batch_x_mark = batch_x_mark.float().to(self.device)
batch_y_mark = batch_y_mark.float().to(self.device)
# Decoder input: zeros for the prediction horizon, with the last label_len
# steps of batch_y prepended as the known context.
dec_inp = torch.zeros_like(batch_y[:, -self.args.pred_len:, :]).float()
dec_inp = torch.cat([batch_y[:, :self.args.label_len, :], dec_inp], dim=1).float().to(self.device)
input_names = ('x_enc', 'x_mark_enc', 'x_dec', 'x_mark_dec')
model_input = (batch_x, batch_x_mark, dec_inp, batch_y_mark)
torch.onnx.export(self.model, model_input, f='model.onnx', input_names=input_names)
Thanks everyone, the astype error is now fixed, but the export fails with a new error I can't make sense of:
torch.onnx.errors.SymbolicValueError: ONNX symbolic expected a constant value of the 'high' argument, got '209 defined in (%209 : int[] = prim::ListConstruct(%182, %208), scope: models.Informer.Model::/layers.Transformer_EncDec.Encoder::encoder/layers.Transformer_EncDec.EncoderLayer::attn_layers.0/layers.SelfAttention_Family.AttentionLayer::attention/layers.SelfAttention_Family.ProbAttention::inner_attention
)' [Caused by the value '209 defined in (%209 : int[] = prim::ListConstruct(%182, %208), scope: models.Informer.Model::/layers.Transformer_EncDec.Encoder::encoder/layers.Transformer_EncDec.EncoderLayer::attn_layers.0/layers.SelfAttention_Family.AttentionLayer::attention/layers.SelfAttention_Family.ProbAttention::inner_attention
)' (type 'List[int]') in the TorchScript graph. The containing node has kind 'prim::ListConstruct'.]
Inputs:
#0: 182 defined in (%182 : Long(requires_grad=0, device=cpu) = onnx::Constant[value={36}](), scope: models.Informer.Model::/layers.Transformer_EncDec.Encoder::encoder/layers.Transformer_EncDec.EncoderLayer::attn_layers.0/layers.SelfAttention_Family.AttentionLayer::attention/layers.SelfAttention_Family.ProbAttention::inner_attention # /home/ljh/test/FEDformer-master/layers/SelfAttention_Family.py:56:0
) (type 'Tensor')
#1: 208 defined in (%208 : Long(device=cpu) = onnx::Constant[value={12}](), scope: models.Informer.Model::/layers.Transformer_EncDec.Encoder::encoder/layers.Transformer_EncDec.EncoderLayer::attn_layers.0/layers.SelfAttention_Family.AttentionLayer::attention/layers.SelfAttention_Family.ProbAttention::inner_attention
) (type 'Tensor')
Outputs:
#0: 209 defined in (%209 : int[] = prim::ListConstruct(%182, %208), scope: models.Informer.Model::/layers.Transformer_EncDec.Encoder::encoder/layers.Transformer_EncDec.EncoderLayer::attn_layers.0/layers.SelfAttention_Family.AttentionLayer::attention/layers.SelfAttention_Family.ProbAttention::inner_attention
) (type 'List[int]')
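The trace points at ProbAttention's random sampling in `layers/SelfAttention_Family.py` (line 56), where `torch.randint` receives arguments built from traced tensor shapes (the constants 36 and 12 in the graph) instead of Python constants, which the ONNX symbolic for `randint` rejects. One commonly suggested workaround (an assumption, not verified against this exact codebase) is to cast those shape values to `int` before the call, which bakes them into the graph as constants at the cost of fixing the sequence length:

```python
import torch

def sample_indices(L_K, L_Q, sample_k):
    # Casting traced shape values to Python ints turns them into graph
    # constants, which the ONNX randint symbolic can handle.
    return torch.randint(int(L_K), (int(L_Q), int(sample_k)))

# The trace shows 36 and 12 at the failing call site; L_Q=25 is illustrative.
idx = sample_indices(torch.tensor(36), 25, 12)
print(idx.shape)
```

Note that this sampling is random at every forward pass, so even after a successful export, the ONNX graph will freeze or re-randomize the sampled indices depending on how the exporter handles `randint`; outputs may not match the PyTorch model exactly.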