weixin_39726044
2020-12-27 10:24

From channels_first to channels_last?

Is it possible to convert a model from channels_first to channels_last (the default TensorFlow tensor dim ordering)?

This question originates from the open-source project: nerox8664/pytorch2keras

6 replies

  • weixin_39726044 4 months ago

    OK, I see the example here.

    However, I get roughly the same error as the one here:

    
    /Users/myuser/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
      from ._conv import register_converters as _register_converters
    Using TensorFlow backend.
    Traceback (most recent call last):
      File "resnet18_channels_last.py", line 19, in <module>
        k_model = pytorch_to_keras(model, input_var, (3, 224, 224,), verbose=True, change_ordering=True)
      File "/Users/myuser/anaconda3/lib/python3.6/site-packages/pytorch2keras/converter.py", line 98, in pytorch_to_keras
        trace.set_graph(_optimize_graph(trace.graph(), False))
      File "/Users/myuser/anaconda3/lib/python3.6/site-packages/pytorch2keras/converter.py", line 43, in _optimize_graph
        graph = torch._C._jit_pass_onnx(graph, aten)
    TypeError: _jit_pass_onnx(): incompatible function arguments. The following argument types are supported:
        1. (arg0: torch::jit::Graph, arg1: torch._C._onnx.OperatorExportTypes) -> torch::jit::Graph
    
    Invoked with: graph(%0 : Float(1, 3, 224, 224)
          %1 : Float(64, 3, 7, 7)
          %2 : Float(64)
          %3 : Float(64)
          %4 : Float(64)
          %5 : Float(64)
          %6 : Long()
          %7 : Float(64, 64, 3, 3)
          %8 : Float(64)
          %9 : Float(64)
          %10 : Float(64)
          %11 : Float(64)
          %12 : Long()
          %13 : Float(64, 64, 3, 3)
          %14 : Float(64)
          %15 : Float(64)
          %16 : Float(64)
          %17 : Float(64)
          %18 : Long()
          %19 : Float(64, 64, 3, 3)
          %20 : Float(64)
          %21 : Float(64)
          %22 : Float(64)
          %23 : Float(64)
          %24 : Long()
          %25 : Float(64, 64, 3, 3)
          %26 : Float(64)
          %27 : Float(64)
          %28 : Float(64)
          %29 : Float(64)
          %30 : Long()
          %31 : Float(128, 64, 3, 3)
          %32 : Float(128)
          %33 : Float(128)
          %34 : Float(128)
          %35 : Float(128)
          %36 : Long()
          %37 : Float(128, 128, 3, 3)
          %38 : Float(128)
          %39 : Float(128)
          %40 : Float(128)
          %41 : Float(128)
          %42 : Long()
          %43 : Float(128, 64, 1, 1)
          %44 : Float(128)
          %45 : Float(128)
          %46 : Float(128)
          %47 : Float(128)
          %48 : Long()
          %49 : Float(128, 128, 3, 3)
          %50 : Float(128)
          %51 : Float(128)
          %52 : Float(128)
          %53 : Float(128)
          %54 : Long()
          %55 : Float(128, 128, 3, 3)
          %56 : Float(128)
          %57 : Float(128)
          %58 : Float(128)
          %59 : Float(128)
          %60 : Long()
          %61 : Float(256, 128, 3, 3)
          %62 : Float(256)
          %63 : Float(256)
          %64 : Float(256)
          %65 : Float(256)
          %66 : Long()
          %67 : Float(256, 256, 3, 3)
          %68 : Float(256)
          %69 : Float(256)
          %70 : Float(256)
          %71 : Float(256)
          %72 : Long()
          %73 : Float(256, 128, 1, 1)
          %74 : Float(256)
          %75 : Float(256)
          %76 : Float(256)
          %77 : Float(256)
          %78 : Long()
          %79 : Float(256, 256, 3, 3)
          %80 : Float(256)
          %81 : Float(256)
          %82 : Float(256)
          %83 : Float(256)
          %84 : Long()
          %85 : Float(256, 256, 3, 3)
          %86 : Float(256)
          %87 : Float(256)
          %88 : Float(256)
          %89 : Float(256)
          %90 : Long()
          %91 : Float(512, 256, 3, 3)
          %92 : Float(512)
          %93 : Float(512)
          %94 : Float(512)
          %95 : Float(512)
          %96 : Long()
          %97 : Float(512, 512, 3, 3)
          %98 : Float(512)
          %99 : Float(512)
          %100 : Float(512)
          %101 : Float(512)
          %102 : Long()
          %103 : Float(512, 256, 1, 1)
          %104 : Float(512)
          %105 : Float(512)
          %106 : Float(512)
          %107 : Float(512)
          %108 : Long()
          %109 : Float(512, 512, 3, 3)
          %110 : Float(512)
          %111 : Float(512)
          %112 : Float(512)
          %113 : Float(512)
          %114 : Long()
          %115 : Float(512, 512, 3, 3)
          %116 : Float(512)
          %117 : Float(512)
          %118 : Float(512)
          %119 : Float(512)
          %120 : Long()
          %121 : Float(1000, 512)
          %122 : Float(1000)) {
      %123 : Dynamic = prim::Undefined(), scope: ResNet/Conv2d[conv1]
      %132 : Float(1, 64, 112, 112) = aten::_convolution[stride=[2, 2], padding=[3, 3], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%0, %1, %123), scope: ResNet/Conv2d[conv1]
      %137 : Float(1, 64, 112, 112) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%132, %2, %3, %4, %5), scope: ResNet/BatchNorm2d[bn1]
      %139 : Float(1, 64, 112, 112) = aten::threshold[threshold={0}, value={0}](%137), scope: ResNet/ReLU[relu]
      %142 : Float(1, 64, 56, 56), %143 : Long(1, 64, 56, 56) = aten::max_pool2d_with_indices[kernel_size=[3, 3], stride=[2, 2], padding=[1, 1], dilation=[1, 1], ceil_mode=0](%139), scope: ResNet/MaxPool2d[maxpool]
      %144 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer1]/BasicBlock[0]/Conv2d[conv1]
      %153 : Float(1, 64, 56, 56) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%142, %7, %144), scope: ResNet/Sequential[layer1]/BasicBlock[0]/Conv2d[conv1]
      %158 : Float(1, 64, 56, 56) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%153, %8, %9, %10, %11), scope: ResNet/Sequential[layer1]/BasicBlock[0]/BatchNorm2d[bn1]
      %160 : Float(1, 64, 56, 56) = aten::threshold[threshold={0}, value={0}](%158), scope: ResNet/Sequential[layer1]/BasicBlock[0]/ReLU[relu]
      %161 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer1]/BasicBlock[0]/Conv2d[conv2]
      %170 : Float(1, 64, 56, 56) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%160, %13, %161), scope: ResNet/Sequential[layer1]/BasicBlock[0]/Conv2d[conv2]
      %175 : Float(1, 64, 56, 56) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%170, %14, %15, %16, %17), scope: ResNet/Sequential[layer1]/BasicBlock[0]/BatchNorm2d[bn2]
      %176 : Float(1, 64, 56, 56) = aten::add[alpha={1}](%175, %142), scope: ResNet/Sequential[layer1]/BasicBlock[0]
      %178 : Float(1, 64, 56, 56) = aten::threshold[threshold={0}, value={0}](%176), scope: ResNet/Sequential[layer1]/BasicBlock[0]/ReLU[relu]
      %179 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer1]/BasicBlock[1]/Conv2d[conv1]
      %188 : Float(1, 64, 56, 56) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%178, %19, %179), scope: ResNet/Sequential[layer1]/BasicBlock[1]/Conv2d[conv1]
      %193 : Float(1, 64, 56, 56) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%188, %20, %21, %22, %23), scope: ResNet/Sequential[layer1]/BasicBlock[1]/BatchNorm2d[bn1]
      %195 : Float(1, 64, 56, 56) = aten::threshold[threshold={0}, value={0}](%193), scope: ResNet/Sequential[layer1]/BasicBlock[1]/ReLU[relu]
      %196 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer1]/BasicBlock[1]/Conv2d[conv2]
      %205 : Float(1, 64, 56, 56) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%195, %25, %196), scope: ResNet/Sequential[layer1]/BasicBlock[1]/Conv2d[conv2]
      %210 : Float(1, 64, 56, 56) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%205, %26, %27, %28, %29), scope: ResNet/Sequential[layer1]/BasicBlock[1]/BatchNorm2d[bn2]
      %211 : Float(1, 64, 56, 56) = aten::add[alpha={1}](%210, %178), scope: ResNet/Sequential[layer1]/BasicBlock[1]
      %213 : Float(1, 64, 56, 56) = aten::threshold[threshold={0}, value={0}](%211), scope: ResNet/Sequential[layer1]/BasicBlock[1]/ReLU[relu]
      %214 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Conv2d[conv1]
      %223 : Float(1, 128, 28, 28) = aten::_convolution[stride=[2, 2], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%213, %31, %214), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Conv2d[conv1]
      %228 : Float(1, 128, 28, 28) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%223, %32, %33, %34, %35), scope: ResNet/Sequential[layer2]/BasicBlock[0]/BatchNorm2d[bn1]
      %230 : Float(1, 128, 28, 28) = aten::threshold[threshold={0}, value={0}](%228), scope: ResNet/Sequential[layer2]/BasicBlock[0]/ReLU[relu]
      %231 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Conv2d[conv2]
      %240 : Float(1, 128, 28, 28) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%230, %37, %231), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Conv2d[conv2]
      %245 : Float(1, 128, 28, 28) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%240, %38, %39, %40, %41), scope: ResNet/Sequential[layer2]/BasicBlock[0]/BatchNorm2d[bn2]
      %246 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %255 : Float(1, 128, 28, 28) = aten::_convolution[stride=[2, 2], padding=[0, 0], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%213, %43, %246), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %260 : Float(1, 128, 28, 28) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%255, %44, %45, %46, %47), scope: ResNet/Sequential[layer2]/BasicBlock[0]/Sequential[downsample]/BatchNorm2d[1]
      %261 : Float(1, 128, 28, 28) = aten::add[alpha={1}](%245, %260), scope: ResNet/Sequential[layer2]/BasicBlock[0]
      %263 : Float(1, 128, 28, 28) = aten::threshold[threshold={0}, value={0}](%261), scope: ResNet/Sequential[layer2]/BasicBlock[0]/ReLU[relu]
      %264 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer2]/BasicBlock[1]/Conv2d[conv1]
      %273 : Float(1, 128, 28, 28) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%263, %49, %264), scope: ResNet/Sequential[layer2]/BasicBlock[1]/Conv2d[conv1]
      %278 : Float(1, 128, 28, 28) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%273, %50, %51, %52, %53), scope: ResNet/Sequential[layer2]/BasicBlock[1]/BatchNorm2d[bn1]
      %280 : Float(1, 128, 28, 28) = aten::threshold[threshold={0}, value={0}](%278), scope: ResNet/Sequential[layer2]/BasicBlock[1]/ReLU[relu]
      %281 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer2]/BasicBlock[1]/Conv2d[conv2]
      %290 : Float(1, 128, 28, 28) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%280, %55, %281), scope: ResNet/Sequential[layer2]/BasicBlock[1]/Conv2d[conv2]
      %295 : Float(1, 128, 28, 28) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%290, %56, %57, %58, %59), scope: ResNet/Sequential[layer2]/BasicBlock[1]/BatchNorm2d[bn2]
      %296 : Float(1, 128, 28, 28) = aten::add[alpha={1}](%295, %263), scope: ResNet/Sequential[layer2]/BasicBlock[1]
      %298 : Float(1, 128, 28, 28) = aten::threshold[threshold={0}, value={0}](%296), scope: ResNet/Sequential[layer2]/BasicBlock[1]/ReLU[relu]
      %299 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Conv2d[conv1]
      %308 : Float(1, 256, 14, 14) = aten::_convolution[stride=[2, 2], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%298, %61, %299), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Conv2d[conv1]
      %313 : Float(1, 256, 14, 14) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%308, %62, %63, %64, %65), scope: ResNet/Sequential[layer3]/BasicBlock[0]/BatchNorm2d[bn1]
      %315 : Float(1, 256, 14, 14) = aten::threshold[threshold={0}, value={0}](%313), scope: ResNet/Sequential[layer3]/BasicBlock[0]/ReLU[relu]
      %316 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Conv2d[conv2]
      %325 : Float(1, 256, 14, 14) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%315, %67, %316), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Conv2d[conv2]
      %330 : Float(1, 256, 14, 14) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%325, %68, %69, %70, %71), scope: ResNet/Sequential[layer3]/BasicBlock[0]/BatchNorm2d[bn2]
      %331 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %340 : Float(1, 256, 14, 14) = aten::_convolution[stride=[2, 2], padding=[0, 0], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%298, %73, %331), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %345 : Float(1, 256, 14, 14) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%340, %74, %75, %76, %77), scope: ResNet/Sequential[layer3]/BasicBlock[0]/Sequential[downsample]/BatchNorm2d[1]
      %346 : Float(1, 256, 14, 14) = aten::add[alpha={1}](%330, %345), scope: ResNet/Sequential[layer3]/BasicBlock[0]
      %348 : Float(1, 256, 14, 14) = aten::threshold[threshold={0}, value={0}](%346), scope: ResNet/Sequential[layer3]/BasicBlock[0]/ReLU[relu]
      %349 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer3]/BasicBlock[1]/Conv2d[conv1]
      %358 : Float(1, 256, 14, 14) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%348, %79, %349), scope: ResNet/Sequential[layer3]/BasicBlock[1]/Conv2d[conv1]
      %363 : Float(1, 256, 14, 14) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%358, %80, %81, %82, %83), scope: ResNet/Sequential[layer3]/BasicBlock[1]/BatchNorm2d[bn1]
      %365 : Float(1, 256, 14, 14) = aten::threshold[threshold={0}, value={0}](%363), scope: ResNet/Sequential[layer3]/BasicBlock[1]/ReLU[relu]
      %366 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer3]/BasicBlock[1]/Conv2d[conv2]
      %375 : Float(1, 256, 14, 14) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%365, %85, %366), scope: ResNet/Sequential[layer3]/BasicBlock[1]/Conv2d[conv2]
      %380 : Float(1, 256, 14, 14) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%375, %86, %87, %88, %89), scope: ResNet/Sequential[layer3]/BasicBlock[1]/BatchNorm2d[bn2]
      %381 : Float(1, 256, 14, 14) = aten::add[alpha={1}](%380, %348), scope: ResNet/Sequential[layer3]/BasicBlock[1]
      %383 : Float(1, 256, 14, 14) = aten::threshold[threshold={0}, value={0}](%381), scope: ResNet/Sequential[layer3]/BasicBlock[1]/ReLU[relu]
      %384 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Conv2d[conv1]
      %393 : Float(1, 512, 7, 7) = aten::_convolution[stride=[2, 2], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%383, %91, %384), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Conv2d[conv1]
      %398 : Float(1, 512, 7, 7) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%393, %92, %93, %94, %95), scope: ResNet/Sequential[layer4]/BasicBlock[0]/BatchNorm2d[bn1]
      %400 : Float(1, 512, 7, 7) = aten::threshold[threshold={0}, value={0}](%398), scope: ResNet/Sequential[layer4]/BasicBlock[0]/ReLU[relu]
      %401 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Conv2d[conv2]
      %410 : Float(1, 512, 7, 7) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%400, %97, %401), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Conv2d[conv2]
      %415 : Float(1, 512, 7, 7) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%410, %98, %99, %100, %101), scope: ResNet/Sequential[layer4]/BasicBlock[0]/BatchNorm2d[bn2]
      %416 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %425 : Float(1, 512, 7, 7) = aten::_convolution[stride=[2, 2], padding=[0, 0], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%383, %103, %416), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Sequential[downsample]/Conv2d[0]
      %430 : Float(1, 512, 7, 7) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%425, %104, %105, %106, %107), scope: ResNet/Sequential[layer4]/BasicBlock[0]/Sequential[downsample]/BatchNorm2d[1]
      %431 : Float(1, 512, 7, 7) = aten::add[alpha={1}](%415, %430), scope: ResNet/Sequential[layer4]/BasicBlock[0]
      %433 : Float(1, 512, 7, 7) = aten::threshold[threshold={0}, value={0}](%431), scope: ResNet/Sequential[layer4]/BasicBlock[0]/ReLU[relu]
      %434 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer4]/BasicBlock[1]/Conv2d[conv1]
      %443 : Float(1, 512, 7, 7) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%433, %109, %434), scope: ResNet/Sequential[layer4]/BasicBlock[1]/Conv2d[conv1]
      %448 : Float(1, 512, 7, 7) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%443, %110, %111, %112, %113), scope: ResNet/Sequential[layer4]/BasicBlock[1]/BatchNorm2d[bn1]
      %450 : Float(1, 512, 7, 7) = aten::threshold[threshold={0}, value={0}](%448), scope: ResNet/Sequential[layer4]/BasicBlock[1]/ReLU[relu]
      %451 : Dynamic = prim::Undefined(), scope: ResNet/Sequential[layer4]/BasicBlock[1]/Conv2d[conv2]
      %460 : Float(1, 512, 7, 7) = aten::_convolution[stride=[1, 1], padding=[1, 1], dilation=[1, 1], transposed=0, output_padding=[0, 0], groups=1, benchmark=0, deterministic=0, cudnn_enabled=1](%450, %115, %451), scope: ResNet/Sequential[layer4]/BasicBlock[1]/Conv2d[conv2]
      %465 : Float(1, 512, 7, 7) = aten::batch_norm[training=0, momentum=0, eps=1e-05, cudnn_enabled=1](%460, %116, %117, %118, %119), scope: ResNet/Sequential[layer4]/BasicBlock[1]/BatchNorm2d[bn2]
      %466 : Float(1, 512, 7, 7) = aten::add[alpha={1}](%465, %433), scope: ResNet/Sequential[layer4]/BasicBlock[1]
      %468 : Float(1, 512, 7, 7) = aten::threshold[threshold={0}, value={0}](%466), scope: ResNet/Sequential[layer4]/BasicBlock[1]/ReLU[relu]
      %470 : Float(1, 512, 1, 1) = aten::avg_pool2d[kernel_size=[7, 7], stride=[1, 1], padding=[0, 0], ceil_mode=0, count_include_pad=1](%468), scope: ResNet/AvgPool2d[avgpool]
      %471 : Long() = aten::size[dim=0](%470), scope: ResNet
      %472 : Long() = prim::Constant[value={-1}](), scope: ResNet
      %473 : Dynamic = aten::stack[dim=0](%471, %472), scope: ResNet
      %474 : Float(1, 512) = aten::view(%470, %473), scope: ResNet
      %475 : Float(512!, 1000!) = aten::t(%121), scope: ResNet/Linear[fc]
      %476 : Float(1, 1000) = aten::expand[size=[1, 1000], implicit=1](%122), scope: ResNet/Linear[fc]
      %477 : Float(1, 1000) = aten::addmm[beta={1}, alpha={1}](%476, %474, %475), scope: ResNet/Linear[fc]
      return (%477);
    }
    , False
  • weixin_39726044 4 months ago

    It turned out to be an incompatibility between ONNX and PyTorch 0.4.1; I downgraded to 0.4.0 and it works (see the version-check sketch after the log below).

    However, the conversion seems to run in an infinite loop.

    Should it stop after the terminal layer?

    
    graph node: ResNet/Linear[fc]
    type: onnx::Gemm
    inputs: ['170']
    outputs: ['ResNet/Linear[fc]']
    name in state_dict: fc
    attrs: {'alpha': 1.0, 'beta': 1.0, 'broadcast': 1, 'transB': 1}
    is_terminal: True
    Converting Linear ...
    (1, 1000) (1, 1000)
    8.34465e-07
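
    For reference, a minimal version guard I now keep at the top of my conversion script (a sketch; the 0.4.0/0.4.1 versions are the ones discussed in this thread, and the assertion itself is just plain Python):

    import torch

    # pytorch2keras traced the graph fine for me on PyTorch 0.4.0;
    # 0.4.1 raised the _jit_pass_onnx() TypeError shown above.
    assert torch.__version__.startswith('0.4.0'), (
        'expected PyTorch 0.4.0, got %s' % torch.__version__)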
    
  • weixin_39726044 4 months ago

    Looks like it ran 10 times; why is that needed? https://github.com/nerox8664/pytorch2keras/blob/master/tests/resnet18_channels_last.py#L10

  • weixin_39726044 4 months ago

    Seems the model converted successfully.

    I wonder why it has such crazy layer names, though?

    model.summary():

    
    /Users/myuser/anaconda3/lib/python3.6/site-packages/keras/engine/saving.py:269: UserWarning: No training configuration found in save file: the model was *not* compiled. Compile it manually.
      warnings.warn('No training configuration found in save file: '
    __________________________________________________________________________________________________
    Layer (type)                    Output Shape         Param #     Connected to
    ==================================================================================================
    input0 (InputLayer)             (None, 224, 224, 3)  0
    __________________________________________________________________________________________________
    conv10.6813332953790028_pad (Ze (None, 230, 230, 3)  0           input0[0][0]
    __________________________________________________________________________________________________
    conv10.6813332953790028 (Conv2D (None, 112, 112, 64) 9408        conv10.6813332953790028_pad[0][0]
    __________________________________________________________________________________________________
    bn10.8727991225242661 (BatchNor (None, 112, 112, 64) 256         conv10.6813332953790028[0][0]
    __________________________________________________________________________________________________
    relu0.0762975389489895 (Activat (None, 112, 112, 64) 0           bn10.8727991225242661[0][0]
    __________________________________________________________________________________________________
    maxpool0.46401237025593167_pad  (None, 114, 114, 64) 0           relu0.0762975389489895[0][0]
    __________________________________________________________________________________________________
    maxpool0.46401237025593167 (Max (None, 56, 56, 64)   0           maxpool0.46401237025593167_pad[0]
    __________________________________________________________________________________________________
    layer1.0.conv10.931994928332444 (None, 58, 58, 64)   0           maxpool0.46401237025593167[0][0]
    __________________________________________________________________________________________________
    layer1.0.conv10.931994928332444 (None, 56, 56, 64)   36864       layer1.0.conv10.931994928332444_p
    __________________________________________________________________________________________________
    layer1.0.bn10.33003393587213026 (None, 56, 56, 64)   256         layer1.0.conv10.931994928332444[0
    __________________________________________________________________________________________________
    layer1.0.relu0.5105342845232043 (None, 56, 56, 64)   0           layer1.0.bn10.33003393587213026[0
    __________________________________________________________________________________________________
    layer1.0.conv20.857235570511130 (None, 58, 58, 64)   0           layer1.0.relu0.5105342845232043[0
    __________________________________________________________________________________________________
    layer1.0.conv20.857235570511130 (None, 56, 56, 64)   36864       layer1.0.conv20.8572355705111306_
    __________________________________________________________________________________________________
    layer1.0.bn20.02824422739903831 (None, 56, 56, 64)   256         layer1.0.conv20.8572355705111306[
    __________________________________________________________________________________________________
    layer1.00.642419769744416 (Add) (None, 56, 56, 64)   0           layer1.0.bn20.028244227399038313[
                                                                     maxpool0.46401237025593167[0][0]
    __________________________________________________________________________________________________
    layer1.0.relu0.6768286757175926 (None, 56, 56, 64)   0           layer1.00.642419769744416[0][0]
    __________________________________________________________________________________________________
    layer1.1.conv10.787209419834812 (None, 58, 58, 64)   0           layer1.0.relu0.6768286757175926[0
    __________________________________________________________________________________________________
    layer1.1.conv10.787209419834812 (None, 56, 56, 64)   36864       layer1.1.conv10.7872094198348123_
    __________________________________________________________________________________________________
    layer1.1.bn10.5452208350720054  (None, 56, 56, 64)   256         layer1.1.conv10.7872094198348123[
    __________________________________________________________________________________________________
    layer1.1.relu0.6776133470926834 (None, 56, 56, 64)   0           layer1.1.bn10.5452208350720054[0]
    __________________________________________________________________________________________________
    layer1.1.conv20.188700609480209 (None, 58, 58, 64)   0           layer1.1.relu0.6776133470926834[0
    __________________________________________________________________________________________________
    layer1.1.conv20.188700609480209 (None, 56, 56, 64)   36864       layer1.1.conv20.1887006094802094_
    __________________________________________________________________________________________________
    layer1.1.bn20.23391578259868384 (None, 56, 56, 64)   256         layer1.1.conv20.1887006094802094[
    __________________________________________________________________________________________________
    layer1.10.48013268723104563 (Ad (None, 56, 56, 64)   0           layer1.1.bn20.23391578259868384[0
                                                                     layer1.0.relu0.6768286757175926[0
    __________________________________________________________________________________________________
    layer1.1.relu0.7082533448501714 (None, 56, 56, 64)   0           layer1.10.48013268723104563[0][0]
    __________________________________________________________________________________________________
    layer2.0.conv10.994322120898686 (None, 58, 58, 64)   0           layer1.1.relu0.7082533448501714[0
    __________________________________________________________________________________________________
    layer2.0.conv10.994322120898686 (None, 28, 28, 128)  73728       layer2.0.conv10.9943221208986862_
    __________________________________________________________________________________________________
    layer2.0.bn10.9709402516445216  (None, 28, 28, 128)  512         layer2.0.conv10.9943221208986862[
    __________________________________________________________________________________________________
    layer2.0.relu0.1767392069315458 (None, 28, 28, 128)  0           layer2.0.bn10.9709402516445216[0]
    __________________________________________________________________________________________________
    layer2.0.conv20.879285591476006 (None, 30, 30, 128)  0           layer2.0.relu0.17673920693154588[
    __________________________________________________________________________________________________
    layer2.0.conv20.879285591476006 (None, 28, 28, 128)  147456      layer2.0.conv20.8792855914760066_
    __________________________________________________________________________________________________
    layer2.0.downsample.00.47938702 (None, 28, 28, 128)  8192        layer1.1.relu0.7082533448501714[0
    __________________________________________________________________________________________________
    layer2.0.bn20.7672648881198495  (None, 28, 28, 128)  512         layer2.0.conv20.8792855914760066[
    __________________________________________________________________________________________________
    layer2.0.downsample.10.84267711 (None, 28, 28, 128)  512         layer2.0.downsample.00.4793870240
    __________________________________________________________________________________________________
    layer2.00.5234198561543427 (Add (None, 28, 28, 128)  0           layer2.0.bn20.7672648881198495[0]
                                                                     layer2.0.downsample.10.8426771105
    __________________________________________________________________________________________________
    layer2.0.relu0.3510940582260176 (None, 28, 28, 128)  0           layer2.00.5234198561543427[0][0]
    __________________________________________________________________________________________________
    layer2.1.conv10.338914760139542 (None, 30, 30, 128)  0           layer2.0.relu0.3510940582260176[0
    __________________________________________________________________________________________________
    layer2.1.conv10.338914760139542 (None, 28, 28, 128)  147456      layer2.1.conv10.3389147601395428_
    __________________________________________________________________________________________________
    layer2.1.bn10.06486537809156145 (None, 28, 28, 128)  512         layer2.1.conv10.3389147601395428[
    __________________________________________________________________________________________________
    layer2.1.relu0.4453902932356591 (None, 28, 28, 128)  0           layer2.1.bn10.06486537809156145[0
    __________________________________________________________________________________________________
    layer2.1.conv20.417673796744111 (None, 30, 30, 128)  0           layer2.1.relu0.44539029323565915[
    __________________________________________________________________________________________________
    layer2.1.conv20.417673796744111 (None, 28, 28, 128)  147456      layer2.1.conv20.4176737967441114_
    __________________________________________________________________________________________________
    layer2.1.bn20.6070509547639543  (None, 28, 28, 128)  512         layer2.1.conv20.4176737967441114[
    __________________________________________________________________________________________________
    layer2.10.28343579008767805 (Ad (None, 28, 28, 128)  0           layer2.1.bn20.6070509547639543[0]
                                                                     layer2.0.relu0.3510940582260176[0
    __________________________________________________________________________________________________
    layer2.1.relu0.0302224200621794 (None, 28, 28, 128)  0           layer2.10.28343579008767805[0][0]
    __________________________________________________________________________________________________
    layer3.0.conv10.528574815532840 (None, 30, 30, 128)  0           layer2.1.relu0.030222420062179456
    __________________________________________________________________________________________________
    layer3.0.conv10.528574815532840 (None, 14, 14, 256)  294912      layer3.0.conv10.5285748155328408_
    __________________________________________________________________________________________________
    layer3.0.bn10.6447357510605776  (None, 14, 14, 256)  1024        layer3.0.conv10.5285748155328408[
    __________________________________________________________________________________________________
    layer3.0.relu0.9495236199845462 (None, 14, 14, 256)  0           layer3.0.bn10.6447357510605776[0]
    __________________________________________________________________________________________________
    layer3.0.conv20.637238328326483 (None, 16, 16, 256)  0           layer3.0.relu0.9495236199845462[0
    __________________________________________________________________________________________________
    layer3.0.conv20.637238328326483 (None, 14, 14, 256)  589824      layer3.0.conv20.6372383283264834_
    __________________________________________________________________________________________________
    layer3.0.downsample.00.56558425 (None, 14, 14, 256)  32768       layer2.1.relu0.030222420062179456
    __________________________________________________________________________________________________
    layer3.0.bn20.5106677624083356  (None, 14, 14, 256)  1024        layer3.0.conv20.6372383283264834[
    __________________________________________________________________________________________________
    layer3.0.downsample.10.44508294 (None, 14, 14, 256)  1024        layer3.0.downsample.00.5655842566
    __________________________________________________________________________________________________
    layer3.00.7041955783493549 (Add (None, 14, 14, 256)  0           layer3.0.bn20.5106677624083356[0]
                                                                     layer3.0.downsample.10.4450829471
    __________________________________________________________________________________________________
    layer3.0.relu0.3397199519808391 (None, 14, 14, 256)  0           layer3.00.7041955783493549[0][0]
    __________________________________________________________________________________________________
    layer3.1.conv10.984057373353050 (None, 16, 16, 256)  0           layer3.0.relu0.33971995198083915[
    __________________________________________________________________________________________________
    layer3.1.conv10.984057373353050 (None, 14, 14, 256)  589824      layer3.1.conv10.9840573733530503_
    __________________________________________________________________________________________________
    layer3.1.bn10.22112780918124042 (None, 14, 14, 256)  1024        layer3.1.conv10.9840573733530503[
    __________________________________________________________________________________________________
    layer3.1.relu0.294045747775604  (None, 14, 14, 256)  0           layer3.1.bn10.22112780918124042[0
    __________________________________________________________________________________________________
    layer3.1.conv20.120659910516747 (None, 16, 16, 256)  0           layer3.1.relu0.294045747775604[0]
    __________________________________________________________________________________________________
    layer3.1.conv20.120659910516747 (None, 14, 14, 256)  589824      layer3.1.conv20.12065991051674774
    __________________________________________________________________________________________________
    layer3.1.bn20.768759930030139 ( (None, 14, 14, 256)  1024        layer3.1.conv20.12065991051674774
    __________________________________________________________________________________________________
    layer3.10.34664998770988364 (Ad (None, 14, 14, 256)  0           layer3.1.bn20.768759930030139[0][
                                                                     layer3.0.relu0.33971995198083915[
    __________________________________________________________________________________________________
    layer3.1.relu0.4939033306939159 (None, 14, 14, 256)  0           layer3.10.34664998770988364[0][0]
    __________________________________________________________________________________________________
    layer4.0.conv10.185602965073437 (None, 16, 16, 256)  0           layer3.1.relu0.4939033306939159[0
    __________________________________________________________________________________________________
    layer4.0.conv10.185602965073437 (None, 7, 7, 512)    1179648     layer4.0.conv10.18560296507343776
    __________________________________________________________________________________________________
    layer4.0.bn10.27108769093698404 (None, 7, 7, 512)    2048        layer4.0.conv10.18560296507343776
    __________________________________________________________________________________________________
    layer4.0.relu0.808376356687133  (None, 7, 7, 512)    0           layer4.0.bn10.27108769093698404[0
    __________________________________________________________________________________________________
    layer4.0.conv20.763456491008416 (None, 9, 9, 512)    0           layer4.0.relu0.808376356687133[0]
    __________________________________________________________________________________________________
    layer4.0.conv20.763456491008416 (None, 7, 7, 512)    2359296     layer4.0.conv20.7634564910084164_
    __________________________________________________________________________________________________
    layer4.0.downsample.00.53280039 (None, 7, 7, 512)    131072      layer3.1.relu0.4939033306939159[0
    __________________________________________________________________________________________________
    layer4.0.bn20.26077576589740015 (None, 7, 7, 512)    2048        layer4.0.conv20.7634564910084164[
    __________________________________________________________________________________________________
    layer4.0.downsample.10.21738573 (None, 7, 7, 512)    2048        layer4.0.downsample.00.5328003907
    __________________________________________________________________________________________________
    layer4.00.6710765419071708 (Add (None, 7, 7, 512)    0           layer4.0.bn20.26077576589740015[0
                                                                     layer4.0.downsample.10.2173857374
    __________________________________________________________________________________________________
    layer4.0.relu0.6607905576857227 (None, 7, 7, 512)    0           layer4.00.6710765419071708[0][0]
    __________________________________________________________________________________________________
    layer4.1.conv10.312131909236327 (None, 9, 9, 512)    0           layer4.0.relu0.6607905576857227[0
    __________________________________________________________________________________________________
    layer4.1.conv10.312131909236327 (None, 7, 7, 512)    2359296     layer4.1.conv10.3121319092363273_
    __________________________________________________________________________________________________
    layer4.1.bn10.00891863344954979 (None, 7, 7, 512)    2048        layer4.1.conv10.3121319092363273[
    __________________________________________________________________________________________________
    layer4.1.relu0.0372878286716447 (None, 7, 7, 512)    0           layer4.1.bn10.008918633449549795[
    __________________________________________________________________________________________________
    layer4.1.conv20.036566393821319 (None, 9, 9, 512)    0           layer4.1.relu0.03728782867164471[
    __________________________________________________________________________________________________
    layer4.1.conv20.036566393821319 (None, 7, 7, 512)    2359296     layer4.1.conv20.03656639382131943
    __________________________________________________________________________________________________
    layer4.1.bn20.21967901246838784 (None, 7, 7, 512)    2048        layer4.1.conv20.03656639382131943
    __________________________________________________________________________________________________
    layer4.10.44936057300402865 (Ad (None, 7, 7, 512)    0           layer4.1.bn20.21967901246838784[0
                                                                     layer4.0.relu0.6607905576857227[0
    __________________________________________________________________________________________________
    layer4.1.relu0.4308629582399543 (None, 7, 7, 512)    0           layer4.10.44936057300402865[0][0]
    __________________________________________________________________________________________________
    avgpool0.819584693782756 (Avera (None, 1, 1, 512)    0           layer4.1.relu0.43086295823995435[
    __________________________________________________________________________________________________
    0.45029683114238983 (Flatten)   (None, 512)          0           avgpool0.819584693782756[0][0]
    __________________________________________________________________________________________________
    fc0.7954072187540818 (Dense)    (None, 1000)         513000      0.45029683114238983[0][0]
    ==================================================================================================
    Total params: 11,699,112
    Trainable params: 11,689,512
    Non-trainable params: 9,600
    __________________________________________________________________________________________________
    None
    
  • weixin_39669163 4 months ago

    Hello there. I run every test 10 (or more) times to check the model conversion with random weights, so if you just want to convert ResNet you don't need to run the test; you can convert it in a single iteration. Also, don't forget to load proper (not random) weights. About the crazy layer names: in PyTorch it's possible to have multiple layers with the same name, but that would be invalid in Keras, so I decided to add a random suffix. If you want more compact names, you can use the short_names flag, as sketched below.
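
    A minimal single-iteration sketch (assuming the pytorch_to_keras signature from the traceback above; short_names is the flag mentioned here, and resnet18(pretrained=True) stands in for "proper weights"):

    import numpy as np
    import torch
    from torch.autograd import Variable
    from torchvision.models import resnet18
    from pytorch2keras.converter import pytorch_to_keras

    model = resnet18(pretrained=True)  # proper, not random, weights
    model.eval()

    # Trace once with a fixed-size dummy input.
    input_np = np.random.uniform(0, 1, (1, 3, 224, 224)).astype(np.float32)
    input_var = Variable(torch.FloatTensor(input_np))

    # short_names=True keeps Keras layer names compact instead of
    # appending a random suffix to every PyTorch layer name.
    k_model = pytorch_to_keras(model, input_var, (3, 224, 224,),
                               verbose=True, short_names=True)
    print(k_model.summary())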

  • weixin_39669163 4 months ago

    Hello. Yes, you can use the flag change_ordering=True to convert a model to channels_last mode; a sketch is below.
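
    For completeness, a sketch of that call (same signature as the one in the traceback earlier in this thread; the setup lines are the usual tracing boilerplate, not part of the flag itself):

    import numpy as np
    import torch
    from torch.autograd import Variable
    from torchvision.models import resnet18
    from pytorch2keras.converter import pytorch_to_keras

    model = resnet18(pretrained=True)
    model.eval()

    input_var = Variable(torch.FloatTensor(
        np.random.uniform(0, 1, (1, 3, 224, 224)).astype(np.float32)))

    # change_ordering=True rewrites the graph for channels_last, so the
    # resulting Keras model expects NHWC input: (None, 224, 224, 3).
    k_model = pytorch_to_keras(model, input_var, (3, 224, 224,),
                               verbose=True, change_ordering=True)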

