Closed
Labels
module: onnx (Related to torch.onnx)
Description
Using torch 0.4.0a0+408c84d, with tracing code adapted from #4200,
F.log_softmax is interpreted as:
%29 : Float(13, 10) = Softmax[axis=1](%28) <-- missing scope name
%30 : Float(13, 10) = Log(%29), scope: Net1
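A minimal sketch to reproduce this trace. The `Net1` module here is hypothetical, chosen only to match the `Float(13, 10)` shapes above, and it uses the current `torch.jit.trace` API rather than the 0.4.0 snapshot the report was made against:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net1(nn.Module):
    """Hypothetical module matching the Float(13, 10) shapes in the trace."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(13, 10)

    def forward(self, x):
        # F.log_softmax is traced as a Softmax node followed by a Log node;
        # the Softmax node is the one missing its scope name in the report.
        return F.log_softmax(self.fc(x), dim=1)

traced = torch.jit.trace(Net1(), torch.randn(13, 13))
print(traced.graph)  # inspect the per-node scope annotations
```

Printing `traced.graph` shows one node per traced op, which is where the missing scope on the `Softmax` node is visible.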
- Tracing result of VGG19/AlexNet from the torchvision models: the scope of Conv2d layers is
ReLU[x] instead of Conv2d[x]
alexnet:
%17 : Float(1, 64, 55, 55) = Conv[dilations=[1, 1], group=1, kernel_shape=[11, 11], pads=[2, 2, 2, 2], strides=[4, 4]](%0, %1, %2), scope: AlexNet/Sequential[features]/**ReLU[1]**
%18 : Float(1, 64, 55, 55) = Relu(%17), scope: AlexNet/Sequential[features]/ReLU[1]
%19 : Float(1, 64, 27, 27) = MaxPool[kernel_shape=[3, 3], pads=[0, 0, 0, 0], strides=[2, 2]](%18), scope: AlexNet/Sequential[features]/MaxPool2d[2]
%20 : Float(1, 192, 27, 27) = Conv[dilations=[1, 1], group=1, kernel_shape=[5, 5], pads=[2, 2, 2, 2], strides=[1, 1]](%19, %3, %4), scope: AlexNet/Sequential[features]/**ReLU[4]**
%21 : Float(1, 192, 27, 27) = Relu(%20), scope: AlexNet/Sequential[features]/ReLU[4]
%22 : Float(1, 192, 13, 13) = MaxPool[kernel_shape=[3, 3], pads=[0, 0, 0, 0], strides=[2, 2]](%21), scope: AlexNet/Sequential[features]/MaxPool2d[5]
vgg19:
%39 : Float(1, 64, 224, 224) = Conv[dilations=[1, 1], group=1, kernel_shape=[3, 3], pads=[1, 1, 1, 1], strides=[1, 1]](%0, %1, %2), scope: VGG/Sequential[features]/ReLU[1]
%40 : Float(1, 64, 224, 224) = Relu(%39), scope: VGG/Sequential[features]/ReLU[1]
%41 : Float(1, 64, 224, 224) = Conv[dilations=[1, 1], group=1, kernel_shape=[3, 3], pads=[1, 1, 1, 1], strides=[1, 1]](%40, %3, %4), scope: VGG/Sequential[features]/ReLU[3]
%42 : Float(1, 64, 224, 224) = Relu(%41), scope: VGG/Sequential[features]/ReLU[3]
%43 : Float(1, 64, 112, 112) = MaxPool[kernel_shape=[2, 2], pads=[0, 0, 0, 0], strides=[2, 2]](%42), scope: VGG/Sequential[features]/MaxPool2d[4]
%44 : Float(1, 128, 112, 112) = Conv[dilations=[1, 1], group=1, kernel_shape=[3, 3], pads=[1, 1, 1, 1], strides=[1, 1]](%43, %5, %6), scope: VGG/Sequential[features]/ReLU[6]
%45 : Float(1, 128, 112, 112) = Relu(%44), scope: VGG/Sequential[features]/ReLU[6]
%46 : Float(1, 128, 112, 112) = Conv[dilations=[1, 1], group=1, kernel_shape=[3, 3], pads=[1, 1, 1, 1], strides=[1, 1]](%45, %7, %8), scope: VGG/Sequential[features]/ReLU[8]
%47 : Float(1, 128, 112, 112) = Relu(%46), scope: VGG/Sequential[features]/ReLU[8]
%48 : Float(1, 128, 56, 56) = MaxPool[kernel_shape=[2, 2], pads=[0, 0, 0, 0], strides=[2, 2]](%47), scope: VGG/Sequential[features]/MaxPool2d[9]
- [debug hint?] The tracing result of ResNet can be fixed if I rewrite
self.relu as the functional form (F.relu())
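A sketch of that workaround. The `Block` module and its shapes are hypothetical; the point is that torchvision's ResNet reuses a single `nn.ReLU` module instance across several call sites, and replacing those calls with `F.relu` sidesteps the per-module scope attribution:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Block(nn.Module):
    """Hypothetical ResNet-style block illustrating the reported workaround."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 8, 3, padding=1)
        self.conv2 = nn.Conv2d(8, 8, 3, padding=1)
        # torchvision's ResNet reuses one nn.ReLU instance like this, which
        # is what appears to confuse the scope names in the trace.
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Workaround from the report: call the functional form instead of
        # the shared self.relu module, so no module scope is reused.
        x = F.relu(self.conv1(x))
        x = F.relu(self.conv2(x))
        return x

traced = torch.jit.trace(Block().eval(), torch.randn(1, 3, 16, 16))
print(traced.graph)
```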
http://35.197.26.245:6006/#graphs
Thanks