How to change AlexNet into FCNs?

 

An FCN is a network that contains only convolution layers and no fc layers at all. Its structure can be seen in the following figures:

This image is from the paper "Fully Convolutional Networks for Semantic Segmentation", CVPR 2015.

 

  As shown in the images above, it can localize the target object well, and it does not require resizing the input images to a fixed resolution, which is the biggest difference from traditional CNNs. First, let's review the related network parameters of AlexNet; the structure is shown below:

  As we can see from the figure above, input images must be resized to a fixed resolution, like 224*224 (227*227 in the Caffe AlexNet used below), due to the existence of the fc layers. The specific pipeline can be found in this blog: http://blog.csdn.net/sunbaigui/article/details/39938097

 

  The output of pool5 is 6*6*256, and we want to obtain a final result of 1*1*1000 (taking 1k classes as an example). How can we use intermediate conv6, conv7, and conv8 layers to bridge these two shapes? Do we need to add pooling layers? How should we set the parameters of each layer? And does it really work?

 

  Let's do it now. We simply add 3 convolution layers as an example. The formula that governs how width*height*channel changes (actually it is only about the width, since width == height, and the channel count is simply the num_output of each layer) is:

(W - F + 2P)/S + 1

where W denotes the width of the bottom (input) blob, F denotes the convolution kernel size, P denotes the padding (mainly used to keep the output at the same resolution as the input), and S denotes the stride.
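To sanity-check the shapes before touching the prototxt, here is a minimal Python sketch of this formula (conv_out is just a helper name for illustration; note that Caffe floors the division for convolution layers), applied to the three layers we are about to add:

def conv_out(w, f, s=1, p=0):
    # (W - F + 2P)/S + 1, with floor division as Caffe uses for convolution
    return (w - f + 2 * p) // s + 1

w = conv_out(6, f=2, s=2)  # conv6 on pool5 (6*6*256): (6-2)/2+1 = 3 -> 3*3*4096
w = conv_out(w, f=3, s=2)  # conv7: (3-3)/2+1 = 1 -> 1*1*4096
w = conv_out(w, f=1, s=1)  # conv8: (1-1)/1+1 = 1 -> 1*1*43
print(w)                   # 1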

 

  Thus, the following layers need to be added to the prototxt file:

    from: 6*6*256 ---> 3*3*4096 ---> 1*1*4096 ---> 1*1*43 (taking my experiment, which has 43 outputs, as an example)

####################################################################
## the output of Pool 5 is 6*6*256
####################################################################
layer {
  name: "conv6"
  type: "Convolution"
  bottom: "pool5"
  top: "conv6"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 2
    stride: 2
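    # expected output: (6 - 2)/2 + 1 = 3, i.e. 3*3*4096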
    # group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "conv6"
  top: "conv6"
}

layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 3
    stride: 2
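    # expected output: (3 - 3)/2 + 1 = 1, i.e. 1*1*4096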
    # group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "conv7"
  top: "conv7"
}

layer {
  name: "conv8"
  type: "Convolution"
  bottom: "conv7"
  top: "conv8"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 43
    kernel_size: 1
    stride: 1
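    # expected output: (1 - 1)/1 + 1 = 1, i.e. 1*1*43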
    # group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu8"
  type: "ReLU"
  bottom: "conv8"
  top: "conv8"
}

 

  Then, make your Caffe files and wait for something amazing to happen...
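  For reference, a run like the one logged below can be launched with the standard Caffe command line (the paths here are placeholders for your own solver and the pre-trained AlexNet weights; the log shows fine-tuning from bvlc_alexnet.caffemodel):

caffe train -solver ./solver.prototxt -weights ./bvlc_alexnet.caffemodel -gpu 0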

  Actually, at first I kept getting a wrong result, i.e. 2*2*43 ... It really confused me: was the formula wrong? That made no sense, because it worked fine at the beginning of the network. Finally, I found I had made a stupid mistake: I had connected conv6 to conv5 instead of pool5. So it is really important to be careful, and then even more careful.
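  The same formula explains that wrong 2*2*43 output. conv5 is 13*13*256 (see the log below), so feeding conv6 from conv5 instead of pool5 gives, reusing the conv_out sketch from above:

w = conv_out(13, f=2, s=2)  # conv6 on conv5 (13*13*256): floor((13-2)/2)+1 = 6
w = conv_out(w, f=3, s=2)   # conv7: floor((6-3)/2)+1 = 2
w = conv_out(w, f=1, s=1)   # conv8: 2 -> the confusing 2*2*43
print(w)                    # 2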

  OK, the whole pipeline is done, but since my Acer laptop only has a GTX 960M, it warned me it was out of memory. The output on the terminal is shown here:

I0423 09:52:24.421512  2763 caffe.cpp:189] Using GPUs 0
I0423 09:52:24.431041  2763 caffe.cpp:194] GPU 0: GeForce GTX 960M
I0423 09:52:24.565281  2763 solver.cpp:48] Initializing solver from parameters:
test_iter: 7600
test_interval: 2000
base_lr: 0.001
display: 12
max_iter: 450000
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 2000
snapshot: 2000
snapshot_prefix: "/media/wangxiao/Acer/caffe_models_/"
solver_mode: GPU
device_id: 0
net: "/home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt"
test_initialization: false
I0423 09:52:24.621829  2763 solver.cpp:91] Creating training net from net file: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt
I0423 09:52:24.622601  2763 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I0423 09:52:24.622632  2763 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I0423 09:52:24.622828  2763 net.cpp:49] Initializing net from parameters:
name: "AlexNet"
state {
  phase: TRAIN
}
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: false
  }
  image_data_param {
    source: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/train_data/newAdd_attribute_label.txt"
    batch_size: 12
    root_folder: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/train_data/227_227_images/"
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv6"
  type: "Convolution"
  bottom: "pool5"
  top: "conv6"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 2
    stride: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "conv6"
  top: "conv6"
}
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 3
    stride: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "conv7"
  top: "conv7"
}
layer {
  name: "conv8"
  type: "Convolution"
  bottom: "conv7"
  top: "conv8"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 43
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu8"
  type: "ReLU"
  bottom: "conv8"
  top: "conv8"
}
layer {
  name: "sigmoid"
  type: "Sigmoid"
  bottom: "conv8"
  top: "conv8"
}
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "conv8"
  bottom: "label"
  top: "loss"
}
I0423 09:52:24.622962  2763 layer_factory.hpp:77] Creating layer data
I0423 09:52:24.623002  2763 net.cpp:91] Creating Layer data
I0423 09:52:24.623009  2763 net.cpp:399] data -> data
I0423 09:52:24.623034  2763 net.cpp:399] data -> label
I0423 09:52:24.623051  2763 image_data_layer.cpp:40] Opening file /media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/train_data/newAdd_attribute_label.txt
I0423 09:52:25.535037  2763 image_data_layer.cpp:67] A total of 265972 images.
I0423 09:52:25.543112  2763 image_data_layer.cpp:94] output data size: 12,3,227,227
I0423 09:52:25.554397  2763 net.cpp:141] Setting up data
I0423 09:52:25.554425  2763 net.cpp:148] Top shape: 12 3 227 227 (1855044)
I0423 09:52:25.554431  2763 net.cpp:148] Top shape: 12 43 1 1 (516)
I0423 09:52:25.554435  2763 net.cpp:156] Memory required for data: 7422240
I0423 09:52:25.554451  2763 layer_factory.hpp:77] Creating layer conv1
I0423 09:52:25.554476  2763 net.cpp:91] Creating Layer conv1
I0423 09:52:25.554481  2763 net.cpp:425] conv1 <- data
I0423 09:52:25.554492  2763 net.cpp:399] conv1 -> conv1
I0423 09:52:25.556519  2763 net.cpp:141] Setting up conv1
I0423 09:52:25.556534  2763 net.cpp:148] Top shape: 12 96 55 55 (3484800)
I0423 09:52:25.556537  2763 net.cpp:156] Memory required for data: 21361440
I0423 09:52:25.556556  2763 layer_factory.hpp:77] Creating layer relu1
I0423 09:52:25.556565  2763 net.cpp:91] Creating Layer relu1
I0423 09:52:25.556568  2763 net.cpp:425] relu1 <- conv1
I0423 09:52:25.556573  2763 net.cpp:386] relu1 -> conv1 (in-place)
I0423 09:52:25.556583  2763 net.cpp:141] Setting up relu1
I0423 09:52:25.556587  2763 net.cpp:148] Top shape: 12 96 55 55 (3484800)
I0423 09:52:25.556591  2763 net.cpp:156] Memory required for data: 35300640
I0423 09:52:25.556594  2763 layer_factory.hpp:77] Creating layer norm1
I0423 09:52:25.556602  2763 net.cpp:91] Creating Layer norm1
I0423 09:52:25.556604  2763 net.cpp:425] norm1 <- conv1
I0423 09:52:25.556609  2763 net.cpp:399] norm1 -> norm1
I0423 09:52:25.556646  2763 net.cpp:141] Setting up norm1
I0423 09:52:25.556653  2763 net.cpp:148] Top shape: 12 96 55 55 (3484800)
I0423 09:52:25.556689  2763 net.cpp:156] Memory required for data: 49239840
I0423 09:52:25.556692  2763 layer_factory.hpp:77] Creating layer pool1
I0423 09:52:25.556700  2763 net.cpp:91] Creating Layer pool1
I0423 09:52:25.556704  2763 net.cpp:425] pool1 <- norm1
I0423 09:52:25.556710  2763 net.cpp:399] pool1 -> pool1
I0423 09:52:25.556749  2763 net.cpp:141] Setting up pool1
I0423 09:52:25.556766  2763 net.cpp:148] Top shape: 12 96 27 27 (839808)
I0423 09:52:25.556769  2763 net.cpp:156] Memory required for data: 52599072
I0423 09:52:25.556772  2763 layer_factory.hpp:77] Creating layer conv2
I0423 09:52:25.556792  2763 net.cpp:91] Creating Layer conv2
I0423 09:52:25.556795  2763 net.cpp:425] conv2 <- pool1
I0423 09:52:25.556802  2763 net.cpp:399] conv2 -> conv2
I0423 09:52:25.565610  2763 net.cpp:141] Setting up conv2
I0423 09:52:25.565634  2763 net.cpp:148] Top shape: 12 256 27 27 (2239488)
I0423 09:52:25.565637  2763 net.cpp:156] Memory required for data: 61557024
I0423 09:52:25.565651  2763 layer_factory.hpp:77] Creating layer relu2
I0423 09:52:25.565660  2763 net.cpp:91] Creating Layer relu2
I0423 09:52:25.565665  2763 net.cpp:425] relu2 <- conv2
I0423 09:52:25.565672  2763 net.cpp:386] relu2 -> conv2 (in-place)
I0423 09:52:25.565681  2763 net.cpp:141] Setting up relu2
I0423 09:52:25.565686  2763 net.cpp:148] Top shape: 12 256 27 27 (2239488)
I0423 09:52:25.565690  2763 net.cpp:156] Memory required for data: 70514976
I0423 09:52:25.565692  2763 layer_factory.hpp:77] Creating layer norm2
I0423 09:52:25.565699  2763 net.cpp:91] Creating Layer norm2
I0423 09:52:25.565702  2763 net.cpp:425] norm2 <- conv2
I0423 09:52:25.565708  2763 net.cpp:399] norm2 -> norm2
I0423 09:52:25.565742  2763 net.cpp:141] Setting up norm2
I0423 09:52:25.565747  2763 net.cpp:148] Top shape: 12 256 27 27 (2239488)
I0423 09:52:25.565750  2763 net.cpp:156] Memory required for data: 79472928
I0423 09:52:25.565753  2763 layer_factory.hpp:77] Creating layer pool2
I0423 09:52:25.565762  2763 net.cpp:91] Creating Layer pool2
I0423 09:52:25.565764  2763 net.cpp:425] pool2 <- norm2
I0423 09:52:25.565769  2763 net.cpp:399] pool2 -> pool2
I0423 09:52:25.565798  2763 net.cpp:141] Setting up pool2
I0423 09:52:25.565804  2763 net.cpp:148] Top shape: 12 256 13 13 (519168)
I0423 09:52:25.565809  2763 net.cpp:156] Memory required for data: 81549600
I0423 09:52:25.565811  2763 layer_factory.hpp:77] Creating layer conv3
I0423 09:52:25.565821  2763 net.cpp:91] Creating Layer conv3
I0423 09:52:25.565824  2763 net.cpp:425] conv3 <- pool2
I0423 09:52:25.565831  2763 net.cpp:399] conv3 -> conv3
I0423 09:52:25.590066  2763 net.cpp:141] Setting up conv3
I0423 09:52:25.590090  2763 net.cpp:148] Top shape: 12 384 13 13 (778752)
I0423 09:52:25.590092  2763 net.cpp:156] Memory required for data: 84664608
I0423 09:52:25.590116  2763 layer_factory.hpp:77] Creating layer relu3
I0423 09:52:25.590126  2763 net.cpp:91] Creating Layer relu3
I0423 09:52:25.590131  2763 net.cpp:425] relu3 <- conv3
I0423 09:52:25.590137  2763 net.cpp:386] relu3 -> conv3 (in-place)
I0423 09:52:25.590145  2763 net.cpp:141] Setting up relu3
I0423 09:52:25.590149  2763 net.cpp:148] Top shape: 12 384 13 13 (778752)
I0423 09:52:25.590152  2763 net.cpp:156] Memory required for data: 87779616
I0423 09:52:25.590155  2763 layer_factory.hpp:77] Creating layer conv4
I0423 09:52:25.590167  2763 net.cpp:91] Creating Layer conv4
I0423 09:52:25.590169  2763 net.cpp:425] conv4 <- conv3
I0423 09:52:25.590176  2763 net.cpp:399] conv4 -> conv4
I0423 09:52:25.608953  2763 net.cpp:141] Setting up conv4
I0423 09:52:25.608975  2763 net.cpp:148] Top shape: 12 384 13 13 (778752)
I0423 09:52:25.608979  2763 net.cpp:156] Memory required for data: 90894624
I0423 09:52:25.608989  2763 layer_factory.hpp:77] Creating layer relu4
I0423 09:52:25.609007  2763 net.cpp:91] Creating Layer relu4
I0423 09:52:25.609011  2763 net.cpp:425] relu4 <- conv4
I0423 09:52:25.609019  2763 net.cpp:386] relu4 -> conv4 (in-place)
I0423 09:52:25.609027  2763 net.cpp:141] Setting up relu4
I0423 09:52:25.609031  2763 net.cpp:148] Top shape: 12 384 13 13 (778752)
I0423 09:52:25.609047  2763 net.cpp:156] Memory required for data: 94009632
I0423 09:52:25.609050  2763 layer_factory.hpp:77] Creating layer conv5
I0423 09:52:25.609061  2763 net.cpp:91] Creating Layer conv5
I0423 09:52:25.609066  2763 net.cpp:425] conv5 <- conv4
I0423 09:52:25.609071  2763 net.cpp:399] conv5 -> conv5
I0423 09:52:25.621208  2763 net.cpp:141] Setting up conv5
I0423 09:52:25.621229  2763 net.cpp:148] Top shape: 12 256 13 13 (519168)
I0423 09:52:25.621233  2763 net.cpp:156] Memory required for data: 96086304
I0423 09:52:25.621258  2763 layer_factory.hpp:77] Creating layer relu5
I0423 09:52:25.621268  2763 net.cpp:91] Creating Layer relu5
I0423 09:52:25.621273  2763 net.cpp:425] relu5 <- conv5
I0423 09:52:25.621279  2763 net.cpp:386] relu5 -> conv5 (in-place)
I0423 09:52:25.621286  2763 net.cpp:141] Setting up relu5
I0423 09:52:25.621290  2763 net.cpp:148] Top shape: 12 256 13 13 (519168)
I0423 09:52:25.621294  2763 net.cpp:156] Memory required for data: 98162976
I0423 09:52:25.621297  2763 layer_factory.hpp:77] Creating layer pool5
I0423 09:52:25.621304  2763 net.cpp:91] Creating Layer pool5
I0423 09:52:25.621306  2763 net.cpp:425] pool5 <- conv5
I0423 09:52:25.621314  2763 net.cpp:399] pool5 -> pool5
I0423 09:52:25.621347  2763 net.cpp:141] Setting up pool5
I0423 09:52:25.621354  2763 net.cpp:148] Top shape: 12 256 6 6 (110592)
I0423 09:52:25.621357  2763 net.cpp:156] Memory required for data: 98605344
I0423 09:52:25.621361  2763 layer_factory.hpp:77] Creating layer conv6
I0423 09:52:25.621373  2763 net.cpp:91] Creating Layer conv6
I0423 09:52:25.621377  2763 net.cpp:425] conv6 <- pool5
I0423 09:52:25.621384  2763 net.cpp:399] conv6 -> conv6
I0423 09:52:25.731640  2763 net.cpp:141] Setting up conv6
I0423 09:52:25.731675  2763 net.cpp:148] Top shape: 12 4096 3 3 (442368)
I0423 09:52:25.731679  2763 net.cpp:156] Memory required for data: 100374816
I0423 09:52:25.731688  2763 layer_factory.hpp:77] Creating layer relu6
I0423 09:52:25.731709  2763 net.cpp:91] Creating Layer relu6
I0423 09:52:25.731714  2763 net.cpp:425] relu6 <- conv6
I0423 09:52:25.731721  2763 net.cpp:386] relu6 -> conv6 (in-place)
I0423 09:52:25.731731  2763 net.cpp:141] Setting up relu6
I0423 09:52:25.731735  2763 net.cpp:148] Top shape: 12 4096 3 3 (442368)
I0423 09:52:25.731739  2763 net.cpp:156] Memory required for data: 102144288
I0423 09:52:25.731741  2763 layer_factory.hpp:77] Creating layer conv7
I0423 09:52:25.731752  2763 net.cpp:91] Creating Layer conv7
I0423 09:52:25.731757  2763 net.cpp:425] conv7 <- conv6
I0423 09:52:25.731765  2763 net.cpp:399] conv7 -> conv7
I0423 09:52:29.661667  2763 net.cpp:141] Setting up conv7
I0423 09:52:29.661705  2763 net.cpp:148] Top shape: 12 4096 1 1 (49152)
I0423 09:52:29.661710  2763 net.cpp:156] Memory required for data: 102340896
I0423 09:52:29.661720  2763 layer_factory.hpp:77] Creating layer relu7
I0423 09:52:29.661741  2763 net.cpp:91] Creating Layer relu7
I0423 09:52:29.661746  2763 net.cpp:425] relu7 <- conv7
I0423 09:52:29.661752  2763 net.cpp:386] relu7 -> conv7 (in-place)
I0423 09:52:29.661761  2763 net.cpp:141] Setting up relu7
I0423 09:52:29.661767  2763 net.cpp:148] Top shape: 12 4096 1 1 (49152)
I0423 09:52:29.661769  2763 net.cpp:156] Memory required for data: 102537504
I0423 09:52:29.661772  2763 layer_factory.hpp:77] Creating layer conv8
I0423 09:52:29.661783  2763 net.cpp:91] Creating Layer conv8
I0423 09:52:29.661788  2763 net.cpp:425] conv8 <- conv7
I0423 09:52:29.661795  2763 net.cpp:399] conv8 -> conv8
I0423 09:52:29.666793  2763 net.cpp:141] Setting up conv8
I0423 09:52:29.666815  2763 net.cpp:148] Top shape: 12 43 1 1 (516)
I0423 09:52:29.666818  2763 net.cpp:156] Memory required for data: 102539568
I0423 09:52:29.666826  2763 layer_factory.hpp:77] Creating layer relu8
I0423 09:52:29.666841  2763 net.cpp:91] Creating Layer relu8
I0423 09:52:29.666844  2763 net.cpp:425] relu8 <- conv8
I0423 09:52:29.666849  2763 net.cpp:386] relu8 -> conv8 (in-place)
I0423 09:52:29.666856  2763 net.cpp:141] Setting up relu8
I0423 09:52:29.666860  2763 net.cpp:148] Top shape: 12 43 1 1 (516)
I0423 09:52:29.666877  2763 net.cpp:156] Memory required for data: 102541632
I0423 09:52:29.666882  2763 layer_factory.hpp:77] Creating layer sigmoid
I0423 09:52:29.666888  2763 net.cpp:91] Creating Layer sigmoid
I0423 09:52:29.666892  2763 net.cpp:425] sigmoid <- conv8
I0423 09:52:29.666895  2763 net.cpp:386] sigmoid -> conv8 (in-place)
I0423 09:52:29.666901  2763 net.cpp:141] Setting up sigmoid
I0423 09:52:29.666905  2763 net.cpp:148] Top shape: 12 43 1 1 (516)
I0423 09:52:29.666908  2763 net.cpp:156] Memory required for data: 102543696
I0423 09:52:29.666911  2763 layer_factory.hpp:77] Creating layer loss
I0423 09:52:29.666918  2763 net.cpp:91] Creating Layer loss
I0423 09:52:29.666920  2763 net.cpp:425] loss <- conv8
I0423 09:52:29.666924  2763 net.cpp:425] loss <- label
I0423 09:52:29.666931  2763 net.cpp:399] loss -> loss
I0423 09:52:29.666975  2763 net.cpp:141] Setting up loss
I0423 09:52:29.666990  2763 net.cpp:148] Top shape: (1)
I0423 09:52:29.666992  2763 net.cpp:151]     with loss weight 1
I0423 09:52:29.667017  2763 net.cpp:156] Memory required for data: 102543700
I0423 09:52:29.667031  2763 net.cpp:217] loss needs backward computation.
I0423 09:52:29.667034  2763 net.cpp:217] sigmoid needs backward computation.
I0423 09:52:29.667038  2763 net.cpp:217] relu8 needs backward computation.
I0423 09:52:29.667040  2763 net.cpp:217] conv8 needs backward computation.
I0423 09:52:29.667043  2763 net.cpp:217] relu7 needs backward computation.
I0423 09:52:29.667047  2763 net.cpp:217] conv7 needs backward computation.
I0423 09:52:29.667050  2763 net.cpp:217] relu6 needs backward computation.
I0423 09:52:29.667053  2763 net.cpp:217] conv6 needs backward computation.
I0423 09:52:29.667057  2763 net.cpp:217] pool5 needs backward computation.
I0423 09:52:29.667060  2763 net.cpp:217] relu5 needs backward computation.
I0423 09:52:29.667063  2763 net.cpp:217] conv5 needs backward computation.
I0423 09:52:29.667068  2763 net.cpp:217] relu4 needs backward computation.
I0423 09:52:29.667070  2763 net.cpp:217] conv4 needs backward computation.
I0423 09:52:29.667073  2763 net.cpp:217] relu3 needs backward computation.
I0423 09:52:29.667076  2763 net.cpp:217] conv3 needs backward computation.
I0423 09:52:29.667080  2763 net.cpp:217] pool2 needs backward computation.
I0423 09:52:29.667084  2763 net.cpp:217] norm2 needs backward computation.
I0423 09:52:29.667088  2763 net.cpp:217] relu2 needs backward computation.
I0423 09:52:29.667091  2763 net.cpp:217] conv2 needs backward computation.
I0423 09:52:29.667094  2763 net.cpp:217] pool1 needs backward computation.
I0423 09:52:29.667098  2763 net.cpp:217] norm1 needs backward computation.
I0423 09:52:29.667101  2763 net.cpp:217] relu1 needs backward computation.
I0423 09:52:29.667104  2763 net.cpp:217] conv1 needs backward computation.
I0423 09:52:29.667109  2763 net.cpp:219] data does not need backward computation.
I0423 09:52:29.667111  2763 net.cpp:261] This network produces output loss
I0423 09:52:29.667127  2763 net.cpp:274] Network initialization done.
I0423 09:52:29.667804  2763 solver.cpp:181] Creating test net (#0) specified by net file: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt
I0423 09:52:29.667937  2763 net.cpp:313] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
I0423 09:52:29.668148  2763 net.cpp:49] Initializing net from parameters:
name: "AlexNet"
state {
  phase: TEST
}
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mirror: false
  }
  image_data_param {
    source: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/test_data/newAdd_attribute_label_test.txt"
    batch_size: 1
    root_folder: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/test_data/227_227_test_images/"
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv6"
  type: "Convolution"
  bottom: "pool5"
  top: "conv6"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 2
    stride: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "conv6"
  top: "conv6"
}
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 3
    stride: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "conv7"
  top: "conv7"
}
layer {
  name: "conv8"
  type: "Convolution"
  bottom: "conv7"
  top: "conv8"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 43
    kernel_size: 1
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu8"
  type: "ReLU"
  bottom: "conv8"
  top: "conv8"
}
layer {
  name: "sigmoid"
  type: "Sigmoid"
  bottom: "conv8"
  top: "conv8"
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "conv8"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "conv8"
  bottom: "label"
  top: "loss"
}
I0423 09:52:29.668323  2763 layer_factory.hpp:77] Creating layer data
I0423 09:52:29.668349  2763 net.cpp:91] Creating Layer data
I0423 09:52:29.668355  2763 net.cpp:399] data -> data
I0423 09:52:29.668373  2763 net.cpp:399] data -> label
I0423 09:52:29.668382  2763 image_data_layer.cpp:40] Opening file /media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/test_data/newAdd_attribute_label_test.txt
I0423 09:52:29.696005  2763 image_data_layer.cpp:67] A total of 7600 images.
I0423 09:52:29.697830  2763 image_data_layer.cpp:94] output data size: 1,3,227,227
I0423 09:52:29.699980  2763 net.cpp:141] Setting up data
I0423 09:52:29.700013  2763 net.cpp:148] Top shape: 1 3 227 227 (154587)
I0423 09:52:29.700019  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:29.700022  2763 net.cpp:156] Memory required for data: 618520
I0423 09:52:29.700028  2763 layer_factory.hpp:77] Creating layer label_data_1_split
I0423 09:52:29.700040  2763 net.cpp:91] Creating Layer label_data_1_split
I0423 09:52:29.700048  2763 net.cpp:425] label_data_1_split <- label
I0423 09:52:29.700060  2763 net.cpp:399] label_data_1_split -> label_data_1_split_0
I0423 09:52:29.700075  2763 net.cpp:399] label_data_1_split -> label_data_1_split_1
I0423 09:52:29.700141  2763 net.cpp:141] Setting up label_data_1_split
I0423 09:52:29.700151  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:29.700160  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:29.700176  2763 net.cpp:156] Memory required for data: 618864
I0423 09:52:29.700181  2763 layer_factory.hpp:77] Creating layer conv1
I0423 09:52:29.700196  2763 net.cpp:91] Creating Layer conv1
I0423 09:52:29.700199  2763 net.cpp:425] conv1 <- data
I0423 09:52:29.700206  2763 net.cpp:399] conv1 -> conv1
I0423 09:52:29.701347  2763 net.cpp:141] Setting up conv1
I0423 09:52:29.701369  2763 net.cpp:148] Top shape: 1 96 55 55 (290400)
I0423 09:52:29.701372  2763 net.cpp:156] Memory required for data: 1780464
I0423 09:52:29.701383  2763 layer_factory.hpp:77] Creating layer relu1
I0423 09:52:29.701390  2763 net.cpp:91] Creating Layer relu1
I0423 09:52:29.701395  2763 net.cpp:425] relu1 <- conv1
I0423 09:52:29.701400  2763 net.cpp:386] relu1 -> conv1 (in-place)
I0423 09:52:29.701406  2763 net.cpp:141] Setting up relu1
I0423 09:52:29.701412  2763 net.cpp:148] Top shape: 1 96 55 55 (290400)
I0423 09:52:29.701416  2763 net.cpp:156] Memory required for data: 2942064
I0423 09:52:29.701418  2763 layer_factory.hpp:77] Creating layer norm1
I0423 09:52:29.701426  2763 net.cpp:91] Creating Layer norm1
I0423 09:52:29.701429  2763 net.cpp:425] norm1 <- conv1
I0423 09:52:29.701434  2763 net.cpp:399] norm1 -> norm1
I0423 09:52:29.701464  2763 net.cpp:141] Setting up norm1
I0423 09:52:29.701479  2763 net.cpp:148] Top shape: 1 96 55 55 (290400)
I0423 09:52:29.701483  2763 net.cpp:156] Memory required for data: 4103664
I0423 09:52:29.701486  2763 layer_factory.hpp:77] Creating layer pool1
I0423 09:52:29.701503  2763 net.cpp:91] Creating Layer pool1
I0423 09:52:29.701505  2763 net.cpp:425] pool1 <- norm1
I0423 09:52:29.701510  2763 net.cpp:399] pool1 -> pool1
I0423 09:52:29.701537  2763 net.cpp:141] Setting up pool1
I0423 09:52:29.701544  2763 net.cpp:148] Top shape: 1 96 27 27 (69984)
I0423 09:52:29.701545  2763 net.cpp:156] Memory required for data: 4383600
I0423 09:52:29.701550  2763 layer_factory.hpp:77] Creating layer conv2
I0423 09:52:29.701557  2763 net.cpp:91] Creating Layer conv2
I0423 09:52:29.701561  2763 net.cpp:425] conv2 <- pool1
I0423 09:52:29.701566  2763 net.cpp:399] conv2 -> conv2
I0423 09:52:29.709951  2763 net.cpp:141] Setting up conv2
I0423 09:52:29.709987  2763 net.cpp:148] Top shape: 1 256 27 27 (186624)
I0423 09:52:29.709992  2763 net.cpp:156] Memory required for data: 5130096
I0423 09:52:29.710005  2763 layer_factory.hpp:77] Creating layer relu2
I0423 09:52:29.710014  2763 net.cpp:91] Creating Layer relu2
I0423 09:52:29.710018  2763 net.cpp:425] relu2 <- conv2
I0423 09:52:29.710026  2763 net.cpp:386] relu2 -> conv2 (in-place)
I0423 09:52:29.710033  2763 net.cpp:141] Setting up relu2
I0423 09:52:29.710039  2763 net.cpp:148] Top shape: 1 256 27 27 (186624)
I0423 09:52:29.710042  2763 net.cpp:156] Memory required for data: 5876592
I0423 09:52:29.710046  2763 layer_factory.hpp:77] Creating layer norm2
I0423 09:52:29.710057  2763 net.cpp:91] Creating Layer norm2
I0423 09:52:29.710060  2763 net.cpp:425] norm2 <- conv2
I0423 09:52:29.710067  2763 net.cpp:399] norm2 -> norm2
I0423 09:52:29.710100  2763 net.cpp:141] Setting up norm2
I0423 09:52:29.710108  2763 net.cpp:148] Top shape: 1 256 27 27 (186624)
I0423 09:52:29.710110  2763 net.cpp:156] Memory required for data: 6623088
I0423 09:52:29.710114  2763 layer_factory.hpp:77] Creating layer pool2
I0423 09:52:29.710120  2763 net.cpp:91] Creating Layer pool2
I0423 09:52:29.710124  2763 net.cpp:425] pool2 <- norm2
I0423 09:52:29.710129  2763 net.cpp:399] pool2 -> pool2
I0423 09:52:29.710155  2763 net.cpp:141] Setting up pool2
I0423 09:52:29.710171  2763 net.cpp:148] Top shape: 1 256 13 13 (43264)
I0423 09:52:29.710175  2763 net.cpp:156] Memory required for data: 6796144
I0423 09:52:29.710187  2763 layer_factory.hpp:77] Creating layer conv3
I0423 09:52:29.710197  2763 net.cpp:91] Creating Layer conv3
I0423 09:52:29.710201  2763 net.cpp:425] conv3 <- pool2
I0423 09:52:29.710207  2763 net.cpp:399] conv3 -> conv3
I0423 09:52:29.733366  2763 net.cpp:141] Setting up conv3
I0423 09:52:29.733403  2763 net.cpp:148] Top shape: 1 384 13 13 (64896)
I0423 09:52:29.733407  2763 net.cpp:156] Memory required for data: 7055728
I0423 09:52:29.733420  2763 layer_factory.hpp:77] Creating layer relu3
I0423 09:52:29.733439  2763 net.cpp:91] Creating Layer relu3
I0423 09:52:29.733444  2763 net.cpp:425] relu3 <- conv3
I0423 09:52:29.733453  2763 net.cpp:386] relu3 -> conv3 (in-place)
I0423 09:52:29.733461  2763 net.cpp:141] Setting up relu3
I0423 09:52:29.733466  2763 net.cpp:148] Top shape: 1 384 13 13 (64896)
I0423 09:52:29.733469  2763 net.cpp:156] Memory required for data: 7315312
I0423 09:52:29.733472  2763 layer_factory.hpp:77] Creating layer conv4
I0423 09:52:29.733484  2763 net.cpp:91] Creating Layer conv4
I0423 09:52:29.733489  2763 net.cpp:425] conv4 <- conv3
I0423 09:52:29.733494  2763 net.cpp:399] conv4 -> conv4
I0423 09:52:29.750310  2763 net.cpp:141] Setting up conv4
I0423 09:52:29.750344  2763 net.cpp:148] Top shape: 1 384 13 13 (64896)
I0423 09:52:29.750349  2763 net.cpp:156] Memory required for data: 7574896
I0423 09:52:29.750357  2763 layer_factory.hpp:77] Creating layer relu4
I0423 09:52:29.750366  2763 net.cpp:91] Creating Layer relu4
I0423 09:52:29.750370  2763 net.cpp:425] relu4 <- conv4
I0423 09:52:29.750376  2763 net.cpp:386] relu4 -> conv4 (in-place)
I0423 09:52:29.750393  2763 net.cpp:141] Setting up relu4
I0423 09:52:29.750397  2763 net.cpp:148] Top shape: 1 384 13 13 (64896)
I0423 09:52:29.750401  2763 net.cpp:156] Memory required for data: 7834480
I0423 09:52:29.750403  2763 layer_factory.hpp:77] Creating layer conv5
I0423 09:52:29.750414  2763 net.cpp:91] Creating Layer conv5
I0423 09:52:29.750418  2763 net.cpp:425] conv5 <- conv4
I0423 09:52:29.750423  2763 net.cpp:399] conv5 -> conv5
I0423 09:52:29.762544  2763 net.cpp:141] Setting up conv5
I0423 09:52:29.762580  2763 net.cpp:148] Top shape: 1 256 13 13 (43264)
I0423 09:52:29.762584  2763 net.cpp:156] Memory required for data: 8007536
I0423 09:52:29.762598  2763 layer_factory.hpp:77] Creating layer relu5
I0423 09:52:29.762609  2763 net.cpp:91] Creating Layer relu5
I0423 09:52:29.762614  2763 net.cpp:425] relu5 <- conv5
I0423 09:52:29.762619  2763 net.cpp:386] relu5 -> conv5 (in-place)
I0423 09:52:29.762629  2763 net.cpp:141] Setting up relu5
I0423 09:52:29.762646  2763 net.cpp:148] Top shape: 1 256 13 13 (43264)
I0423 09:52:29.762650  2763 net.cpp:156] Memory required for data: 8180592
I0423 09:52:29.762653  2763 layer_factory.hpp:77] Creating layer pool5
I0423 09:52:29.762662  2763 net.cpp:91] Creating Layer pool5
I0423 09:52:29.762665  2763 net.cpp:425] pool5 <- conv5
I0423 09:52:29.762671  2763 net.cpp:399] pool5 -> pool5
I0423 09:52:29.762707  2763 net.cpp:141] Setting up pool5
I0423 09:52:29.762724  2763 net.cpp:148] Top shape: 1 256 6 6 (9216)
I0423 09:52:29.762727  2763 net.cpp:156] Memory required for data: 8217456
I0423 09:52:29.762740  2763 layer_factory.hpp:77] Creating layer conv6
I0423 09:52:29.762753  2763 net.cpp:91] Creating Layer conv6
I0423 09:52:29.762755  2763 net.cpp:425] conv6 <- pool5
I0423 09:52:29.762761  2763 net.cpp:399] conv6 -> conv6
I0423 09:52:29.868270  2763 net.cpp:141] Setting up conv6
I0423 09:52:29.868306  2763 net.cpp:148] Top shape: 1 4096 3 3 (36864)
I0423 09:52:29.868311  2763 net.cpp:156] Memory required for data: 8364912
I0423 09:52:29.868320  2763 layer_factory.hpp:77] Creating layer relu6
I0423 09:52:29.868330  2763 net.cpp:91] Creating Layer relu6
I0423 09:52:29.868335  2763 net.cpp:425] relu6 <- conv6
I0423 09:52:29.868342  2763 net.cpp:386] relu6 -> conv6 (in-place)
I0423 09:52:29.868350  2763 net.cpp:141] Setting up relu6
I0423 09:52:29.868355  2763 net.cpp:148] Top shape: 1 4096 3 3 (36864)
I0423 09:52:29.868358  2763 net.cpp:156] Memory required for data: 8512368
I0423 09:52:29.868361  2763 layer_factory.hpp:77] Creating layer conv7
I0423 09:52:29.868372  2763 net.cpp:91] Creating Layer conv7
I0423 09:52:29.868376  2763 net.cpp:425] conv7 <- conv6
I0423 09:52:29.868381  2763 net.cpp:399] conv7 -> conv7
I0423 09:52:33.773138  2763 net.cpp:141] Setting up conv7
I0423 09:52:33.773177  2763 net.cpp:148] Top shape: 1 4096 1 1 (4096)
I0423 09:52:33.773182  2763 net.cpp:156] Memory required for data: 8528752
I0423 09:52:33.773192  2763 layer_factory.hpp:77] Creating layer relu7
I0423 09:52:33.773203  2763 net.cpp:91] Creating Layer relu7
I0423 09:52:33.773219  2763 net.cpp:425] relu7 <- conv7
I0423 09:52:33.773232  2763 net.cpp:386] relu7 -> conv7 (in-place)
I0423 09:52:33.773247  2763 net.cpp:141] Setting up relu7
I0423 09:52:33.773257  2763 net.cpp:148] Top shape: 1 4096 1 1 (4096)
I0423 09:52:33.773265  2763 net.cpp:156] Memory required for data: 8545136
I0423 09:52:33.773269  2763 layer_factory.hpp:77] Creating layer conv8
I0423 09:52:33.773283  2763 net.cpp:91] Creating Layer conv8
I0423 09:52:33.773286  2763 net.cpp:425] conv8 <- conv7
I0423 09:52:33.773293  2763 net.cpp:399] conv8 -> conv8
I0423 09:52:33.778169  2763 net.cpp:141] Setting up conv8
I0423 09:52:33.778193  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:33.778198  2763 net.cpp:156] Memory required for data: 8545308
I0423 09:52:33.778203  2763 layer_factory.hpp:77] Creating layer relu8
I0423 09:52:33.778221  2763 net.cpp:91] Creating Layer relu8
I0423 09:52:33.778226  2763 net.cpp:425] relu8 <- conv8
I0423 09:52:33.778233  2763 net.cpp:386] relu8 -> conv8 (in-place)
I0423 09:52:33.778239  2763 net.cpp:141] Setting up relu8
I0423 09:52:33.778244  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:33.778246  2763 net.cpp:156] Memory required for data: 8545480
I0423 09:52:33.778249  2763 layer_factory.hpp:77] Creating layer sigmoid
I0423 09:52:33.778255  2763 net.cpp:91] Creating Layer sigmoid
I0423 09:52:33.778260  2763 net.cpp:425] sigmoid <- conv8
I0423 09:52:33.778265  2763 net.cpp:386] sigmoid -> conv8 (in-place)
I0423 09:52:33.778270  2763 net.cpp:141] Setting up sigmoid
I0423 09:52:33.778275  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:33.778277  2763 net.cpp:156] Memory required for data: 8545652
I0423 09:52:33.778295  2763 layer_factory.hpp:77] Creating layer conv8_sigmoid_0_split
I0423 09:52:33.778301  2763 net.cpp:91] Creating Layer conv8_sigmoid_0_split
I0423 09:52:33.778303  2763 net.cpp:425] conv8_sigmoid_0_split <- conv8
I0423 09:52:33.778318  2763 net.cpp:399] conv8_sigmoid_0_split -> conv8_sigmoid_0_split_0
I0423 09:52:33.778339  2763 net.cpp:399] conv8_sigmoid_0_split -> conv8_sigmoid_0_split_1
I0423 09:52:33.778373  2763 net.cpp:141] Setting up conv8_sigmoid_0_split
I0423 09:52:33.778389  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:33.778393  2763 net.cpp:148] Top shape: 1 43 1 1 (43)
I0423 09:52:33.778408  2763 net.cpp:156] Memory required for data: 8545996
I0423 09:52:33.778411  2763 layer_factory.hpp:77] Creating layer accuracy
I0423 09:52:33.778419  2763 net.cpp:91] Creating Layer accuracy
I0423 09:52:33.778422  2763 net.cpp:425] accuracy <- conv8_sigmoid_0_split_0
I0423 09:52:33.778426  2763 net.cpp:425] accuracy <- label_data_1_split_0
I0423 09:52:33.778432  2763 net.cpp:399] accuracy -> accuracy
I0423 09:52:33.778439  2763 net.cpp:141] Setting up accuracy
I0423 09:52:33.778446  2763 net.cpp:148] Top shape: (1)
I0423 09:52:33.778452  2763 net.cpp:156] Memory required for data: 8546000
I0423 09:52:33.778457  2763 layer_factory.hpp:77] Creating layer loss
I0423 09:52:33.778477  2763 net.cpp:91] Creating Layer loss
I0423 09:52:33.778496  2763 net.cpp:425] loss <- conv8_sigmoid_0_split_1
I0423 09:52:33.778503  2763 net.cpp:425] loss <- label_data_1_split_1
I0423 09:52:33.778513  2763 net.cpp:399] loss -> loss
I0423 09:52:33.778563  2763 net.cpp:141] Setting up loss
I0423 09:52:33.778573  2763 net.cpp:148] Top shape: (1)
I0423 09:52:33.778578  2763 net.cpp:151]     with loss weight 1
I0423 09:52:33.778602  2763 net.cpp:156] Memory required for data: 8546004
I0423 09:52:33.778609  2763 net.cpp:217] loss needs backward computation.
I0423 09:52:33.778616  2763 net.cpp:219] accuracy does not need backward computation.
I0423 09:52:33.778621  2763 net.cpp:217] conv8_sigmoid_0_split needs backward computation.
I0423 09:52:33.778625  2763 net.cpp:217] sigmoid needs backward computation.
I0423 09:52:33.778627  2763 net.cpp:217] relu8 needs backward computation.
I0423 09:52:33.778630  2763 net.cpp:217] conv8 needs backward computation.
I0423 09:52:33.778633  2763 net.cpp:217] relu7 needs backward computation.
I0423 09:52:33.778636  2763 net.cpp:217] conv7 needs backward computation.
I0423 09:52:33.778640  2763 net.cpp:217] relu6 needs backward computation.
I0423 09:52:33.778642  2763 net.cpp:217] conv6 needs backward computation.
I0423 09:52:33.778646  2763 net.cpp:217] pool5 needs backward computation.
I0423 09:52:33.778651  2763 net.cpp:217] relu5 needs backward computation.
I0423 09:52:33.778655  2763 net.cpp:217] conv5 needs backward computation.
I0423 09:52:33.778657  2763 net.cpp:217] relu4 needs backward computation.
I0423 09:52:33.778661  2763 net.cpp:217] conv4 needs backward computation.
I0423 09:52:33.778664  2763 net.cpp:217] relu3 needs backward computation.
I0423 09:52:33.778666  2763 net.cpp:217] conv3 needs backward computation.
I0423 09:52:33.778671  2763 net.cpp:217] pool2 needs backward computation.
I0423 09:52:33.778673  2763 net.cpp:217] norm2 needs backward computation.
I0423 09:52:33.778677  2763 net.cpp:217] relu2 needs backward computation.
I0423 09:52:33.778681  2763 net.cpp:217] conv2 needs backward computation.
I0423 09:52:33.778684  2763 net.cpp:217] pool1 needs backward computation.
I0423 09:52:33.778687  2763 net.cpp:217] norm1 needs backward computation.
I0423 09:52:33.778692  2763 net.cpp:217] relu1 needs backward computation.
I0423 09:52:33.778694  2763 net.cpp:217] conv1 needs backward computation.
I0423 09:52:33.778698  2763 net.cpp:219] label_data_1_split does not need backward computation.
I0423 09:52:33.778702  2763 net.cpp:219] data does not need backward computation.
I0423 09:52:33.778705  2763 net.cpp:261] This network produces output accuracy
I0423 09:52:33.778709  2763 net.cpp:261] This network produces output loss
I0423 09:52:33.778728  2763 net.cpp:274] Network initialization done.
I0423 09:52:33.778976  2763 solver.cpp:60] Solver scaffolding done.
I0423 09:52:33.779458  2763 caffe.cpp:133] Finetuning from /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 09:52:34.067591  2763 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 09:52:34.067654  2763 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W0423 09:52:34.067659  2763 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0423 09:52:34.067752  2763 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 09:52:34.193063  2763 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I0423 09:52:34.196166  2763 net.cpp:753] Ignoring source layer fc6
I0423 09:52:34.196195  2763 net.cpp:753] Ignoring source layer drop6
I0423 09:52:34.196199  2763 net.cpp:753] Ignoring source layer fc7
I0423 09:52:34.196203  2763 net.cpp:753] Ignoring source layer drop7
I0423 09:52:34.196207  2763 net.cpp:753] Ignoring source layer fc8
I0423 09:52:34.491250  2763 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 09:52:34.491279  2763 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W0423 09:52:34.491284  2763 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0423 09:52:34.491298  2763 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 09:52:34.615309  2763 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I0423 09:52:34.617781  2763 net.cpp:753] Ignoring source layer fc6
I0423 09:52:34.617805  2763 net.cpp:753] Ignoring source layer drop6
I0423 09:52:34.617808  2763 net.cpp:753] Ignoring source layer fc7
I0423 09:52:34.617812  2763 net.cpp:753] Ignoring source layer drop7
I0423 09:52:34.617815  2763 net.cpp:753] Ignoring source layer fc8
I0423 09:52:34.619755  2763 caffe.cpp:223] Starting Optimization
I0423 09:52:34.619771  2763 solver.cpp:279] Solving AlexNet
I0423 09:52:34.619776  2763 solver.cpp:280] Learning Rate Policy: step
I0423 09:52:35.070583  2763 solver.cpp:228] Iteration 0, loss = 7.51117
I0423 09:52:35.070628  2763 sgd_solver.cpp:106] Iteration 0, lr = 0.001
F0423 09:52:35.071538  2763 syncedmem.cpp:56] Check failed: error == cudaSuccess (2 vs. 0)  out of memory
*** Check failure stack trace: ***
    @     0x7f3d97747daa  (unknown)
    @     0x7f3d97747ce4  (unknown)
    @     0x7f3d977476e6  (unknown)
    @     0x7f3d9774a687  (unknown)
    @     0x7f3d97e0fbd1  caffe::SyncedMemory::to_gpu()
    @     0x7f3d97e0ef39  caffe::SyncedMemory::mutable_gpu_data()
    @     0x7f3d97e76c02  caffe::Blob<>::mutable_gpu_data()
    @     0x7f3d97e8857c  caffe::SGDSolver<>::ComputeUpdateValue()
    @     0x7f3d97e88f73  caffe::SGDSolver<>::ApplyUpdate()
    @     0x7f3d97e2827c  caffe::Solver<>::Step()
    @     0x7f3d97e288c9  caffe::Solver<>::Solve()
    @           0x408abe  train()
    @           0x405f8c  main
    @     0x7f3d96a55ec5  (unknown)
    @           0x4066c1  (unknown)
    @              (nil)  (unknown)


 

 

  Later, we will concentrate on how to localize the target object and visualize the features from each convolution layer.

  Waiting and Continuing ...

 

 

 

  All right, the terminal showed me this. Oh, my god ... Wrong! Wrong! Wrong!!!

  The loss = nan ... how is that even possible???

 

 

The culprit was the learning rate: base_lr = 0.001 was too large. After changing it to base_lr = 0.000001, the loss became normal.

 

 

 
