Min-Max normalization Layer in Caffe


I am new to caffe, and I am trying to min-max normalize the output of a convolution layer to the range 0 to 1:

out = (x - x_min) / (x_max - x_min)
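In plain numpy terms (a minimal sketch, not caffe code), this formula is:

```python
import numpy as np

def min_max_normalize(x):
    """Linearly rescale all values of x into the range [0, 1]."""
    x_min = x.min()
    x_max = x.max()
    return (x - x_min) / (x_max - x_min)

out = min_max_normalize(np.array([2.0, 4.0, 6.0]))
# out is [0.0, 0.5, 1.0]
```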

I have checked many layers (Power, Scale, Batch Normalization, MVN), but none of them gives a min-max normalized output as a layer. Can anyone help me?

************

name: "normalizationCheck"
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 1 dim: 1 dim: 512 dim: 512 } }
}
layer {
  name: "normalize1"
  type: "Power"
  bottom: "data"
  top: "normalize1"
  power_param { 
    shift: 0
    scale: 0.00392156862
    power: 1
   }
}
layer {
    bottom: "normalize1"
    top: "Output"
    name: "conv1"
    type: "Convolution"
    convolution_param {
        num_output: 1
        kernel_size: 1
        pad: 0
        stride: 1
        bias_term: false
        weight_filler {
        type: "constant"
        value: 1
        }
    }
}
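For reference, the Power layer used above applies a fixed elementwise transform, which is why it cannot produce a data-dependent min-max normalization. A numpy sketch of what it computes:

```python
import numpy as np

# The Power layer computes (shift + scale * x) ** power elementwise.
# 0.00392156862 is approximately 1/255, so with shift=0 and power=1 it
# maps 8-bit pixel values in [0, 255] to roughly [0, 1] -- a fixed
# rescaling, independent of the actual min and max of the data.
def power_layer(x, shift=0.0, scale=0.00392156862, power=1.0):
    return (shift + scale * x) ** power

y = power_layer(np.array([0.0, 127.5, 255.0]))
# y is approximately [0.0, 0.5, 1.0]
```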

The output of the convolution layer is not in normalized form; I want the min-max normalized output as a layer. I can do it manually in code, but I need it as a layer. Thanks.

You can write your own C++ layer following these guidelines; you will see how to implement a "forward only" layer on that page.

Alternatively, you can implement the layer in python and execute it in caffe via a "Python" layer.

First, implement your layer in python, and store it in '/path/to/my_min_max_layer.py':

import caffe
import numpy as np
class min_max_forward_layer(caffe.Layer):
  def setup(self, bottom, top):
    # make sure only one input and one output
    assert len(bottom)==1 and len(top)==1, "min_max_layer expects a single input and a single output"
  def reshape(self, bottom, top):
    # reshape output to be identical to input
    top[0].reshape(*bottom[0].data.shape)
  def forward(self, bottom, top):
    # min-max normalize the entire input blob to [0, 1]
    in_ = np.array(bottom[0].data)
    x_min = in_.min()
    x_max = in_.max()
    top[0].data[...] = (in_-x_min)/(x_max-x_min)
  def backward(self, top, propagate_down, bottom):
    # backward pass is not implemented!
    pass

Once the layer is implemented in python, you can simply add it to your net (make sure '/path/to' is in your $PYTHONPATH):

layer {
  name: "my_min_max_forward_layer"
  type: "Python"
  bottom: "name_your_input_here"
  top: "name_your_output_here"
  python_param {
    module: "my_min_max_layer"  # name of python file to be imported
    layer: "min_max_forward_layer" # name of layer class
  }
}
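If you want to sanity-check the forward logic without a caffe build, the same computation can be exercised on stand-in blobs (`FakeBlob` here is a hypothetical mock for illustration, not part of caffe's API):

```python
import numpy as np

# Hypothetical stand-in for a caffe blob: just a .data array, enough
# to exercise the forward computation of the layer above.
class FakeBlob:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

bottom = [FakeBlob([[3.0, 5.0], [7.0, 11.0]])]
top = [FakeBlob(np.zeros((2, 2)))]

# Same computation as min_max_forward_layer.forward():
in_ = np.array(bottom[0].data)
x_min, x_max = in_.min(), in_.max()
top[0].data[...] = (in_ - x_min) / (x_max - x_min)
# top[0].data now spans exactly 0.0 to 1.0
```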