【Question Title】: Is the self.scale variable defined in the constructor constant?
【Posted】: 2018-05-25 02:38:48
【Question Description】:

I don't understand how Lasagne functions work. Consider the following code.

import numpy as np
import lasagne

class WScaleLayer(lasagne.layers.Layer):
    def __init__(self, incoming, **kwargs):
        super(WScaleLayer, self).__init__(incoming, **kwargs)
        W = incoming.W.get_value()
        scale = np.sqrt(np.mean(W ** 2))
        incoming.W.set_value(W / scale)
        self.scale = self.add_param(scale, (), name='scale', trainable=False)
        self.b = None
        if hasattr(incoming, 'b') and incoming.b is not None:
            b = incoming.b.get_value()
            self.b = self.add_param(b, b.shape, name='b', regularizable=False)
            del incoming.params[incoming.b]
            incoming.b = None
        self.nonlinearity = lasagne.nonlinearities.linear
        if hasattr(incoming, 'nonlinearity') and incoming.nonlinearity is not None:
            self.nonlinearity = incoming.nonlinearity
            incoming.nonlinearity = lasagne.nonlinearities.linear

    def get_output_for(self, v, **kwargs):
        v = v * self.scale
        if self.b is not None:
            pattern = ['x', 0] + ['x'] * (v.ndim - 2)
            v = v + self.b.dimshuffle(*pattern)
        return self.nonlinearity(v)

Can you tell me whether self.scale stays constant during training after initialization?

【Question Discussion】:

    Tags: python deep-learning theano lasagne


    【Solution 1】:

    I'm not a Lasagne expert, but unless you do something unusual, self.scale should not change during training: it is registered with add_param(..., trainable=False), so the optimizer never updates it.

    That said, this code looks odd. You initialize the scale from the initial value of the incoming layer's weights. Is that really what you want?
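    The reason self.scale stays fixed is Lasagne's parameter-tag mechanism: add_param records a set of tags for each parameter, and training code builds updates only for the parameters returned by get_all_params(layer, trainable=True). The following is a minimal sketch of that tag-and-filter idea in plain Python (TinyLayer and its methods are hypothetical stand-ins, not real Lasagne APIs, and ordinary floats replace Theano shared variables):

```python
class TinyLayer:
    """Toy stand-in for the parameter registry of lasagne.layers.Layer."""

    def __init__(self):
        self._params = {}  # name -> [value, set of tags]

    def add_param(self, value, name, trainable=True):
        # Lasagne's add_param similarly tags each parameter;
        # trainable=False means the tag set lacks 'trainable'.
        tags = {'trainable'} if trainable else set()
        self._params[name] = [value, tags]
        return name

    def trainable_names(self):
        # Analogue of lasagne.layers.get_all_params(layer, trainable=True):
        # filter parameters by the 'trainable' tag.
        return [n for n, (_, tags) in self._params.items()
                if 'trainable' in tags]

    def sgd_step(self, grads, lr=0.1):
        # Updates are built only for trainable parameters, so anything
        # added with trainable=False is never touched.
        for name in self.trainable_names():
            self._params[name][0] -= lr * grads[name]


layer = TinyLayer()
layer.add_param(2.0, 'scale', trainable=False)
layer.add_param(1.0, 'b')  # trainable by default

layer.sgd_step({'b': 0.5})
# 'scale' is still 2.0; only 'b' was updated (1.0 - 0.1 * 0.5 = 0.95)
```

    In the question's code this corresponds to `self.scale = self.add_param(scale, (), name='scale', trainable=False)`: the value is fixed at construction time and excluded from gradient updates thereafter.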

    【Discussion】:
