【Question Title】: custom layer with Tensorflow 2.1 problem with the output shape
【Posted】: 2020-05-08 07:53:22
【Question Description】:

I am trying to have a custom layer return a tensor of shape (25, 1), but apparently a batch_size needs to be passed along as well (I get an error from the next layer). I looked for examples but could not figure out how to specify the output shape.

Also, I need an arbitrary output shape that is independent of the input size, because the computation (not part of the example below) will always return a fixed number of values.

I tried the following:

import tensorflow as tf
from tensorflow.keras import layers

class SimpleLayer(layers.Layer):
    def __init__(self, **kwargs):
        super(SimpleLayer, self).__init__(**kwargs)
        self.baseline = tf.Variable(initial_value=0.1, trainable=True)

    def call(self, inputs):
        print("in call inputs:", inputs.shape)
        ret = tf.zeros((25, 1)) + self.baseline
        print("Ret:", ret, "Shape", tf.shape(ret))
        return ret

This returns:

Ret: Tensor("om/add:0", shape=(25, 1), dtype=float32) Shape Tensor("om/Shape:0", shape=(2,), dtype=int32)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
inputs (InputLayer)          [(None, 150, 1)]          0         
_________________________________________________________________
dense (Dense)                (None, 150, 256)          512       
_________________________________________________________________
om (SimpleLayer)             (25, 1)                   1         
=================================================================

But this produces an output shape of (25, 1) rather than the expected (None, 25, 1).

Then I tried:

class SimpleLayer(layers.Layer):
    def __init__(self, **kwargs):
        super(SimpleLayer, self).__init__(**kwargs)
        self.baseline = tf.Variable(initial_value=0.1, trainable=True)

    def call(self, inputs):
        print("in call inputs:", inputs.shape)
        ret = tf.zeros((25, 1)) + self.baseline
        return ret

and got the error:

TypeError: Expected int32, got None of type 'NoneType' instead.

Any suggestions?

【Question Comments】:

    Tags: python tensorflow


    【Solution 1】:

    I suggest you make use of the input data inside the call method; otherwise the layer has no purpose.

    Here is a dummy example, and it works perfectly:

    class SimpleLayer(tf.keras.layers.Layer):
        def __init__(self, **kwargs):
            super(SimpleLayer, self).__init__(**kwargs)
            self.baseline = tf.Variable(initial_value=0.1, trainable=True)

        def call(self, inputs):
            ret = inputs + self.baseline
            return ret

        def compute_output_shape(self, input_shape):
            return (input_shape[0], input_shape[1], input_shape[2])
    

    Creating a model with SimpleLayer:

    from tensorflow.keras.layers import Dense, Input
    from tensorflow.keras.models import Model

    inp = Input(shape=(25, 1))
    x = SimpleLayer()(inp)
    out = Dense(3)(x)
    model = Model(inp, out)
    model.summary()
    

    Summary:

    Layer (type)                 Output Shape              Param #   
    =================================================================
    input_10 (InputLayer)        [(None, 25, 1)]           0         
    _________________________________________________________________
    simple_layer_16 (SimpleLayer (None, 25, 1)             1         
    _________________________________________________________________
    dense_22 (Dense)             (None, 25, 3)             6         
    =================================================================
    Total params: 7
    Trainable params: 7
    Non-trainable params: 0
    
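As a sanity check (not part of the original answer), the example above can be rebuilt in one self-contained script and called on a dummy batch, to confirm that the batch dimension survives end to end; the Total params figure also matches the summary above (6 from the Dense layer plus the single baseline scalar):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

class SimpleLayer(tf.keras.layers.Layer):
    """Adds a single trainable scalar to its input, keeping the shape."""
    def __init__(self, **kwargs):
        super(SimpleLayer, self).__init__(**kwargs)
        self.baseline = tf.Variable(initial_value=0.1, trainable=True)

    def call(self, inputs):
        return inputs + self.baseline

inp = Input(shape=(25, 1))
x = SimpleLayer()(inp)
out = Dense(3)(x)
model = Model(inp, out)

# A batch of 4 samples flows through with the batch axis intact.
y = model(np.zeros((4, 25, 1), dtype=np.float32))
print(y.shape)  # (4, 25, 3)
```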

    EDIT:

    I tried to work around the problem of the None dimension in this way:

    class SimpleLayer(tf.keras.layers.Layer):
        def __init__(self, **kwargs):
            super(SimpleLayer, self).__init__(**kwargs)
            self.baseline = tf.Variable(initial_value=0.1, trainable=True, dtype=tf.float64)

        def call(self, inputs):
            ret = tf.zeros((1, 25, 1), dtype=tf.float64) + self.baseline
            ret = tf.compat.v1.placeholder_with_default(ret, (None, 25, 1))
            return ret

    inp = Input((150, 1))
    x = Dense(256)(inp)
    x = SimpleLayer()(x)
    x = Dense(10)(x)

    model = Model(inp, x)
    model.summary()
    

    Summary:

    _________________________________________________________________
    Layer (type)                 Output Shape              Param #   
    =================================================================
    input_34 (InputLayer)        [(None, 150, 1)]          0         
    _________________________________________________________________
    dense_68 (Dense)             (None, 150, 256)          512       
    _________________________________________________________________
    simple_layer_9 (SimpleLayer) (None, 25, 1)             1         
    _________________________________________________________________
    dense_69 (Dense)             (None, 25, 10)            20        
    =================================================================
    Total params: 533
    Trainable params: 533
    Non-trainable params: 0
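For what it's worth, `tf.compat.v1.placeholder_with_default` is a TF1 compatibility API. A sketch of an eager-friendly alternative (my own variant, not from the answer above) is to read the dynamic batch size from the inputs and tile the fixed (25, 1) result across it; this also satisfies the asker's requirement that the number of output values be independent of the input size:

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input, Layer
from tensorflow.keras.models import Model

class FixedOutputLayer(Layer):
    """Hypothetical variant: always emits 25 values per sample,
    tiled across whatever batch size arrives at call time."""
    def __init__(self, **kwargs):
        super(FixedOutputLayer, self).__init__(**kwargs)
        self.baseline = tf.Variable(initial_value=0.1, trainable=True)

    def call(self, inputs):
        batch_size = tf.shape(inputs)[0]           # dynamic, works eagerly and in graphs
        ret = tf.zeros((1, 25, 1)) + self.baseline # shape (1, 25, 1)
        return tf.tile(ret, [batch_size, 1, 1])    # shape (batch, 25, 1)

inp = Input(shape=(150, 1))
x = Dense(256)(inp)
x = FixedOutputLayer()(x)
x = Dense(10)(x)
model = Model(inp, x)

y = model(tf.zeros((4, 150, 1)))
print(y.shape)  # (4, 25, 10)
```

Note that the gradient of the loss with respect to `baseline` still flows through the `tf.tile`, so the scalar remains trainable.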
    

    【Comments】:
