【Question title】: The input to a flatten layer must be a tensor
【Posted】: 2019-10-13 16:20:18
【Question description】:

I have the following Keras model, which runs fine:

model = Sequential()
model.add(Flatten(input_shape=(1,1,68)))
model.add(Dense(35,activation='linear'))
model.add(LeakyReLU(alpha=.001))
model.add(Dense(nb_actions))
model.add(Activation('linear'))

Then I tried to build something more elaborate, as follows:

model = Sequential()
input1 = keras.layers.Flatten(input_shape=(1,1,68))
x1 = keras.layers.Dense(68, activation='linear')(input1)
x2 = keras.layers.Dense(68, activation='relu')(input1)
x3 = keras.layers.Dense(68, activation='sigmoid')(input1)
add1 = keras.layers.Add()([x1, x2, x3])
activ1 = keras.layers.advanced_activations.LeakyReLU(add1)

x4 = keras.layers.Dense(34, activation='linear')(activ1)
x5 = keras.layers.Dense(34, activation='relu')(activ1)
x6 = keras.layers.Dense(34, activation='sigmoid')(activ1)
add2 = keras.layers.Add()([x4, x5, x6])
activ2 = keras.layers.advanced_activations.LeakyReLU(add2)

x7 = keras.layers.Dense(17, activation='linear')(activ2)
x8 = keras.layers.Dense(17, activation='relu')(activ2)
x9 = keras.layers.Dense(17, activation='sigmoid')(activ2)
add2 = keras.layers.Add()([x4, x5, x6])
activ3 = keras.layers.advanced_activations.LeakyReLU(add3)

final_layer=keras.layers.Dense(nb_actions, activation='linear')(activ3)
model = keras.models.Model(inputs=input1, outputs=final_layer)

As you can see in the code above, I keep the same input from the Flatten layer and simply sum layers that have the same number of neurons but different activations. My problem is that whenever I try to run this code, I get the following error:

Using TensorFlow backend.
Traceback (most recent call last):
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 279, in assert_input_compatibility
    K.is_keras_tensor(x)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 474, in is_keras_tensor
    str(type(x)) + '`. '
ValueError: Unexpectedly found an instance of type `<class 'keras.layers.core.Flatten'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main.py", line 64, in <module>
    x1 = keras.layers.Dense(68, activation='linear')(input1)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 414, in __call__
    self.assert_input_compatibility(inputs)
  File "/home/anselmo/virtualenvironment/virtualenvironment_anselmo2/lib/python3.5/site-packages/keras/engine/base_layer.py", line 285, in assert_input_compatibility
    str(inputs) + '. All inputs to the layer '
ValueError: Layer dense_1 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.core.Flatten'>. Full input: [<keras.layers.core.Flatten object at 0x7f0a145d6438>]. All inputs to the layer should be tensors.

This error does not occur when I run the first snippet. So why does changing the network design produce this error? How can I fix it? Where is my mistake?

【Question discussion】:

    Tags: python tensorflow keras deep-learning keras-layer


    【Solution 1】:

    What you are attempting in the second snippet is a Keras functional model, not a Sequential one. You should change the first line from model = Sequential() to input1 = Input(shape=(1, 1, 68))

    For more details, see the official documentation.
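    Concretely, the change might look like the minimal sketch below, which uses the shapes from the question. The graph starts from an Input() tensor and the Flatten layer is *called* on that tensor; a bare Flatten layer object is not a symbolic tensor, which is exactly what the traceback complains about. (The modern keras.layers import path is assumed here.)

    ```python
    from keras.layers import Input, Flatten, Dense

    # Input() yields a symbolic tensor; a bare Flatten layer object does not.
    input1 = Input(shape=(1, 1, 68))
    flat = Flatten()(input1)          # call the layer on the tensor
    x1 = Dense(68, activation='linear')(flat)
    ```

    Every subsequent Dense layer is then applied to flat (or to a later tensor), never to the layer object itself.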

    【Discussion】:

    • The error is gone. Now I get another one: ValueError: setting an array element with a sequence. It is probably because I am reading the data with Pandas. Can I fix it with the same code I showed in this thread? Thanks.
    • You should convert your data to a numpy array.
    • Thanks for the comment. However, the error happens at the LeakyReLU activation layer (activ1 = keras.layers.advanced_activations.LeakyReLU(add1)); when I remove that part of the code, the network runs fine. I would appreciate your help; otherwise I can open a new thread. Thanks.
    • Share the traceback and I'll see whether I can spot the error.
    • That's because your syntax is wrong. It should be activ1 = keras.layers.advanced_activations.LeakyReLU(alpha=some_value_in_float)(add1)
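    Putting the answer and the comments together, one summed block of the intended network could be sketched as below. This is a sketch under stated assumptions, not the asker's exact code: nb_actions is set to a placeholder value of 4, alpha=0.001 is taken from the question's first snippet, the modern keras.layers.LeakyReLU import path is used instead of keras.layers.advanced_activations, and only the first of the three parallel blocks is shown (the others follow the same pattern).

    ```python
    import keras
    from keras.layers import Input, Flatten, Dense, Add, LeakyReLU

    nb_actions = 4  # placeholder; use your real action count

    # Start the graph from an Input tensor and call Flatten on it.
    inputs = Input(shape=(1, 1, 68))
    flat = Flatten()(inputs)

    # One parallel block: same width, different activations, summed.
    x1 = Dense(68, activation='linear')(flat)
    x2 = Dense(68, activation='relu')(flat)
    x3 = Dense(68, activation='sigmoid')(flat)
    add1 = Add()([x1, x2, x3])
    # LeakyReLU takes alpha in the constructor; the layer is then
    # called on the tensor, per the fix in the last comment.
    activ1 = LeakyReLU(alpha=0.001)(add1)

    final_layer = Dense(nb_actions, activation='linear')(activ1)
    model = keras.models.Model(inputs=inputs, outputs=final_layer)
    ```

    Note also that the question's third block assigns to add2 a second time and references an undefined add3; once the functional-API error is resolved, that copy-paste slip would surface next.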