【Question Title】: Input 0 is incompatible with layer conv2d_transpose_1: expected ndim=4, found ndim=2
【Posted】: 2018-06-26 14:10:03
【Question Description】:

I can't get the layer into the right shape before feeding it through the transposed convolutions. I don't know how to invert the Flatten layer back into a convolutional feature map. Thanks for your help!

import keras
import keras.layers as L

def build_deep_autoencoder(img_shape, code_size):
    H, W, C = img_shape

    # encoder
    encoder = keras.models.Sequential()
    encoder.add(L.InputLayer(img_shape))
    encoder.add(L.Conv2D(32, (3,3), padding='same', activation='elu', name='layer_1'))
    encoder.add(L.MaxPooling2D((3,3), padding='same', name='max_pooling_1'))
    encoder.add(L.Conv2D(64, (3,3), padding='same', activation='elu', name='layer_2'))
    encoder.add(L.MaxPooling2D((3,3), padding='same', name='max_pooling_2'))
    encoder.add(L.Conv2D(128, (3,3), padding='same', activation='elu', name='layer_3'))
    encoder.add(L.MaxPooling2D((3,3), padding='same', name='max_pooling_3'))
    encoder.add(L.Conv2D(256, (3,3), padding='same', activation='elu', name='layer_4'))
    encoder.add(L.MaxPooling2D((3,3), padding='same', name='max_pooling_4'))

    encoder.add(L.Flatten())
    encoder.add(L.Dense(256))

    # decoder
    decoder = keras.models.Sequential()
    decoder.add(L.InputLayer((code_size,)))
    decoder.add(L.Dense(256))
    decoder.add(L.Conv2DTranspose(filters=128, kernel_size=(3, 3), strides=2, activation='elu', padding='same'))
    decoder.add(L.Conv2DTranspose(filters=64, kernel_size=(3, 3), strides=2, activation='elu', padding='same'))
    decoder.add(L.Conv2DTranspose(filters=32, kernel_size=(3, 3), strides=2, activation='elu', padding='same'))
    decoder.add(L.Conv2DTranspose(filters=3, kernel_size=(3, 3), strides=2, activation=None, padding='same'))

    return encoder, decoder

【Discussion】:

    Tags: deep-learning conv-neural-network autoencoder deconvolution


    【Solution 1】:

    In your decoder, instead of adding a Dense layer of 256 units, expand back to the encoder's pre-flatten size and reshape it into a 4-D tensor:

    decoder.add(L.Dense(2*2*256))             # match the encoder's flattened output size
    decoder.add(L.Reshape((2,2,256)))         # un-flatten to (height, width, channels)
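
    The root cause of the error is that `Conv2DTranspose` expects a 4-D input `(batch, height, width, channels)`, while `Dense` emits a 2-D `(batch, features)` tensor. The `Reshape` layer is pure shape bookkeeping; a minimal NumPy sketch of the same bookkeeping (assuming, as the answer does, that the encoder's last feature map is 2×2×256):

    ```python
    import numpy as np

    # A batch of 8 "codes" as the Dense layer would emit them:
    # shape (batch, 2*2*256) -> ndim=2, which Conv2DTranspose rejects.
    flat = np.zeros((8, 2 * 2 * 256))
    assert flat.ndim == 2

    # Reshape each 1024-element vector back into a 2x2x256 feature map,
    # restoring the 4-D (batch, height, width, channels) layout that the
    # transposed convolutions expect.
    unflat = flat.reshape((-1, 2, 2, 256))
    assert unflat.shape == (8, 2, 2, 256)
    ```

    The exact target shape `(2, 2, 256)` depends on your input image size after the four pooling steps, so check `encoder.summary()` for the shape just before `Flatten` and use that in the `Reshape`.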
    

    【Comments】:
