[Question Title]: Shapes Incompatible in Keras with CNN
[Posted]: 2020-06-12 14:19:09
[Question]:

I am implementing a network that takes a 2D image and outputs a 3D binary voxel grid for it. I am using an autoencoder with an LSTM module. The images and voxels currently have the following shapes:

print(x_train.shape)
print(y_train.shape)
>>> (792, 127, 127, 3)
>>> (792, 32, 32, 32)

792 RGB images, 127 x 127

792 corresponding 3D binary voxel tensors (32 x 32 x 32)

Running the following encoder model:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, LeakyReLU, MaxPooling2D, Dense, Flatten, Conv3D, MaxPool3D, GRU, Reshape, UpSampling3D
from tensorflow import keras



enc_filter = [96, 128, 256, 256, 256, 256]
fc_filters = [1024]

model = Sequential()
epochs = 5
batch_size = 24
input_shape=(127,127,3)

model.add(Conv2D(enc_filter[0], kernel_size=(7, 7), strides=(1,1),activation='relu',input_shape=input_shape))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(LeakyReLU(alpha=0.1))
model.add(Flatten())
model.add(Dense(1024, activation='relu'))

model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.SGD(lr=0.01),
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs)

produces the following:

ValueError: Shapes (24, 32, 32, 32) and (24, 1024) are incompatible

Can someone explain why these shapes are incompatible? I have tried removing layers and testing other ones, but all of them produce compatibility problems.

[Comments]:

    Tags: keras deep-learning dimensionality-reduction


    [Solution 1]:

    Your model ends in a Dense layer that outputs 1024 values per sample, but you are passing targets of shape (32, 32, 32).

    You need to reshape the model output so that it matches the shape of the targets.
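The mismatch is pure shape arithmetic and can be seen without Keras at all (a minimal numpy sketch; the variable names are illustrative, not from the original post):

```python
import numpy as np

# The Dense(1024) head produces (batch, 1024) predictions, while the
# targets y_train are (batch, 32, 32, 32); Keras compares these per-sample
# shapes when building the loss, hence the ValueError.
batch_size = 24
pred = np.zeros((batch_size, 1024))
target = np.zeros((batch_size, 32, 32, 32))

assert pred.shape[1:] != target.shape[1:]  # incompatible: (1024,) vs (32, 32, 32)
assert 32 * 32 * 32 == 32768               # a voxel grid has 32768 values, not 1024
```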

    Here is a dummy model; you will need to change the parameters to find a suitable architecture.

    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Conv2D, LeakyReLU, MaxPooling2D, Dense, Flatten, Conv3D, MaxPool3D, GRU, Reshape, UpSampling3D
    from tensorflow import keras
    
    import numpy as np
    
    # dummy data
    x_train = np.random.randn(792, 127, 127, 3)
    y_train = np.random.randn(792, 32, 32, 32)
    
    enc_filter = [96, 128, 256, 2]
    fc_filters = [1024]
    
    model = Sequential()
    epochs = 5
    batch_size = 24
    input_shape=(127,127,3)
    
    model.add(Conv2D(enc_filter[0], kernel_size=(7, 7), strides=(1,1),activation='relu',input_shape=input_shape))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(LeakyReLU(alpha=0.1))
    
    model.add(Conv2D(enc_filter[1], kernel_size=(7, 7), strides=(1,1), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(LeakyReLU(alpha=0.1))
    
    model.add(Conv2D(enc_filter[2], kernel_size=(7, 7), strides=(1,1), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(LeakyReLU(alpha=0.1))
    
    model.add(Conv2D(enc_filter[3], kernel_size=(7, 7), strides=(1,1), activation='relu')) # bottleneck
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(LeakyReLU(alpha=0.1))
    
    model.add(Flatten())
    model.add(Dense(32*32*32, activation='relu'))
    
    model.add(Reshape((32,32,32)))
    
    model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.SGD(learning_rate=0.01),
                  metrics=['accuracy'])
    
    model.summary()
    model.fit(x_train, y_train,
              batch_size=batch_size,
              epochs=epochs)
    
    Model: "sequential_10"
    _________________________________________________________________
    Layer (type)                 Output Shape              Param #   
    =================================================================
    conv2d_24 (Conv2D)           (None, 121, 121, 96)      14208     
    _________________________________________________________________
    max_pooling2d_24 (MaxPooling (None, 60, 60, 96)        0         
    _________________________________________________________________
    leaky_re_lu_24 (LeakyReLU)   (None, 60, 60, 96)        0         
    _________________________________________________________________
    conv2d_25 (Conv2D)           (None, 54, 54, 128)       602240    
    _________________________________________________________________
    max_pooling2d_25 (MaxPooling (None, 27, 27, 128)       0         
    _________________________________________________________________
    leaky_re_lu_25 (LeakyReLU)   (None, 27, 27, 128)       0         
    _________________________________________________________________
    conv2d_26 (Conv2D)           (None, 21, 21, 256)       1605888   
    _________________________________________________________________
    max_pooling2d_26 (MaxPooling (None, 10, 10, 256)       0         
    _________________________________________________________________
    leaky_re_lu_26 (LeakyReLU)   (None, 10, 10, 256)       0         
    _________________________________________________________________
    conv2d_27 (Conv2D)           (None, 4, 4, 2)           25090     
    _________________________________________________________________
    max_pooling2d_27 (MaxPooling (None, 2, 2, 2)           0         
    _________________________________________________________________
    leaky_re_lu_27 (LeakyReLU)   (None, 2, 2, 2)           0         
    _________________________________________________________________
    flatten_10 (Flatten)         (None, 8)                 0         
    _________________________________________________________________
    dense_1 (Dense)              (None, 32768)             294912    
    _________________________________________________________________
    reshape_10 (Reshape)         (None, 32, 32, 32)        0         
    =================================================================
    Total params: 2,542,338
    Trainable params: 2,542,338
    Non-trainable params: 0
    
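The spatial sizes in the summary follow directly from 'valid' convolution and pooling arithmetic; they can be checked by hand (a sketch assuming kernel 7, stride 1, and pool size 2 throughout, as in the model above):

```python
# out = (in - kernel) // stride + 1 for a 'valid' Conv2D;
# MaxPooling2D with pool_size 2 then halves (floor division) the result.
def conv_out(size, kernel=7, stride=1):
    return (size - kernel) // stride + 1

def pool_out(size, pool=2):
    return size // pool

size, trace = 127, []
for _ in range(4):          # four Conv2D + MaxPooling2D stages
    size = conv_out(size)
    trace.append(size)
    size = pool_out(size)
    trace.append(size)

print(trace)                # [121, 60, 54, 27, 21, 10, 4, 2], matching the summary
print(2 * 2 * 2)            # 8 features reach Flatten (2 x 2 spatial, 2 channels)
```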

    In the summary you can see that I added a Dense layer with 32x32x32 neurons and then reshaped it.
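One caveat (an editorial note, not part of the original answer): the targets here are binary voxels, so binary cross-entropy over a sigmoid output, i.e. `Dense(32*32*32, activation='sigmoid')` with `loss='binary_crossentropy'`, is usually a better fit than `categorical_crossentropy`, which expects a probability distribution over classes. A minimal numpy sketch of the per-voxel loss:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy; y_pred assumed in (0, 1), e.g. from a sigmoid."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1 - y_true) * np.log(1 - y_pred)))

# Toy voxel values: confident, mostly-correct predictions give a small loss.
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.1, 0.8])
loss = binary_crossentropy(y_true, y_pred)
```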

    [Discussion]:
