【Question Title】: CNN with multiple Conv3D layers in Keras
【Posted】: 2018-09-19 02:24:23
【Question】:

I am trying to build a CNN model with multiple Conv3D layers in Keras to work on the CIFAR-10 dataset, but I am running into the following error:

ValueError: ('The specified size contains a dimension

Below is the code I am trying to run.

from __future__ import print_function
import keras
from keras.datasets import cifar10
from keras.preprocessing.image import ImageDataGenerator
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv3D, MaxPooling3D
from keras.optimizers import SGD
import os
from keras import backend as K

batch_size = 128
num_classes = 10
epochs = 20
learning_rate = 0.01

(x_train, y_train), (x_test, y_test) = cifar10.load_data()
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')
img_rows = x_train.shape[1]
img_cols = x_train.shape[2]
colors = x_train.shape[3]


if K.image_data_format() == 'channels_first':
    x_train = x_train.reshape(x_train.shape[0], 1,colors, img_rows, img_cols)
    x_test = x_test.reshape(x_test.shape[0], 1,colors, img_rows, img_cols)
    input_shape = (1, colors, img_rows, img_cols)
else:
    x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, colors, 1)
    x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, colors, 1)
    input_shape = (img_rows, img_cols, colors, 1)


# Convert class vectors to binary class matrices.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

model = Sequential()
model.add(Conv3D(32, kernel_size=(3, 3, 3),activation='relu',input_shape=input_shape))
model.add(Conv3D(32, kernel_size=(3, 3, 3),activation='relu'))
model.add(MaxPooling3D(pool_size=(2, 2, 1)))
model.add(Dropout(0.25))
model.add(Conv3D(64, kernel_size=(3, 3, 3),activation='relu'))
model.add(Conv3D(64, kernel_size=(3, 3, 3),activation='relu'))
model.add(MaxPooling3D(pool_size=(2, 2, 1)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(256, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

sgd=SGD(lr=learning_rate)


model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=sgd,
              metrics=['accuracy'])

history = model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          validation_data=(x_test, y_test))

score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

I have tried a single Conv3D and it works, but the accuracy is very low. The code snippet is below:

model = Sequential()
model.add(Conv3D(32, kernel_size=(3, 3, 3),activation='relu',input_shape=input_shape))
model.add(MaxPooling3D(pool_size=(2, 2, 1)))
model.add(Flatten())
model.add(Dense(256, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

【Question Comments】:

    Tags: tensorflow deep-learning keras conv-neural-network convolution


    【Solution 1】:

    Problem

    The problem is the color channel: it starts out equal to 3, and you are applying a convolution of kernel size 3 with padding='valid'. After the first Conv3D, the output tensor is:

    (None, 30, 30, 1, 32)
    

    ... and no further convolution can be applied along that dimension. The simple example you provided works only because it has a single convolutional layer.
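    To see where the collapse comes from: with padding='valid' and stride 1, the output size along each axis is input − kernel + 1. A quick sketch in plain Python (no Keras needed) applied to the CIFAR-10 input of (rows, cols, colors) = (32, 32, 3) with a (3, 3, 3) kernel:

    ```python
    def valid_conv_shape(input_shape, kernel_size):
        """Output spatial shape of a stride-1 convolution with padding='valid'."""
        return tuple(d - k + 1 for d, k in zip(input_shape, kernel_size))

    # First Conv3D on a (32, 32, 3) CIFAR-10 sample:
    after_first = valid_conv_shape((32, 32, 3), (3, 3, 3))
    print(after_first)   # (30, 30, 1) -- the color axis has collapsed to 1

    # A second (3, 3, 3) Conv3D would compute 1 - 3 + 1 on that axis:
    after_second = valid_conv_shape(after_first, (3, 3, 3))
    print(after_second)  # (28, 28, -1) -- a negative dimension, hence the ValueError
    ```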

    Solution

    One option is to set padding='same' so that the tensor shape is preserved:

    (None, 32, 32, 3, 32)
    

    However, in my opinion convolving over the color dimension does not add much value, so I would go with this model instead:

    model = Sequential()
    model.add(Conv3D(32, kernel_size=(3, 3, 1), activation='relu', input_shape=input_shape))
    model.add(Conv3D(32, kernel_size=(3, 3, 1), activation='relu'))
    model.add(MaxPooling3D(pool_size=(2, 2, 1)))
    
    model.add(Dropout(0.25))
    model.add(Conv3D(64, kernel_size=(3, 3, 1), activation='relu'))
    model.add(Conv3D(64, kernel_size=(3, 3, 1), activation='relu'))
    model.add(MaxPooling3D(pool_size=(2, 2, 1)))
    model.add(Dropout(0.25))
    model.add(Flatten())
    model.add(Dense(256, activation='relu'))
    model.add(Dense(10, activation='softmax'))
    
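    As a sanity check, tracing the shapes through this model with plain Python arithmetic (assuming stride 1 and padding='valid' throughout) shows the color axis staying at 3 all the way to Flatten:

    ```python
    def conv_valid(shape, kernel):
        """Stride-1 'valid' convolution: each axis shrinks by kernel - 1."""
        return tuple(d - k + 1 for d, k in zip(shape, kernel))

    def pool(shape, size):
        """Max pooling: each axis is divided by the pool size."""
        return tuple(d // s for d, s in zip(shape, size))

    shape = (32, 32, 3)                   # CIFAR-10 (rows, cols, colors)
    shape = conv_valid(shape, (3, 3, 1))  # -> (30, 30, 3)
    shape = conv_valid(shape, (3, 3, 1))  # -> (28, 28, 3)
    shape = pool(shape, (2, 2, 1))        # -> (14, 14, 3)
    shape = conv_valid(shape, (3, 3, 1))  # -> (12, 12, 3)
    shape = conv_valid(shape, (3, 3, 1))  # -> (10, 10, 3)
    shape = pool(shape, (2, 2, 1))        # -> (5, 5, 3)
    print(shape)                          # (5, 5, 3)
    print(5 * 5 * 3 * 64)                 # 4800 values feeding the Dense(256) layer
    ```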

    【Comments】:

      【Solution 2】:

      In practice, dimensions are preserved in the convolutional layers, and downsampling happens in the pooling layers. The problem here is that you are losing a dimension. So instead of convolving over all 3 channels, either set padding='same' or use a 3×3 filter with a depth of 1.

      【Comments】:
