【Title】: Why does the accuracy of my image segmentation model not change?
【Posted】: 2021-07-17 02:55:03
【Description】:

I am working on a histopathology image segmentation project. I built a model for it, but its accuracy stays exactly the same across epochs: it is always 0.5000, and I need to improve it. I have already tried changing the learning rate, the batch size, the number of epochs (both increasing and decreasing), and the optimizer (SGD, RMSprop, Adam), but nothing changes. What can I do? Thanks in advance for your help.

Here is my model code:

# imports needed by this snippet (TensorFlow Keras)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Activation, BatchNormalization, Dense,
                                     Dropout, Flatten, MaxPooling2D,
                                     SeparableConv2D)

depth = 3

class Net:
    @staticmethod
    def build(img_width, img_height, depth, classes):
        model = Sequential()
        chanDim = -1
        inputShape = (img_height, img_width, depth)
        model.add(SeparableConv2D(32, (3, 3), padding="same", input_shape=inputShape))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))
        model.add(MaxPooling2D(pool_size=(2, 2)))
        model.add(Dropout(0.25))
        # (CONV => RELU => POOL) * 2
        model.add(SeparableConv2D(64, (3, 3), padding="same"))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))   
        model.add(SeparableConv2D(64, (3, 3), padding="same"))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))
        model.add(MaxPooling2D(pool_size=(2, 2)))
        model.add(Dropout(0.25))

     
        model.add(SeparableConv2D(128, (3, 3), padding="same"))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))
        model.add(SeparableConv2D(128, (3, 3), padding="same"))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))
        model.add(SeparableConv2D(128, (3, 3), padding="same"))
        model.add(Activation("relu"))
        model.add(BatchNormalization(axis=chanDim))
        model.add(MaxPooling2D(pool_size=(2, 2)))
        model.add(Dropout(0.25))
        
       
        model.add(Flatten())
        model.add(Dense(256))
        model.add(Activation("relu"))
        model.add(BatchNormalization())
        model.add(Dropout(0.2))

      
        model.add(Dense(64))
        model.add(Activation("softmax"))
        model.add(Dropout(1))

        model.summary()
        return model
# the model is compiled and the generators/step counts are defined elsewhere
model_history = model.fit_generator(img_train_gen,
                                    steps_per_epoch=train_steps,
                                    epochs=10,
                                    verbose=1,
                                    validation_data=img_val_gen,
                                    validation_steps=val_steps)
model.save('nucleiproject.h5')

Training output:

64/64 [==============================] - 69s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 2/10
64/64 [==============================] - 66s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 3/10
64/64 [==============================] - 65s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 4/10
64/64 [==============================] - 63s 982ms/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 5/10
64/64 [==============================] - 64s 997ms/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 6/10
64/64 [==============================] - 63s 979ms/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 7/10
64/64 [==============================] - 67s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 8/10
64/64 [==============================] - 67s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 9/10
64/64 [==============================] - 69s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000
Epoch 10/10
64/64 [==============================] - 75s 1s/step - loss: nan - accuracy: 0.5000 - val_loss: nan - val_accuracy: 0.5000

【Comments】:

  • Why Dropout(1) at the end? Do you know what Dropout(1) actually does in practice?
  • I wanted to reduce overfitting. Besides, without Dropout(1) it raises the error InvalidArgumentError: Incompatible shapes: [16,2] vs. [16,64]. Should I not use it? @desertnaut
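For context on the error quoted in the comment above: categorical cross-entropy compares the label vector and the prediction vector class by class, so two-class one-hot labels (a [16, 2] batch) cannot be paired with a 64-unit output (a [16, 64] batch). A minimal plain-Python sketch of that constraint (the helper below is an illustration, not Keras code):

```python
import math

def categorical_cross_entropy(y_true_row, y_pred_row):
    # Per-sample categorical cross-entropy: the label vector and the
    # prediction vector must have one entry per class each.
    if len(y_true_row) != len(y_pred_row):
        raise ValueError(
            "Incompatible shapes: %d vs. %d" % (len(y_true_row), len(y_pred_row))
        )
    return -sum(t * math.log(p) for t, p in zip(y_true_row, y_pred_row) if t)

one_hot = [1.0, 0.0]           # a two-class label from the generator
dense2_out = [0.9, 0.1]        # softmax output of a Dense(2) head
dense64_out = [1.0 / 64] * 64  # softmax output of the Dense(64) head

loss = categorical_cross_entropy(one_hot, dense2_out)  # fine: shapes match
# categorical_cross_entropy(one_hot, dense64_out) raises ValueError,
# mirroring "Incompatible shapes: [16,2] vs. [16,64]" from the comment.
```

This suggests the head should be Dense(2) (one unit per class) rather than Dense(64); the Dropout(1) only masked the mismatch instead of fixing it.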

Tags: python image-processing deep-learning computer-vision image-segmentation


【Answer 1】:

I think I found the problem.

A Dropout layer randomly zeroes some of the previous layer's outputs to reduce overfitting. For the model to train properly, the dropout rate must always be strictly less than 1, and placing a Dropout as the very last layer is unusual anyway. Dropout(1) drops every activation, so the network's output degenerates, which is consistent with the loss: nan in your log.

So try removing that final Dropout layer.

Hope this helps.
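To see why rate = 1 specifically produces NaN: Keras applies "inverted" dropout, which rescales the surviving activations by 1/(1 - rate), and at rate = 1 nothing survives while the compensating scale is 1/0. A minimal plain-Python sketch of that behaviour (an illustration, not TensorFlow's actual implementation):

```python
import math
import random

def inverted_dropout(values, rate):
    # Inverted dropout: each value survives with probability (1 - rate)
    # and survivors are scaled by 1/(1 - rate), so the expected
    # activation is unchanged during training.
    keep = 1.0 - rate
    if keep == 0.0:
        # rate == 1 keeps nothing; the compensating scale 1/(1 - rate)
        # is 1/0, so every downstream value degenerates to NaN.
        return [float("nan") for _ in values]
    return [v / keep if random.random() < keep else 0.0 for v in values]

random.seed(0)
acts = [0.5, 1.0, 2.0, 4.0]
healthy = inverted_dropout(acts, 0.25)  # finite values, some zeroed
broken = inverted_dropout(acts, 1.0)    # all NaN, like the nan loss above
```

Once every activation is NaN, the loss is NaN on every batch, gradients are NaN, and accuracy freezes at the chance level for two balanced classes (0.5000), which matches the training log in the question.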

【Discussion】:
