【Question Title】: Trying to get output of each layer while predicting test image, however getting an error Input tensors to a Functional must come from `tf.keras.Input`
【Posted】: 2022-04-19 13:31:55
【Question Description】:

I am trying to do image recognition in Python using TensorFlow and Keras. Please see the code via the link below, since I ran into another issue with the same code that has since been fixed:

getting error while predicting a test image - cannot reshape array of size

Following the post Keras, How to get the output of each layer?, I used the code below to get the output of each layer:

import numpy as np
from keras import backend as K

inp = model.input                                         # input placeholder
outputs = [layer.output for layer in model.layers[:12]]   # outputs of the first 12 layers
functor = K.function([inp, K.learning_phase()], outputs)  # evaluation function

# Testing
test = np.random.random(input_shape)[np.newaxis, ...]
layer_outs = functor([test, 1.])
print (layer_outs)

However, I get the following error:

ValueError: Input tensors to a Functional must come from `tf.keras.Input`. Received: 0 (missing previous layer metadata).

Can someone help me get the output of each layer for the image I am predicting, which is a new image and not part of the images the network was trained on?

【Comments】:

    Tags: python tensorflow image-processing keras deep-learning


    【Solution 1】:

    If you define the model as follows (with an explicit `tf.keras.Input` as the first layer), you will not get that particular error:

    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import (Conv2D, MaxPooling2D, Dropout,
                                         BatchNormalization, Flatten, Dense)
    from tensorflow.keras.constraints import max_norm as maxnorm

    model = Sequential([
                        tf.keras.Input(shape=(X_train.shape[1:])),
                        Conv2D(32, (3, 3), padding='same', activation='relu'),
                        Conv2D(32, (3, 3),  activation='relu', padding='same'),
                        Dropout(0.2),
                        BatchNormalization(),
                        Conv2D(64, (3, 3), padding='same', name='test1', activation='relu'),
                        MaxPooling2D(pool_size=(2, 2)),
                        Dropout(0.2),
                        BatchNormalization(),
                        Conv2D(64, (3, 3), padding='same', name='test2', activation='relu'),
                        MaxPooling2D(pool_size=(2, 2)),
                        Dropout(0.2),
                        BatchNormalization(),
                        Conv2D(128, (3, 3), padding='same', name='test3',activation='relu'),
                        Dropout(0.2),
                        BatchNormalization(),
                        Flatten(),
                        Dropout(0.2),
                        Dense(256, kernel_constraint=maxnorm(3),activation='relu'),
                        Dropout(0.2),
                        BatchNormalization(),
                        Dense(128, kernel_constraint=maxnorm(3),activation='relu'),
                        Dropout(0.2),
                        BatchNormalization(),
                        Dense(class_num,activation='softmax')
    ])
    

    Now, to get the output value of each layer defined in the model, check:

    from tensorflow.keras import backend as K
    
    for index, layer in enumerate(model.layers):
        func = K.function([model.get_layer(index=0).input], layer.output)
        layerOutput = func([X_test]) 
        print(layerOutput.shape) # to check each layer output, remove .shape
    

    Output:

    (10000, 28, 28, 32)
    (10000, 28, 28, 32)
    (10000, 28, 28, 32)
    (10000, 28, 28, 32)
    (10000, 28, 28, 64)
    (10000, 14, 14, 64)
    (10000, 14, 14, 64)
    (10000, 14, 14, 64)
    (10000, 14, 14, 64)
    (10000, 7, 7, 64)
    (10000, 7, 7, 64)
    (10000, 7, 7, 64)
    (10000, 7, 7, 128)
    (10000, 7, 7, 128)
    (10000, 7, 7, 128)
    (10000, 6272)
    (10000, 6272)
    (10000, 256)
    (10000, 256)
    (10000, 256)
    (10000, 128)
    (10000, 128)
    (10000, 128)
    (10000, 10)
    
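    As an alternative to calling `K.function` once per layer, you can build a single functional `Model` whose outputs are all of the layers' outputs and call it once. This is a minimal, self-contained sketch assuming TF 2.x; the small network below is illustrative only, not the model above:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# A tiny illustrative model; in practice use your trained model instead.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(8, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

# One functional model that returns every intermediate activation.
extractor = keras.Model(
    inputs=model.inputs,
    outputs=[layer.output for layer in model.layers],
)

x = np.random.random((4, 28, 28, 1)).astype("float32")
activations = extractor(x)  # one forward pass, all layer outputs
for layer, act in zip(model.layers, activations):
    print(layer.name, act.shape)
```

    Because all activations come from one forward pass, this is also cheaper than evaluating a separate function per layer.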

    To extract the output values of one specific layer, you can use the following code:

    from tensorflow import keras

    feature_extractor = keras.Model(
        inputs=model.inputs,
        outputs=model.get_layer(name="test1").output)
    
    features = feature_extractor(X_test)
    features.shape
    features
    

    Output:

    TensorShape([10000, 28, 28, 64])
    
    <tf.Tensor: shape=(10000, 28, 28, 64), dtype=float32, numpy=
    array([[[[0.00000000e+00, 0.00000000e+00, 0.00000000e+00, ...,
              0.00000000e+00, 0.00000000e+00, 0.00000000e+00],
             [0.00000000e+00, 0.00000000e+00, 0.00000000e+00, ..
    .
    .
    .
    
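    To run such an extractor on a single new image (the original question), add a batch axis with `np.expand_dims` before calling it. A minimal sketch; the untrained toy model and the random "image" below merely stand in for the real trained model and a preprocessed input of matching shape:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Stand-in for the trained model; in practice load or reuse your own.
model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(16, 3, padding="same", activation="relu", name="test1"),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

feature_extractor = keras.Model(
    inputs=model.inputs,
    outputs=model.get_layer(name="test1").output,
)

# A single new image: shape (32, 32, 3) -> add batch axis -> (1, 32, 32, 3)
new_image = np.random.random((32, 32, 3)).astype("float32")
batch = np.expand_dims(new_image, axis=0)
features = feature_extractor(batch)
print(features.shape)  # (1, 32, 32, 16)
```

    The same `expand_dims` step applies to any of the approaches above: Keras models always expect a leading batch dimension, even for one image.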

    【Discussion】:
