【Question Title】: Can I get the output of all Keras layers?
【Posted】: 2020-03-08 00:26:57
【Question Description】:

I have just started with deep learning and I want to get the input/output of every layer in real time. I am using Google Colab with TensorFlow 2 and Python 3. I tried to get the layers like this, but for some reason I don't understand it doesn't work. Any help would be appreciated.

# Here are imports 

from __future__ import absolute_import, division, print_function, unicode_literals

try:
  # %tensorflow_version only exists in Colab.
  %tensorflow_version 2.x
except Exception:
  pass
import tensorflow as tf

from tensorflow.keras import datasets, layers, models
import matplotlib.pyplot as plt
import numpy as np


from tensorflow.keras import backend as K



# I am using CIFAR10 dataset

(train_images, train_labels), (test_images, test_labels) = datasets.cifar10.load_data()

# Normalize pixel values to be between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0

# Here is the model

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))

# Compilation of the model 

model.compile(optimizer='adam',
          loss='sparse_categorical_crossentropy',
          metrics=['accuracy'])

history = model.fit(train_images, train_labels, epochs=10, 
                validation_data=(test_images, test_labels))
# Based on https://stackoverflow.com/questions/41711190/keras-how-to-get-the-output-of-each-layer

# I tried this 

tf.compat.v1.disable_eager_execution()
inp = model.input                                    # input placeholder
outputs = [layer.output for layer in model.layers]     # all layer outputs
functors = [K.function([inp, K.learning_phase()], [out]) for out in outputs]    # evaluation functions

# Testing
test = np.random.random(model.input_shape[1:])[np.newaxis, ...]
layer_outs = [func([test, 1.]) for func in functors]
print(layer_outs)


# The error appears at this line
functors = [K.function([inp, K.learning_phase()], [out]) for out in outputs]


# I got this error message
Tensor Tensor("conv2d/Identity:0", shape=(None, 30, 30, 32), dtype=float32) is not an element of this graph.

【Question Discussion】:

    Tags: tensorflow keras deep-learning google-colaboratory keras-layer


    【Solution 1】:

    This error is basically telling you that you are trying to change the graph after it has been compiled. When you call compile, TF statically defines all the operations. You have to move the code snippet where you define functors above the compile call. Just swap the last lines for these:

    tf.compat.v1.disable_eager_execution()
    inp = model.input                                    # input placeholder
    outputs = [layer.output for layer in model.layers]     # all layer outputs
    functors = [K.function([inp, K.learning_phase()], [out]) for out in outputs]    # evaluation functions
    
    
    
    model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
    
    history = model.fit(train_images, train_labels, epochs=1, 
                    validation_data=(test_images, test_labels))
    
    #Testing
    input_shape = [1] + list(model.input_shape[1:])
    test = np.random.random(input_shape)
    layer_outs = [func([test, 1.]) for func in functors]
    print(layer_outs)
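As an aside: since TF2 runs eagerly by default, an arguably simpler alternative to `K.function` (and one that avoids `disable_eager_execution()` entirely) is to build a second `tf.keras.Model` that shares the trained layers but exposes every intermediate output. The sketch below uses a smaller stand-in for the question's architecture; the extractor pattern is the same:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Same kind of architecture as the question, written with the functional API
# so every layer's output tensor is easy to reference.
inp = layers.Input(shape=(32, 32, 3))
x = layers.Conv2D(32, (3, 3), activation='relu')(inp)
x = layers.MaxPooling2D((2, 2))(x)
x = layers.Flatten()(x)
out = layers.Dense(10, activation='softmax')(x)
model = tf.keras.Model(inp, out)

# A second model sharing the same weights that returns every intermediate
# output (model.layers[0] is the InputLayer, so it is skipped).
extractor = tf.keras.Model(inputs=model.input,
                           outputs=[layer.output for layer in model.layers[1:]])

test = np.random.random((1, 32, 32, 3)).astype('float32')
activations = extractor(test)  # list: one tensor per layer, computed eagerly
for layer, act in zip(model.layers[1:], activations):
    print(layer.name, tuple(act.shape))
```

Because `extractor` shares weights with `model`, you can call it at any time (including after more training) and it will reflect the current state of the network.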
    

    【Discussion】:

    • I did what you said and I got the same error: Tensor Tensor("conv2d/Identity:0", shape=(None, 30, 30, 32), dtype=float32) is not an element of this graph.
    • But I have a related question: is it possible to get the layer outputs during training?
    • Yes, you should use a callback for that. keras.io/callbacks
    • I tried to do it, but it just doesn't click in my head. I created a custom callback and tried to display the layer outputs at the end of each batch, but I still can't figure out how to make it work. Do you have any suggestions?
    • Please create a new thread with the code showing your custom callback. You can link me and I'll try to help. However, I'd suggest saving the data as numpy arrays and visualizing them after training.
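The callback idea discussed above can be sketched as follows. This is an illustrative sketch, not code from the thread: the class name `ActivationLogger` and the probe-batch approach are assumptions. It captures the activations of a fixed probe batch at the end of each epoch and stores them as numpy arrays, which matches the answerer's suggestion to save and visualize after training:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class ActivationLogger(tf.keras.callbacks.Callback):
    """Record every layer's output on a fixed probe batch after each epoch."""
    def __init__(self, probe_batch):
        super().__init__()
        self.probe_batch = probe_batch
        self.history = []  # one list of numpy arrays per epoch

    def on_epoch_end(self, epoch, logs=None):
        # Build an extractor over the current (already-trained) weights.
        extractor = tf.keras.Model(self.model.input,
                                   [l.output for l in self.model.layers])
        acts = extractor(self.probe_batch)
        self.history.append([np.asarray(a) for a in acts])

# Tiny demo model and random data, just to show the callback wiring.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    layers.Dense(8, activation='relu'),
    layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

x = np.random.random((32, 4)).astype('float32')
y = np.random.randint(0, 2, size=(32,))
logger = ActivationLogger(x[:5])
model.fit(x, y, epochs=2, verbose=0, callbacks=[logger])
print(len(logger.history), [a.shape for a in logger.history[0]])
```

For per-batch snapshots, the same body can be moved to `on_train_batch_end`, though that slows training considerably; logging per epoch is usually enough for visualization.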