[Posted]: 2020-08-16 12:49:14
[Problem description]:
I have a LeNet-300-100 dense neural network for the MNIST dataset, and I want to freeze the first two hidden layers (the ones with 300 and 100 hidden neurons) so that only the output layer is trained. The code I have for this is as follows:
import tensorflow as tf
from tensorflow import keras

# The first two hidden layers (300 and 100 units) live in their own Sequential
# so that they can be frozen as a group
inner_model = keras.Sequential(
    [
        keras.Input(shape=(1024,)),
        keras.layers.Dense(300, activation="relu", kernel_initializer=tf.initializers.GlorotNormal()),
        keras.layers.Dense(100, activation="relu", kernel_initializer=tf.initializers.GlorotNormal()),
    ]
)

# Outer model: the inner model followed by the trainable softmax output layer
model_mnist = keras.Sequential(
    [keras.Input(shape=(1024,)), inner_model, keras.layers.Dense(10, activation="softmax"),]
)

# model_mnist.trainable = True  # the outer model stays trainable
# Freeze the inner model
inner_model.trainable = False

# Sanity check
inner_model.trainable, model_mnist.trainable
# (False, True)

# Compile NN
model_mnist.compile(
    loss=tf.keras.losses.categorical_crossentropy,
    # optimizer='adam',
    optimizer=tf.keras.optimizers.Adam(lr=0.0012),
    metrics=['accuracy'])
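For reference, one quick way to confirm which parameters are supposed to be frozen is to compare the trainable and non-trainable parameter counts in the model summary; a minimal check, assuming the model is built exactly as above:

model_mnist.summary()
# Expected: only the output layer's 100 * 10 + 10 = 1,010 parameters are listed as
# trainable, while the inner model's 337,600 parameters are listed as non-trainable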
However, this code does not seem to freeze the first two hidden layers; their weights are still being updated during training. What am I doing wrong?
Thanks!
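One way to verify whether the inner layers really change is to snapshot their weights before a short training run and compare them afterwards; a minimal sketch, assuming x_train and y_train hold the flattened, one-hot encoded MNIST arrays:

import numpy as np

# Snapshot the (supposedly frozen) inner-model weights before training
weights_before = [w.numpy().copy() for w in inner_model.weights]

# x_train / y_train are assumed to be the prepared MNIST inputs and labels
model_mnist.fit(x_train, y_train, epochs=1, batch_size=128)

# Compare against the weights after one epoch; frozen layers should be unchanged
weights_after = [w.numpy() for w in inner_model.weights]
print([np.allclose(b, a) for b, a in zip(weights_before, weights_after)])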
[Question discussion]:
Tags: neural-network tensorflow2.0