[Posted]: 2023-03-13 04:15:01
[Problem description]:
Suppose I have a model:

from tensorflow.keras.applications import DenseNet201
from tensorflow.keras.layers import Input, Dense, BatchNormalization, ReLU
from tensorflow.keras.models import Sequential

base_model = DenseNet201(input_tensor=Input(shape=basic_shape))
model = Sequential()
model.add(base_model)
model.add(Dense(400))
model.add(BatchNormalization())
model.add(ReLU())
model.add(Dense(50, activation='softmax'))
model.save('test.hdf5')
Then I load the saved model and try to make the last 40 layers of DenseNet201 trainable and the first 161 layers non-trainable:
from tensorflow.keras.models import load_model

saved_model = load_model('test.hdf5')
saved_model.trainable = False
cnt = 44
while cnt > 0:
    saved_model.layers[-cnt].trainable = True
    cnt -= 1
But this doesn't actually work, because the whole DenseNet201 is treated as a single layer, and I just get an index-out-of-range error. The model summary shows the problem:
Layer (type)                 Output Shape              Param #
=================================================================
densenet201 (Functional)     (None, 1000)              20242984
_________________________________________________________________
dense (Dense)                (None, 400)               400400
_________________________________________________________________
batch_normalization (BatchNo (None, 400)               1600
_________________________________________________________________
re_lu (ReLU)                 (None, 400)               0
_________________________________________________________________
dense_1 (Dense)              (None, 50)                20050
=================================================================
Total params: 20,665,034
Trainable params: 4,490,090
Non-trainable params: 16,174,944
The question is: how can I actually make the first 161 layers of the DenseNet non-trainable and the last 40 layers trainable on the loaded model?
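One possible approach, as a sketch: since the summary shows DenseNet201 as a single Functional layer, `saved_model.layers` has only 5 top-level entries, and the inner DenseNet layers must be reached through that layer's own `.layers` attribute. The helper below assumes the backbone is `saved_model.layers[0]`; the function name `set_backbone_trainable` is hypothetical, and note that in Keras the outer layer's `trainable` flag must stay `True`, otherwise the per-layer flags inside it are overridden:

```python
def set_backbone_trainable(model, n_trainable):
    """Freeze every inner layer of the nested backbone (model.layers[0])
    except the last `n_trainable`, leaving the head layers untouched."""
    backbone = model.layers[0]   # assumed: the DenseNet201 sub-model
    backbone.trainable = True    # keep True so inner per-layer flags apply
    for layer in backbone.layers[:-n_trainable]:
        layer.trainable = False  # freeze the early layers
    for layer in backbone.layers[-n_trainable:]:
        layer.trainable = True   # fine-tune only the last n_trainable
    return model
```

After calling this on the loaded model (e.g. `set_backbone_trainable(saved_model, 40)`), the model would need to be recompiled for the trainability change to take effect during training.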
[Discussion]:
Tags: python machine-learning keras conv-neural-network keras-layer