【Posted】: 2020-01-26 15:19:49
【Question】:
I am learning to use TensorBoard with TensorFlow 2.0.
In particular, I want to monitor the learning curves in real time and to visually inspect and communicate the architecture of my model.
Below is the code for a reproducible example.
I have three questions:

1. Although I get the learning curves once training has finished, I do not know what I should do to monitor them in real time.
2. The learning curves I get from TensorBoard do not agree with the plot of history.history. In fact, the curves appear inverted, which is strange and hard to interpret.
3. I cannot make sense of the graph. I trained a sequential model with 5 dense layers and dropout layers in between, but what TensorBoard shows me contains many more elements.
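On the first point, what I mean by real-time monitoring: as far as I understand, TensorBoard has to be started as a separate process pointed at the log directory, so that the SCALARS tab can refresh while fit() is still running. A minimal sketch of what I have in mind (launch_tensorboard is a hypothetical helper of my own; it assumes the tensorboard executable is on PATH):

```python
import subprocess

def tensorboard_cmd(logdir, port=6006):
    """Build the TensorBoard CLI invocation as an argument list."""
    return ["tensorboard", "--logdir", logdir, "--port", str(port)]

def launch_tensorboard(logdir, port=6006):
    """Start TensorBoard in the background; the UI is then at http://localhost:<port>."""
    return subprocess.Popen(tensorboard_cmd(logdir, port))
```

The idea would be to call launch_tensorboard(logdir) before model.fit() and keep the browser tab open during training.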
My code follows:
from datetime import datetime

import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape = (train_data.shape[1], ))
x1 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x4a)
predictions = Dense(1)(x5)
model = Model(inputs = inputs, outputs = predictions)
model.compile(optimizer = 'Adam', loss = 'mse')
logdir="logs\\fit\\" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)
history = model.fit(train_data, train_targets,
batch_size= 32,
epochs= 20,
validation_data=(test_data, test_targets),
shuffle=True,
callbacks=[tensorboard_callback ])
plt.plot(history.history['loss'], label = 'loss')
plt.plot(history.history['val_loss'], label = 'val_loss')
plt.legend()
plt.show()
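A side note on the log path: "logs\\fit\\" is Windows-specific, and building the directory with os.path.join keeps each run's subfolder portable across operating systems. A minimal sketch, where make_logdir is a hypothetical helper and not part of the code above:

```python
import os
from datetime import datetime

def make_logdir(root="logs", stamp=None):
    """Return a unique, OS-portable log directory for one training run."""
    stamp = stamp or datetime.now().strftime("%Y%m%d-%H%M%S")
    return os.path.join(root, "fit", stamp)
```

The returned path can then be passed directly as log_dir to the TensorBoard callback.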
【Discussion】:
Tags: python-3.x tensorflow tensorboard