【Question Title】: Keras model output information/log level
【Posted】: 2017-09-28 18:23:48
【Question Description】:

I am building a neural network model with Keras:

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers, optimizers

model_keras = Sequential()
model_keras.add(Dense(4, input_dim=input_num, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model_keras.add(Dense(1, activation='linear', kernel_regularizer=regularizers.l2(0.01)))
sgd = optimizers.SGD(lr=0.01, clipnorm=0.5)
model_keras.compile(loss='mean_squared_error', optimizer=sgd)
model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100)

The output looks like the following. I was wondering whether it is possible to print the loss only every 10 epochs, say, instead of every epoch? Thanks!

Epoch 1/200
20/20 [==============================] - 0s - loss: 0.2661
Epoch 2/200
20/20 [==============================] - 0s - loss: 0.2625
Epoch 3/200
20/20 [==============================] - 0s - loss: 0.2590
Epoch 4/200
20/20 [==============================] - 0s - loss: 0.2556
Epoch 5/200
20/20 [==============================] - 0s - loss: 0.2523
Epoch 6/200
20/20 [==============================] - 0s - loss: 0.2490
Epoch 7/200
20/20 [==============================] - 0s - loss: 0.2458
Epoch 8/200
20/20 [==============================] - 0s - loss: 0.2427
Epoch 9/200
20/20 [==============================] - 0s - loss: 0.2397
Epoch 10/200
20/20 [==============================] - 0s - loss: 0.2367
Epoch 11/200
20/20 [==============================] - 0s - loss: 0.2338
Epoch 12/200
20/20 [==============================] - 0s - loss: 0.2309
Epoch 13/200
20/20 [==============================] - 0s - loss: 0.2281
Epoch 14/200
20/20 [==============================] - 0s - loss: 0.2254
Epoch 15/200
20/20 [==============================] - 0s - loss: 0.2228
   :

【Question Comments】:

    Tags: python-3.x keras


    【Solution 1】:

    It is not possible to reduce the frequency of logging to stdout; however, passing the verbose=0 argument to the fit() method turns logging off completely.

    Since the loop over epochs is not exposed in Keras' Sequential model, one way to collect scalar summaries at a custom frequency is to use Keras callbacks. In particular, you can use the TensorBoard callback (assuming you are running with the tensorflow backend) or the CSVLogger callback (any backend) to collect any scalar variable summaries (the training loss, in your case):

    from keras.models import Sequential
    from keras.layers import Dense
    from keras import regularizers, optimizers
    from keras.callbacks import TensorBoard

    model_keras = Sequential()
    model_keras.add(Dense(4, input_dim=input_num, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
    model_keras.add(Dense(1, activation='linear', kernel_regularizer=regularizers.l2(0.01)))
    sgd = optimizers.SGD(lr=0.01, clipnorm=0.5)
    model_keras.compile(loss='mean_squared_error', optimizer=sgd)

    # TensorBoard callback; histogram_freq=10 computes histogram summaries every 10 epochs
    TB = TensorBoard(histogram_freq=10, batch_size=20)

    model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100, callbacks=[TB])
    

    Setting histogram_freq=10 will save the loss every 10 epochs.

    EDIT: passing validation_data=(...) to the fit method will also allow you to check validation-level metrics.
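
    For the CSVLogger alternative mentioned above, a minimal sketch could look like the following (the file name training_log.csv is just an illustrative choice; CSVLogger writes one row of metrics per epoch, which you can later subsample to every 10th epoch when plotting):

    from keras.callbacks import CSVLogger

    # Write one row per epoch (epoch, loss, val_loss, ...) to a CSV file
    csv_logger = CSVLogger('training_log.csv', append=False)

    model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100,
                    callbacks=[csv_logger], verbose=0)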

    【Comments】:

    • I tried the TensorBoard callback with histogram_freq=10, but it still prints the loss for every epoch... is there any parameter I need to fix?
    • TensorBoard does not change how often things are printed on screen, but it produces a file with the summaries (i.e. the loss every 10 epochs) that can be viewed in the browser, or accessed from a script with tensorboard.backend.event_processing.event_accumulator and then plotted (see the sketch after this list).
    • @Edamame To stop the loss being printed every epoch, change the last line to model_keras.fit(X_norm_train, y_norm_train, batch_size=20, epochs=100, verbose=0); the only difference is the inclusion of verbose=0.
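
    As a rough sketch of reading the summaries back in a script, as mentioned in the comment above (the log directory ./logs is an assumed path, and the exact scalar tag names depend on the Keras/TensorFlow version):

    from tensorboard.backend.event_processing import event_accumulator

    # Load the event file(s) written by the TensorBoard callback
    ea = event_accumulator.EventAccumulator('./logs')
    ea.Reload()

    print(ea.Tags()['scalars'])        # available scalar tags, e.g. 'loss'
    for event in ea.Scalars('loss'):   # each entry carries .step and .value
        print(event.step, event.value)
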
    【Solution 2】:

    Create a Keras callback to reduce the number of log lines. By default, Keras logs once per epoch. The code below prints only 10 log lines, regardless of the number of epochs.

    import tensorflow as tf

    # Custom callback: print the loss only every `Lafte` epochs instead of every epoch
    class callback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs=None):
            loss = logs["loss"]

            if epoch % Lafte == Lafte - 1:                # log only after a number of epochs
                print(f"Average batch loss: {loss:.9f}")
            if epoch == Epochs - 1:
                print(f"Fin-avg batch loss: {loss:.9f}")  # final average

    Model = model()    # build the model (placeholder)
    Model.compile(...)

    Dsize  = ...              # number of samples in the training data
    Bsize  = ...              # number of samples to process in one batch
    Steps  = 1000             # number of batches to use for training
    Epochs = round(Steps / (Dsize / Bsize))
    Lafte  = round(Epochs / 10)   # log 10 times only, regardless of the number of epochs
    if Lafte == 0:
        Lafte = 1                 # avoid modulo by zero in on_epoch_end

    Model.fit(Data, epochs=Epochs, steps_per_epoch=round(Dsize / Bsize),
              callbacks=[callback()], verbose=0)
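
    As a concrete (hypothetical) illustration of the arithmetic above: with Dsize = 1000 and Bsize = 20, each epoch covers Dsize/Bsize = 50 batches, so Steps = 1000 gives Epochs = 20 and Lafte = 2. The callback then prints an "Average batch loss" line on every second epoch, i.e. 10 lines in total, plus the "Fin-avg batch loss" line on the last epoch.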
    

    【Comments】:
