【Question Title】: How to get precision and recall for a Keras model?
【Posted】: 2020-07-17 01:23:23
【Question】:

I want to see the precision and recall of my model for binary image classification, but I can't find out how to do this.

Here is my code:


x = base_model.output

x = tf.keras.layers.GlobalAveragePooling2D()(x)

x = tf.keras.layers.Dense(1024, activation='relu')(x) 
x = tf.keras.layers.Dense(1024, activation='relu')(x)
x = tf.keras.layers.Dense(512, activation='relu')(x)
preds = tf.keras.layers.Dense(2, activation='softmax')(x)

model = tf.keras.Model(inputs = base_model.input, outputs = preds)

for layer in model.layers[:175]:
  layer.trainable = False 

for layer in model.layers[175:]:
  layer.trainable = True  

model.compile(optimizer='Adam', loss='categorical_crossentropy', metrics=['accuracy'])

history = model.fit_generator(generator=train_generator,
                              epochs=20,
                              steps_per_epoch=step_size_train,
                              validation_data = test_generator,
                              validation_steps=step_size_test)

【Question Discussion】:

    Tags: python keras deep-learning


    【Solution 1】:

    If you want precision and recall during training, you can add the precision and recall metrics to the metrics list when compiling the model, like this:

    model.compile(optimizer='Adam', loss='categorical_crossentropy',
                  metrics=['accuracy', 
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    

    Example:

    import numpy as np
    import tensorflow as tf

    inputs = tf.keras.layers.Input(8)
    x = tf.keras.layers.Dense(4, activation='relu')(inputs)
    output = tf.keras.layers.Dense(2, activation='softmax')(x)

    model = tf.keras.Model(inputs=inputs, outputs=output)
    model.compile(optimizer='Adam', loss='categorical_crossentropy',
                  metrics=['accuracy',
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])

    # Random dummy data just to demonstrate that the metrics are reported
    X = np.random.randn(100, 8)
    y = np.random.randint(0, 2, (100, 2))

    model.fit(X, y, epochs=10)
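
    For reference (not part of the original answer), the same metric objects can also be used standalone to compute precision and recall on a set of predictions after training; a minimal sketch with made-up labels:

    ```python
    import numpy as np
    import tensorflow as tf

    # Hypothetical ground-truth labels and model predictions (0/1)
    y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])
    y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

    # Each metric accumulates state via update_state() and reports via result()
    precision = tf.keras.metrics.Precision()
    precision.update_state(y_true, y_pred)

    recall = tf.keras.metrics.Recall()
    recall.update_state(y_true, y_pred)

    print(precision.result().numpy())  # TP / (TP + FP) -> 0.75
    print(recall.result().numpy())     # TP / (TP + FN) -> 0.75
    ```

    By default both metrics threshold predictions at 0.5, so you can also pass a column of softmax probabilities instead of hard 0/1 labels.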
    

    【Discussion】:
