Title: Keras binary classification: squash output to zero/one
Posted: 2023-03-13 03:29:01
Question:

I have a feed-forward DNN model with several layers that performs binary classification. The output layer is a single sigmoid unit and the loss function is binary_crossentropy. As predictions I expect a vector of zeros and ones. To get that, I round the predictions and flatten (ravel) them, then use the sklearn score functions (f1_score, roc_auc, precision, recall, mcc). The problem is that the resulting prediction vector does not contain the zero/one labels I intended. If I use the mse loss function instead, it works as intended.

=> Model creation function:

    def create_DNN_model(self, verbose=True):
        print("Creating DNN model")
        fundamental_parameters = ['dropout', 'output_activation', 'optimization', 'learning_rate',
                              'units_in_input_layer',
                              'units_in_hidden_layers', 'nb_epoch', 'batch_size']
        for param in fundamental_parameters:
            if self.parameters[param] == None:
                print("Parameter not set: " + param)
                return
        self.print_parameter_values()
        model = Sequential()
        # Input layer
        model.add(Dense(self.parameters['units_in_input_layer'], input_dim=self.feature_number, activation='relu'))
        model.add(BatchNormalization())
        model.add(Dropout(self.parameters['dropout']))
        # constructing all hidden layers
        for layer in self.parameters['units_in_hidden_layers']:
            model.add(Dense(layer, activation='relu'))
            model.add(BatchNormalization())
            model.add(Dropout(self.parameters['dropout']))
        # constructing the final layer
        model.add(Dense(1))
        model.add(Activation(self.parameters['output_activation']))
        if self.parameters['optimization'] == 'SGD':
            optim = SGD()
            optim.lr.set_value(self.parameters['learning_rate'])
        elif self.parameters['optimization'] == 'RMSprop':
            optim = RMSprop()
            optim.lr.set_value(self.parameters['learning_rate'])
        elif self.parameters['optimization'] == 'Adam':
            optim = Adam()
        elif self.parameters['optimization'] == 'Adadelta':
            optim = Adadelta()
        model.add(BatchNormalization())
        model.compile(loss='binary_crossentropy', optimizer=optim, metrics=[matthews_correlation])
        if self.verbose == 1: str(model.summary())
        print("DNN model successfully created")
        return model

=> Evaluation function:

    def evaluate_model(self, X_test, y_test):
        print("Evaluating model with hold out test set.")
        y_pred = self.model.predict(X_test)
        y_pred = [float(np.round(x)) for x in y_pred]
        y_pred = np.ravel(y_pred)
        scores = dict()
        scores['roc_auc'] = roc_auc_score(y_test, y_pred)
        scores['accuracy'] = accuracy_score(y_test, y_pred)
        scores['f1_score'] = f1_score(y_test, y_pred)
        scores['mcc'] = matthews_corrcoef(y_test, y_pred)
        scores['precision'] = precision_score(y_test, y_pred)
        scores['recall'] = recall_score(y_test, y_pred)
        scores['log_loss'] = log_loss(y_test, y_pred)
        for metric, score in scores.items():
            print(metric + ': ' + str(score))
        return scores
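A side note on `evaluate_model`: `roc_auc_score` and `log_loss` are threshold-free metrics, so they are normally given the raw sigmoid probabilities rather than the rounded labels; rounding at 0.5 only makes sense for the label-based metrics (accuracy, f1, precision, recall, mcc). A minimal sketch with made-up data (`y_test` and `y_prob` here are illustrative, not from the model above):

```python
import numpy as np
from sklearn.metrics import accuracy_score, log_loss, roc_auc_score

# Hypothetical true labels and sigmoid outputs (probabilities).
y_test = np.array([0, 1, 1, 0, 1])
y_prob = np.array([0.2, 0.9, 0.6, 0.4, 0.8])

# Threshold only for label-based metrics.
y_pred = (y_prob >= 0.5).astype(int)

acc = accuracy_score(y_test, y_pred)  # hard labels
auc = roc_auc_score(y_test, y_prob)   # ranking metric: use probabilities
ll = log_loss(y_test, y_prob)         # proper scoring rule: use probabilities
```

Passing rounded labels to `roc_auc_score` still runs, but it collapses the ROC curve to a single operating point and usually understates the AUC.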

=> Prediction vector `y_pred`:

[-1. -1.  2. -0.  2. -1. -1. -1.  2. -1. -1.  2. -1.  2. -1.  2. -1. -1.  2. -1.  2. -1. -1.  2. -1.  2.  2.  2. -1. -1.  2.  2.  2.  2. -1. -1. 2.  2.  2. -1.  2.  2. -1.  2. -1. -1. -1.  1. -1. -1. -1.]

Thanks in advance.

Comments:

  • You are using a linear activation (the default) in the output layer, whereas you should use sigmoid. That should help.
  • You are absolutely right. Thank you very much.
  • Glad it worked; I wasn't sure it would be enough. I'll post it as an answer.

Tags: python-3.x machine-learning deep-learning keras theano


Solution 1:

You are using a linear activation (the default) in the output layer, whereas you should use sigmoid.
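The effect is easy to reproduce without Keras: a linear output can be any real number, so rounding it yields arbitrary integers (the -1 and 2 values in the question), whereas a sigmoid squashes every logit into (0, 1) before rounding. A minimal numpy sketch (the `logits` array is made up for illustration):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real-valued logit into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical raw outputs of a final Dense(1) layer (logits).
logits = np.array([-3.2, -0.4, 0.1, 2.7])

linear_rounded = np.round(logits)            # linear activation: any integer
sigmoid_rounded = np.round(sigmoid(logits))  # sigmoid: always 0.0 or 1.0
```

In the posted model that means either passing `activation='sigmoid'` to the final `Dense(1)` or setting `self.parameters['output_activation']` to `'sigmoid'`.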
