【Question Title】: How to get the output of the fully connected layer from a CNN in TensorFlow?
【Posted】: 2020-07-22 17:01:42
【Question Description】:

I want to extract the CNN activations from the fully connected layer of a convolutional neural network using TensorFlow. In the following post, a user asked this very question:

How to extract activation from CNN layers using tensorflow?

The answer there was:

sess = tf.InteractiveSession()

fully_connected = ....
value_of_fully_connected = sess.run(fully_connected, feed_dict={your_placeholders_and_values})

However, in my code the fully connected layer is separate from the tf.Session(). Here is the function that builds the convolutional network:

def conv_net(x, weights, biases):  

    # Three convolution + max-pooling blocks (each pooling halves the spatial size)
    conv1 = conv2d(x, weights['wc1'], biases['bc1'])
    conv1 = maxpool2d(conv1, k=2)

    conv2 = conv2d(conv1, weights['wc2'], biases['bc2'])
    conv2 = maxpool2d(conv2, k=2)

    conv3 = conv2d(conv2, weights['wc3'], biases['bc3'])
    conv3 = maxpool2d(conv3, k=2)


    # Fully connected layer
    fc1 = tf.reshape(conv3, [-1, weights['wd1'].get_shape().as_list()[0]])
    fc1 = tf.add(tf.matmul(fc1, weights['wd1']), biases['bd1'])
    fc1 = tf.nn.relu(fc1)

    out = tf.add(tf.matmul(fc1, weights['out']), biases['out'])
    return out
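In NumPy terms, the fully connected layer above just flattens the last conv output and computes relu(x @ W + b). A minimal sketch of that computation (the shapes below are made up for illustration and are not the author's actual dimensions):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical batch of 2 feature maps, each 4*4*8 = 128 values after conv3
conv3 = np.random.rand(2, 4, 4, 8)
wd1 = np.random.rand(128, 64)   # stands in for weights['wd1']
bd1 = np.random.rand(64)        # stands in for biases['bd1']

fc1_in = conv3.reshape(-1, wd1.shape[0])  # same role as the tf.reshape above
fc1 = relu(fc1_in @ wd1 + bd1)            # tf.matmul + tf.add + tf.nn.relu
print(fc1.shape)  # (2, 64)
```

These `fc1` values are exactly the activations the question wants to pull out of the TensorFlow graph.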

Then the prediction:

pred = conv_net(x, weights, biases)

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
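For reference, `softmax_cross_entropy_with_logits` computes, per example, `-sum(labels * log_softmax(logits))`. A NumPy sketch of the same quantity (toy numbers, not the author's data):

```python
import numpy as np

def softmax_xent(logits, labels):
    # Numerically stable log-softmax, then cross-entropy against one-hot labels
    z = logits - logits.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -(labels * log_softmax).sum(axis=1)

logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([[1.0, 0.0, 0.0]])  # one-hot target
print(softmax_xent(logits, labels).mean())  # ≈ 0.417
```

The `tf.reduce_mean` in the cost above corresponds to the final `.mean()` over the batch.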

And here is the training:

with tf.Session() as sess:
    sess.run(init) 
    train_loss = []
    test_loss = []
    train_accuracy = []
    test_accuracy = []
    summary_writer = tf.summary.FileWriter('./Output', sess.graph)
    for i in range(training_iters):
        for batch in range(len(train_X)//batch_size):
            batch_x = train_X[batch*batch_size:min((batch+1)*batch_size,len(train_X))]
            batch_y = train_y[batch*batch_size:min((batch+1)*batch_size,len(train_y))]    
            # Run optimization op (backprop)
            opt = sess.run(optimizer, feed_dict={x: batch_x,
                                                 y: batch_y})
            # Calculate batch loss and accuracy
            loss, acc = sess.run([cost, accuracy], feed_dict={x: batch_x,
                                                              y: batch_y})
        print("Iter " + str(i) + ", Loss= " + \
                      "{:.6f}".format(loss) + ", Training Accuracy= " + \
                      "{:.5f}".format(acc))
        print("Optimization Finished!")

        # Calculate accuracy for all 10000 mnist test images
        test_acc,valid_loss = sess.run([accuracy,cost], feed_dict={x: test_X,y : test_y})
        train_loss.append(loss)
        test_loss.append(valid_loss)
        train_accuracy.append(acc)
        test_accuracy.append(test_acc)
        print("Testing Accuracy:","{:.5f}".format(test_acc))
    summary_writer.close()

As you can see, the fully connected layer lives inside the function conv_net(), and I can't seem to access it from inside the tf.Session() code.

I need access to that fully connected layer so that I can use the answer from the post above. How can I do this?

【Question Discussion】:

    Tags: python tensorflow conv-neural-network


    【Solution 1】:

    In Python, a function can return a list of outputs. So I would do something like this:

    def conv_net(x, weights, biases):  
    
        conv1 = conv2d(x, weights['wc1'], biases['bc1'])
        conv1 = maxpool2d(conv1, k=2)
    
        conv2 = conv2d(conv1, weights['wc2'], biases['bc2'])
        conv2 = maxpool2d(conv2, k=2)
    
        conv3 = conv2d(conv2, weights['wc3'], biases['bc3'])
        conv3 = maxpool2d(conv3, k=2)
    
    
        # Fully connected layer
        fc1 = tf.reshape(conv3, [-1, weights['wd1'].get_shape().as_list()[0]])
        fc1 = tf.add(tf.matmul(fc1, weights['wd1']), biases['bd1'])
        fc1 = tf.nn.relu(fc1)
    
        out = tf.add(tf.matmul(fc1, weights['out']), biases['out'])
        return [out,fc1]
    

    When you want to get the outputs, you can do:

    pred, fcn = conv_net(x, weights, biases)
    

    If you want to inspect the result inside a session, do the following (note that fcn depends on the placeholder x, so it needs a feed_dict):

    fcn_evaluated = sess.run(fcn, feed_dict={x: batch_x})
    print(fcn_evaluated)
    

    【Discussion】:
