[Question Title]: TensorFlow signature output placeholder
[Posted]: 2019-01-23 22:30:22
[Question]:

I am trying to export a TensorFlow model so that it can be used with TensorFlow Serving. This is the script I use:

import os
import tensorflow as tf

trained_checkpoint_prefix = '/home/ubuntu/checkpoint'
export_dir = os.path.join('m', '0')

loaded_graph = tf.Graph()
config=tf.ConfigProto(allow_soft_placement=True)
with tf.Session(graph=loaded_graph, config=config) as sess:
    # Restore from checkpoint
    loader = tf.train.import_meta_graph(trained_checkpoint_prefix + 'file.meta')
    loader.restore(sess, tf.train.latest_checkpoint(trained_checkpoint_prefix))

    # Create SavedModelBuilder class
    # defines where the model will be exported
    export_path_base = "/home/ubuntu/m"
    export_path = os.path.join(
        tf.compat.as_bytes(export_path_base),
        tf.compat.as_bytes(str(0)))
    print('Exporting trained model to', export_path)
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    batch_shape = (20, 256, 256, 3)
    input_tensor = tf.placeholder(tf.float32, shape=batch_shape, name="X_content")
    predictions_tf = tf.placeholder(tf.float32, shape=batch_shape, name='Y_output')

    tensor_info_input = tf.saved_model.utils.build_tensor_info(input_tensor)
    tensor_info_output = tf.saved_model.utils.build_tensor_info(predictions_tf)

    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'image': tensor_info_input},
            outputs={'output': tensor_info_output},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            'style_image':
                prediction_signature,
        })

    builder.save(as_text=True)

The main problem is the output of the signature (predictions_tf). In this case, with it set as a placeholder, I get an error saying its value must be set when the model is called over gRPC. What should I use instead?

I have tried

predictions_tf = tf.Variable(0, dtype=tf.float32, name="Y_output")

predictions_tf = tf.TensorInfo(dtype=tf.float32)
predictions_tf.name = "Y_output"
predictions_tf.dtype = tf.float32
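(For context on why these attempts fail: a placeholder only ever gets a value when the caller feeds one, so it cannot act as a computed output in a serving signature. A minimal sketch with toy shapes and made-up tensor names, written against the compat.v1 API so it also runs under TF 2.x, not the actual model from the question:)

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

g = tf.Graph()
with g.as_default():
    x = tf.placeholder(tf.float32, shape=(None, 3), name="X_content")
    # A real output: an op whose value can be computed from the input.
    y = tf.identity(x * 2.0, name="Y_output")
    # A bare placeholder "output" has no value of its own.
    bad = tf.placeholder(tf.float32, shape=(None, 3), name="bad_output")

with tf.Session(graph=g) as sess:
    # y is computable from the fed input ...
    result = sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]})
    print(result)  # [[2. 4. 6.]]
    # ... but the placeholder is not, which is exactly the gRPC error:
    # "You must feed a value for placeholder tensor ..."
    try:
        sess.run(bad, feed_dict={x: [[1.0, 2.0, 3.0]]})
    except tf.errors.InvalidArgumentError:
        unfed = True
        print("bad_output must be fed explicitly")
```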

[Question Comments]:

    Tags: python tensorflow tensorflow-serving


    [Solution 1]:

    I may have misunderstood what you are trying to do, but here you are basically creating a new placeholder for the input and a new placeholder for the output.

    What I think you should do instead is: once you have loaded the model, fetch the model's actual input and output tensors from the loaded graph into the variables input_tensor and prediction_tf, using for example

    input_tensor=loaded_graph.get_tensor_by_name('the_name_in_the_loaded_graph:0')
    prediction_tf=loaded_graph.get_tensor_by_name('the_pred_name_in_the_loaded_graph:0')
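    As a fuller sketch of that fix: the first half below just fabricates a checkpoint to stand in for the asker's training run, so the snippet is self-contained; the second half restores it and looks up the existing tensors by name instead of creating new placeholders. The tensor names X_content:0 / Y_output:0 and the signature keys are assumptions carried over from the question (inspect loaded_graph.get_operations() if unsure of the real names), and compat.v1 is used so it also runs under TF 2.x.

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

workdir = tempfile.mkdtemp()
ckpt_prefix = os.path.join(workdir, "model")

# --- Stand-in for the asker's training run: build and checkpoint a graph
# whose output tensor is computed from the input tensor.
train_graph = tf.Graph()
with train_graph.as_default():
    x = tf.placeholder(tf.float32, shape=(None, 4), name="X_content")
    w = tf.Variable(tf.ones([4, 2]), name="w")
    y = tf.identity(tf.matmul(x, w), name="Y_output")
    saver = tf.train.Saver()
    with tf.Session(graph=train_graph) as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt_prefix)

# --- The fix: restore the graph, then *look up* the existing tensors
# instead of creating fresh placeholders for them.
export_path = os.path.join(workdir, "export", "0")
loaded_graph = tf.Graph()
with tf.Session(graph=loaded_graph) as sess:
    loader = tf.train.import_meta_graph(ckpt_prefix + ".meta")
    loader.restore(sess, ckpt_prefix)

    input_tensor = loaded_graph.get_tensor_by_name("X_content:0")
    predictions_tf = loaded_graph.get_tensor_by_name("Y_output:0")

    # Sanity check: the looked-up output is computable from the input.
    # w is all ones, so each output element is the sum of the inputs.
    demo = sess.run(predictions_tf,
                    feed_dict={input_tensor: [[1.0, 1.0, 1.0, 1.0]]})
    print(demo)  # [[4. 4.]]

    builder = tf.saved_model.builder.SavedModelBuilder(export_path)
    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs={"image": tf.saved_model.utils.build_tensor_info(input_tensor)},
        outputs={"output": tf.saved_model.utils.build_tensor_info(predictions_tf)},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={"style_image": signature})
    builder.save()

saved = os.path.exists(os.path.join(export_path, "saved_model.pb"))
```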
    

    [Comments]:
