【Question Title】: Getting error 'str' object has no attribute 'dtype' when exporting textsum model for TensorFlow Serving
【Posted】: 2017-10-22 13:54:14
【Question Description】:

I am currently trying to export the TF textsum model with a PREDICT signature. I have _Decode return the result for the test article string passed in, and then pass that result to buildTensorInfo. What is being returned here is actually a Python string.

Now, when I run the textsum_export.py logic to export the model, it gets to the point where the TensorInfo object is being built, but it fails with the traceback below. I know the PREDICT signature is usually used with images. Is that the problem? Since I am working with strings, can I not use it for the textsum model?

The error is:

Traceback (most recent call last):
  File "export_textsum.py", line 129, in Export
    tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res)
  File "/usr/local/lib/python2.7/site-packages/tensorflow/python/saved_model/utils_impl.py", line 37, in build_tensor_info
    dtype_enum = dtypes.as_dtype(tensor.dtype).as_datatype_enum
AttributeError: 'str' object has no attribute 'dtype'
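The traceback can be reproduced without any of the export machinery, since a plain Python str simply has no dtype attribute, which is exactly what build_tensor_info tries to read (the string below is just a stand-in for the decoder output):

```python
# Minimal reproduction of the error: a Python str has no .dtype attribute.
res = "some decoded summary text"  # stand-in for the _Decode(...) result
try:
    res.dtype
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'dtype'
```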

The TF session that exports the model is as follows:

with tf.Session(config = config) as sess:

                # Restore variables from training checkpoints.
                ckpt = tf.train.get_checkpoint_state(FLAGS.checkpoint_dir)
                if ckpt and ckpt.model_checkpoint_path:
                    saver.restore(sess, ckpt.model_checkpoint_path)
                    global_step = ckpt.model_checkpoint_path.split('/')[-1].split('-')[-1]
                    print('Successfully loaded model from %s at step=%s.' %
                        (ckpt.model_checkpoint_path, global_step))
                    res = decoder._Decode(saver, sess)

                    print("Decoder value {}".format(type(res)))
                else:
                    print('No checkpoint file found at %s' % FLAGS.checkpoint_dir)
                    return

                # Export model
                export_path = os.path.join(FLAGS.export_dir,str(FLAGS.export_version))
                print('Exporting trained model to %s' % export_path)


                #-------------------------------------------

                tensor_info_inputs = tf.saved_model.utils.build_tensor_info(serialized_tf_example)
                tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res)

                prediction_signature = (
                    tf.saved_model.signature_def_utils.build_signature_def(
                        inputs={ tf.saved_model.signature_constants.PREDICT_INPUTS: tensor_info_inputs},
                        outputs={tf.saved_model.signature_constants.PREDICT_OUTPUTS:tensor_info_outputs},
                        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
                        ))

                #----------------------------------

                legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')
                builder = saved_model_builder.SavedModelBuilder(export_path)

                builder.add_meta_graph_and_variables(
                    sess=sess, 
                    tags=[tf.saved_model.tag_constants.SERVING],
                    signature_def_map={
                        'predict':prediction_signature,
                    },
                    legacy_init_op=legacy_init_op)
                builder.save()

                print('Successfully exported model to %s' % export_path)

【Comments】:

  • The PREDICT signature works with tensors: res_tensor = tf.convert_to_tensor(res)
  • Gaurav, you are awesome! That worked perfectly. I can't seem to mark this comment as the answer, but you deserve the credit. If you post your comment above as an answer, I will accept it. Thanks again!

Tags: tensorflow tensorflow-serving


【Solution 1】:

The PREDICT signature works with tensors. If res is a Python variable of type 'str', then res_tensor will have dtype tf.string:

res_tensor = tf.convert_to_tensor(res) 
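A minimal sketch of the fix in isolation (the string literal is a stand-in for the decoder output): build_tensor_info reads tensor.dtype, so the str returned by _Decode must first be wrapped in a tensor.

```python
import tensorflow as tf

# A plain Python str has no .dtype; tf.convert_to_tensor wraps it in a
# tensor whose dtype (tf.string) build_tensor_info can then inspect.
res = "decoded summary text"            # stand-in for the _Decode(...) result
res_tensor = tf.convert_to_tensor(res)

print(res_tensor.dtype)  # <dtype: 'string'>
```

With res_tensor passed in place of res, the failing line in the question becomes tensor_info_outputs = tf.saved_model.utils.build_tensor_info(res_tensor), and the TensorInfo proto can be built as before.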

【Discussion】:
