【Question Title】: How to save a TensorRT graph generated from a frozen inference graph?
【Posted】: 2019-08-01 18:40:06
【Question Description】:

I use the following script to convert my frozen_inference_graph into a TensorRT-optimized one:

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

with tf.Session() as sess:
    # First deserialize your frozen graph:
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        frozen_graph = tf.GraphDef()
        frozen_graph.ParseFromString(f.read())
    # Now you can create a TensorRT inference graph from your
    # frozen graph:
    converter = trt.TrtGraphConverter(
        input_graph_def=frozen_graph,
        nodes_blacklist=['outputs/Softmax'])  # output nodes excluded from conversion
    trt_graph = converter.convert()
    # Import the TensorRT graph into a new graph and run:
    output_node = tf.import_graph_def(
        trt_graph,
        return_elements=['outputs/Softmax'])
    sess.run(output_node)

My question is: how do I save the optimized graph to disk so that I can use it to run inference?

【Question Discussion】:

    Tags: python tensorflow tensorrt


    【Solution 1】:

    Yes, you just need to add these two lines:

    saved_model_dir_trt = "./tensorrt_model.trt"
    converter.save(saved_model_dir_trt)
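
    Alternatively, since `converter.convert()` returns a plain `tf.GraphDef` protobuf, you can serialize it to a `.pb` file yourself and reload it later, mirroring how the question's script deserializes `frozen_inference_graph.pb`. This is a minimal sketch assuming the TF 1.x API from the question; the helper names and the `trt_graph.pb` filename are illustrative, not from the original post:

    ```python
    def save_trt_graph(trt_graph, output_path='trt_graph.pb'):
        """Write a converted tf.GraphDef to disk as a binary protobuf."""
        import tensorflow as tf  # TF 1.x, as used in the question
        # A GraphDef is a protobuf message; SerializeToString() yields its
        # binary wire format, which we write out with tf.gfile.GFile.
        with tf.gfile.GFile(output_path, 'wb') as f:
            f.write(trt_graph.SerializeToString())
        return output_path

    def load_trt_graph(path='trt_graph.pb'):
        """Read the saved GraphDef back, ready for tf.import_graph_def()."""
        import tensorflow as tf
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(path, 'rb') as f:
            graph_def.ParseFromString(f.read())
        return graph_def
    ```

    At inference time you would call `load_trt_graph()` and pass the result to `tf.import_graph_def(...)`, exactly as the question's script already does with the freshly converted graph.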

    【Discussion】:
