[Title]: TensorFlow Serving basic example
[Posted]: 2017-02-04 01:40:59
[Question]:

I am working through the TensorFlow Serving basic tutorial:

https://tensorflow.github.io/serving/serving_basic

Setup

Following: https://tensorflow.github.io/serving/setup#prerequisites

In a Docker container based on ubuntu:latest, I have installed:

Bazel:

echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install bazel
sudo apt-get upgrade bazel

grpcio:

pip install grpcio

All packages:

sudo apt-get update && sudo apt-get install -y \
        build-essential \
        curl \
        libcurl3-dev \
        git \
        libfreetype6-dev \
        libpng12-dev \
        libzmq3-dev \
        pkg-config \
        python-dev \
        python-numpy \
        python-pip \
        software-properties-common \
        swig \
        zip \
        zlib1g-dev

TensorFlow Serving:

git clone --recurse-submodules https://github.com/tensorflow/serving
cd serving
cd tensorflow
./configure
cd ..

I have built the source with Bazel, and all tests run successfully:

bazel build tensorflow_serving/...
bazel test tensorflow_serving/...

I can successfully export the mnist model with:

bazel-bin/tensorflow_serving/example/mnist_export /tmp/mnist_model

I can serve the exported model:

bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
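For context on what the client will send to this server: each MNIST test image is 28x28 grayscale, flattened to 784 floats and fed under the input name "images" (the same name the NOT_FOUND error below complains about). A stdlib-only illustrative sketch, not the actual mnist_client code:

```python
# Illustrative only: flatten a 28x28 MNIST image into the 784-float vector
# the example client sends as the tensor named "images".
IMAGE_SIZE = 28

def flatten_image(pixels):
    """pixels: 28x28 nested list of ints in [0, 255] -> flat list of 784 floats in [0, 1]."""
    assert len(pixels) == IMAGE_SIZE
    assert all(len(row) == IMAGE_SIZE for row in pixels)
    return [p / 255.0 for row in pixels for p in row]

blank = [[0] * IMAGE_SIZE for _ in range(IMAGE_SIZE)]
print(len(flatten_image(blank)))  # 784
```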

Problem

When I test the server and try to connect a client to the model server:

bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000

I see this output:

root@dc3ea7993fa9:~/serving# bazel-bin/tensorflow_serving/example/mnist_client --num_tests=2 --server=localhost:9000
Extracting /tmp/train-images-idx3-ubyte.gz
Extracting /tmp/train-labels-idx1-ubyte.gz
Extracting /tmp/t10k-images-idx3-ubyte.gz
Extracting /tmp/t10k-labels-idx1-ubyte.gz
AbortionError(code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output images")
AbortionError(code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output images")

Inference error rate is: 100.0%
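The 100.0% figure follows from how the client scores results: a request that fails (here, both RPCs abort with NOT_FOUND) counts as an error just like a wrong prediction, so the rate is errors / num_tests. A hedged stdlib-only sketch of that bookkeeping (illustrative; the real mnist_client issues its RPCs asynchronously):

```python
# Sketch of the error-rate bookkeeping: a failed RPC (None) counts the same
# as a wrong predicted label.
def error_rate(predictions, labels):
    """predictions: predicted label per test image, or None for a failed RPC."""
    errors = sum(1 for pred, label in zip(predictions, labels)
                 if pred is None or pred != label)
    return 100.0 * errors / len(labels)

# Both RPCs aborted, so no predictions came back:
print(error_rate([None, None], [7, 2]))  # 100.0
```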

[Discussion]:

    Tags: tensorflow


    [Solution 1]:

    The "--use_saved_model" flag defaults to "true"; pass --use_saved_model=false when starting the server. This should work:

    bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --use_saved_model=false --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
    

    [Discussion]:

      [Solution 2]:

      I raised this on the tensorflow GitHub; the fix is to delete the originally exported model. If you run into this, run rm -rf /tmp/mnist_model and rebuild it.

      [Discussion]:
