+---
+---
3+
# Serving a TensorFlow Model

This tutorial shows you how to use TensorFlow Serving components to export a
trained TensorFlow model and use the standard tensorflow_model_server to serve
it. If you are already familiar with TensorFlow Serving, and you want to know
more about how the server internals work, see the
-[TensorFlow Serving advanced tutorial](serving_advanced.md).
+[TensorFlow Serving advanced tutorial](serving_advanced).

This tutorial uses the simple Softmax Regression model introduced in the
TensorFlow tutorial for handwritten image (MNIST data) classification. If you
@@ -25,11 +28,11 @@ The code for this tutorial consists of two parts:
    [gRPC](http://www.grpc.io) service for serving them.

Before getting started, please complete the
-[prerequisites](setup.md#prerequisites).
+[prerequisites](setup#prerequisites).

Note: All `bazel build` commands below use the standard `-c opt` flag. To
further optimize the build, refer to the
-[instructions here](setup.md#optimized-build).
+[instructions here](setup#optimized-build).
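For example, an optimized build of the model server might look like the
following sketch (the extra `--copt` flags are only illustrative; pick ones
that match your CPU):

```shell
# Default optimized build used throughout this tutorial.
bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server

# Illustrative CPU-specific optimization flags for a further speedup.
bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx \
  //tensorflow_serving/model_servers:tensorflow_model_server
```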

## Train And Export TensorFlow Model

@@ -141,7 +144,7 @@ allows the user to refer to these tensors with their logical names when
running inference.

Note: In addition to the description above, documentation related to signature
-def structure and how to set them up can be found [here](signature_defs.md).
+def structure and how to set them up can be found [here](signature_defs).
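
As a rough sketch of what this looks like in code (this is not the tutorial's
actual exporter; the tensors below are toy stand-ins), the TF 1.x
`tf.saved_model` APIs bind logical names such as `images` and `scores` to graph
tensors and export them with `SavedModelBuilder`:

```python
import tensorflow as tf

# Toy stand-ins for the trained MNIST tensors (illustrative only).
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, w) + b, name='y')

# Wrap the tensors in TensorInfo protos and bind them to the logical names
# ('images', 'scores') that clients will use at inference time.
tensor_info_x = tf.saved_model.utils.build_tensor_info(x)
tensor_info_y = tf.saved_model.utils.build_tensor_info(y)

prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={'images': tensor_info_x},
    outputs={'scores': tensor_info_y},
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

# Attach the signature to a SavedModel and write version 1 to disk.
builder = tf.saved_model.builder.SavedModelBuilder('/tmp/mnist_model/1')
with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  builder.add_meta_graph_and_variables(
      sess, [tf.saved_model.tag_constants.SERVING],
      signature_def_map={'predict_images': prediction_signature})
builder.save()
```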

Let's run it!

@@ -154,7 +157,7 @@ $>rm -rf /tmp/mnist_model
If you would like to install the `tensorflow` and `tensorflow-serving-api` PIP
packages, you can run all Python code (export and client) using a simple
`python` command. To install the PIP package, follow the
-[instructions here](setup.md#tensorflow-serving-python-api-pip-package).
+[instructions here](setup#tensorflow-serving-python-api-pip-package).
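
For instance, a pip-based workflow might look like the following (this assumes
the tutorial's exporter script is `tensorflow_serving/example/mnist_saved_model.py`
and that it writes to `/tmp/mnist_model`, as in the rest of this tutorial):

```shell
pip install tensorflow tensorflow-serving-api

# Run the exporter directly with Python instead of through Bazel.
python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model
```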
It's also possible to
use Bazel to build the necessary dependencies and run all code without
installing those packages. The rest of the codelab will have instructions for
@@ -216,7 +219,7 @@ $>bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000
```

If you'd prefer to skip compilation and install using apt-get, follow the
-[instructions here](setup.md#installing-using-apt-get). Then run the server with
+[instructions here](setup#installing-using-apt-get). Then run the server with
the following command:

```shell