Commit 2228da2

Author: Chris Olston

Push documentation changes to gh-pages.

1 parent 60a7892 · commit 2228da2

File tree

10 files changed: 57 additions, 27 deletions

architecture_overview.md

Lines changed: 7 additions & 4 deletions

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Architecture Overview
 
 TensorFlow Serving is a flexible, high-performance serving system for machine
@@ -98,7 +101,7 @@ versions to the Manager, it supercedes the previous list for that servable
 stream. The Manager unloads any previously loaded versions that no longer
 appear in the list.
 
-See the [advanced tutorial](serving_advanced.md) to see how version loading
+See the [advanced tutorial](serving_advanced) to see how version loading
 works in practice.
 
 ### Managers
@@ -198,7 +201,7 @@ easy & fast to create new sources. For example, TensorFlow Serving includes a
 utility to wrap polling behavior around a simple source. Sources are closely
 related to Loaders for specific algorithms and data hosting servables.
 
-See the [Custom Source](custom_source.md) document for more about how to create
+See the [Custom Source](custom_source) document for more about how to create
 a custom Source.
 
 ### Loaders
@@ -209,7 +212,7 @@ new Loader in order to load, provide access to, and unload an instance of a
 new type of servable machine learning model. We anticipate creating Loaders
 for lookup tables and additional algorithms.
 
-See the [Custom Servable](custom_servable.md) document to learn how to create a
+See the [Custom Servable](custom_servable) document to learn how to create a
 custom servable.
 
 ### Batcher
@@ -225,4 +228,4 @@ for more information.
 ## Next Steps
 
 To get started with TensorFlow Serving, try the
-[Basic Tutorial](serving_basic.md).
+[Basic Tutorial](serving_basic).
````

custom_servable.md

Lines changed: 4 additions & 1 deletion

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Creating a new kind of servable
 
 This document explains how to extend TensorFlow Serving with a new kind of
@@ -25,7 +28,7 @@ define methods for loading, accessing and unloading your type of servable. The
 data from which the servable is loaded can come from anywhere, but it is common
 for it to come from a storage-system path. Let us assume that is the case for
 `YourServable`. Let us further assume you already have a `Source<StoragePath>`
-that you are happy with (if not, see the [Custom Source](custom_source.md)
+that you are happy with (if not, see the [Custom Source](custom_source)
 document).
 
 In addition to your `Loader`, you will need to define a `SourceAdapter` that
````

custom_source.md

Lines changed: 4 additions & 1 deletion

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Creating a module that discovers new servable paths
 
 This document explains how to extend TensorFlow Serving to monitor different
@@ -24,7 +27,7 @@ Of course, whatever kind of data your source emits (whether it is POSIX paths,
 Google Cloud Storage paths, or RPC handles), there needs to be accompanying
 module(s) that are able to load servables based on that. Such modules are called
 `SourceAdapters`. Creating a custom one is described in the
-[Custom Servable](custom_servable.md) document. TensorFlow Serving
+[Custom Servable](custom_servable) document. TensorFlow Serving
 comes with one for instantiating TensorFlow sessions based on paths
 in file systems that TensorFlow supports. One can add support for
 additional file systems to TensorFlow by extending the `RandomAccessFile`
````

docker.md

Lines changed: 3 additions & 0 deletions

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Using TensorFlow Serving via Docker
 
 This directory contains Dockerfiles to make it easy to get up and running with
````

index.md

Lines changed: 6 additions & 3 deletions

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Introduction
 
 TensorFlow Serving is a flexible, high-performance serving system for machine
@@ -9,9 +12,9 @@ types of models and data.
 
 To get started with TensorFlow Serving:
 
-* Read the [overview](architecture_overview.md)
-* [Set up](setup.md) your environment
-* Do the [basic tutorial](serving_basic.md)
+* Read the [overview](architecture_overview)
+* [Set up](setup) your environment
+* Do the [basic tutorial](serving_basic)
 
 
 
````

serving_advanced.md

Lines changed: 9 additions & 6 deletions

````diff
@@ -1,10 +1,13 @@
+---
+---
+
 # Building Standard TensorFlow ModelServer
 
 This tutorial shows you how to use TensorFlow Serving components to build the
 standard TensorFlow ModelServer that dynamically discovers and serves new
 versions of a trained TensorFlow model. If you just want to use the standard
 server to serve your models, see
-[TensorFlow Serving basic tutorial](serving_basic.md).
+[TensorFlow Serving basic tutorial](serving_basic).
 
 This tutorial uses the simple Softmax Regression model introduced in the
 TensorFlow tutorial for handwritten image (MNIST data) classification. If you
@@ -32,11 +35,11 @@ This tutorial steps through the following tasks:
 5. Run and test the service.
 
 Before getting started, please complete the
-[prerequisites](setup.md#prerequisites).
+[prerequisites](setup#prerequisites).
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
 further optimize the build, refer to the
-[instructions here](setup.md#optimized-build).
+[instructions here](setup#optimized-build).
 
 ## Train And Export TensorFlow Model
 
@@ -60,7 +63,7 @@ $>bazel-bin/tensorflow_serving/example/mnist_saved_model --training_iteration=20
 ```
 
 As you can see in `mnist_saved_model.py`, the training and exporting is done the
-same way it is in the [TensorFlow Serving basic tutorial](serving_basic.md). For
+same way it is in the [TensorFlow Serving basic tutorial](serving_basic). For
 demonstration purposes, you're intentionally dialing down the training
 iterations for the first run and exporting it as v1, while training it normally
 for the second run and exporting it as v2 to the same parent directory -- as we
@@ -165,8 +168,8 @@ that monitors cloud storage instead of local storage, or you could build a
 version policy plugin that does version transition in a different way -- in
 fact, you could even build a custom model plugin that serves non-TensorFlow
 models. These topics are out of scope for this tutorial. However, you can refer
-to the [custom source](custom_source.md) and [custom servable]
-(custom_servable.md) tutorials for more information.
+to the [custom source](custom_source) and [custom servable]
+(custom_servable) tutorials for more information.
 
 ## Batching
 
````

serving_basic.md

Lines changed: 9 additions & 6 deletions

````diff
@@ -1,10 +1,13 @@
+---
+---
+
 # Serving a TensorFlow Model
 
 This tutorial shows you how to use TensorFlow Serving components to export a
 trained TensorFlow model and use the standard tensorflow_model_server to serve
 it. If you are already familiar with TensorFlow Serving, and you want to know
 more about how the server internals work, see the
-[TensorFlow Serving advanced tutorial](serving_advanced.md).
+[TensorFlow Serving advanced tutorial](serving_advanced).
 
 This tutorial uses the simple Softmax Regression model introduced in the
 TensorFlow tutorial for handwritten image (MNIST data) classification. If you
@@ -25,11 +28,11 @@ The code for this tutorial consists of two parts:
 [gRPC](http://www.grpc.io) service for serving them.
 
 Before getting started, please complete the
-[prerequisites](setup.md#prerequisites).
+[prerequisites](setup#prerequisites).
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
 further optimize the build, refer to the
-[instructions here](setup.md#optimized-build).
+[instructions here](setup#optimized-build).
 
 ## Train And Export TensorFlow Model
 
@@ -141,7 +144,7 @@ allows the user to refer to these tensors with their logical names when
 running inference.
 
 Note: In addition to the description above, documentation related to signature
-def structure and how to set up them up can be found [here](signature_defs.md).
+def structure and how to set up them up can be found [here](signature_defs).
 
 Let's run it!
 
@@ -154,7 +157,7 @@ $>rm -rf /tmp/mnist_model
 If you would like to install the `tensorflow` and `tensorflow-serving-api` PIP
 packages, you can run all Python code (export and client) using a simple
 `python` command. To install the PIP package, follow the
-[instructions here](setup.md#tensorflow-serving-python-api-pip-package).
+[instructions here](setup#tensorflow-serving-python-api-pip-package).
 It's also possible to
 use Bazel to build the necessary dependencies and run all code without
 installing those packages. The rest of the codelab will have instructions for
@@ -216,7 +219,7 @@ $>bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000
 ```
 
 If you'd prefer to skip compilation and install using apt-get, follow the
-[instructions here](setup.md#installing-using-apt-get). Then run the server with
+[instructions here](setup#installing-using-apt-get). Then run the server with
 the following command:
 
 ```shell
````

serving_inception.md

Lines changed: 8 additions & 5 deletions

````diff
@@ -1,12 +1,15 @@
+---
+---
+
 # Serving Inception Model with TensorFlow Serving and Kubernetes
 
 This tutorial shows how to use TensorFlow Serving components running in Docker
 containers to serve the TensorFlow Inception model and how to deploy the
 serving cluster with Kubernetes.
 
 To learn more about TensorFlow Serving, we recommend
-[TensorFlow Serving basic tutorial](serving_basic.md) and
-[TensorFlow Serving advanced tutorial](serving_advanced.md).
+[TensorFlow Serving basic tutorial](serving_basic) and
+[TensorFlow Serving advanced tutorial](serving_advanced).
 
 To learn more about TensorFlow Inception model, we recommend
 [Inception in TensorFlow](https://github.com/tensorflow/models/tree/master/inception).
@@ -19,7 +22,7 @@ To learn more about TensorFlow Inception model, we recommend
 
 ## Part 0: Create a Docker image
 
-Please refer to [Using TensorFlow Serving via Docker](docker.md) for details
+Please refer to [Using TensorFlow Serving via Docker](docker) for details
 about building a TensorFlow Serving Docker image.
 
 ### Run container
@@ -37,7 +40,7 @@ $ docker run --name=inception_container -it $USER/tensorflow-serving-devel
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
 further optimize the build, refer to the
-[instructions here](setup.md#optimized-build).
+[instructions here](setup#optimized-build).
 
 In the running container, we clone, configure and build TensorFlow Serving
 example code.
@@ -51,7 +54,7 @@ root@c97d8e820ced:/serving# bazel build -c opt tensorflow_serving/example/...
 ```
 
 Next we can either install a TensorFlow ModelServer with apt-get using the
-[instructions here](setup.md#installing-using-apt-get), or build a ModelServer
+[instructions here](setup#installing-using-apt-get), or build a ModelServer
 binary using:
 
 ```shell
````

setup.md

Lines changed: 4 additions & 1 deletion

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # Installation
 
 ## Prerequisites
@@ -170,7 +173,7 @@ To test your installation, execute:
 bazel test -c opt tensorflow_serving/...
 ```
 
-See the [basic tutorial](serving_basic.md) and [advanced tutorial](serving_advanced.md)
+See the [basic tutorial](serving_basic) and [advanced tutorial](serving_advanced)
 for more in-depth examples of running TensorFlow Serving.
 
 ### Optimized build
````

signature_defs.md

Lines changed: 3 additions & 0 deletions

````diff
@@ -1,3 +1,6 @@
+---
+---
+
 # SignatureDefs in SavedModel for TensorFlow Serving
 
 ## Objective
````
