Commit ff796b1

lamberta authored and tensorflower-gardener committed

Fix book.yaml

Remove 'via'

PiperOrigin-RevId: 220486241

1 parent 1e417fb · commit ff796b1

File tree: 5 files changed (+8 −12 lines)


tensorflow_serving/g3doc/_book.yaml (+0 −4)

```diff
@@ -25,12 +25,8 @@ upper_tabs:
     path: /serving/setup
   - title: Serve a TensorFlow model
     path: /serving/serving_basic
-  - title: REST API
-    path: /serving/api_rest
   - title: Build a TensorFlow ModelServer
     path: /serving/serving_advanced
-  - title: Use TensorFlow Serving with Docker
-    path: /serving/docker
   - title: Use TensorFlow Serving with Kubernetes
     path: /serving/serving_kubernetes
   - title: Create a new kind of servable
```

tensorflow_serving/g3doc/custom_servable.md (+1 −1)

```diff
@@ -30,7 +30,7 @@ document).
 
 In addition to your `Loader`, you will need to define a `SourceAdapter` that
 instantiates a `Loader` from a given storage path. Most simple use-cases can
-specify the two objects concisely via the `SimpleLoaderSourceAdapter` class
+specify the two objects concisely with the `SimpleLoaderSourceAdapter` class
 (in `core/simple_loader.h`). Advanced use-cases may opt to specify `Loader` and
 `SourceAdapter` classes separately using the lower-level APIs, e.g. if the
 `SourceAdapter` needs to retain some state, and/or if state needs to be shared
```

tensorflow_serving/g3doc/docker.md (+5 −5)

```diff
@@ -1,6 +1,6 @@
-# Using TensorFlow Serving via Docker
+# Using TensorFlow Serving with Docker
 
-One of the easiest ways to get started using TensorFlow Serving is via
+One of the easiest ways to get started using TensorFlow Serving is with
 [Docker](http://www.docker.com/).
 
 ## Installing Docker
@@ -136,8 +136,8 @@ deploy and will load your model for serving on startup.
 
 ### Serving example
 
-Let's run through a full example where we load a SavedModel and call it via the
-REST API. First pull the serving image:
+Let's run through a full example where we load a SavedModel and call it using
+the REST API. First pull the serving image:
 
 ```shell
 docker pull tensorflow/serving
@@ -209,7 +209,7 @@ details, see [running a serving image](#running-a-serving-image).
 ### GPU Serving example
 
 Let's run through a full example where we load a model with GPU-bound ops and
-call it via the REST API.
+call it using the REST API.
 
 First install [`nvidia-docker`](#install-nvidia-docker). Next you can pull the
 latest TensorFlow Serving GPU docker image by running:
```
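For context on the wording these hunks touch: calling a served model "using the REST API" means sending a JSON POST to TensorFlow Serving's predict endpoint. A minimal sketch of building such a request, assuming the REST port `8501` and the `half_plus_two` demo model from the TensorFlow Serving documentation (neither appears in this diff):

```python
import json

def predict_url(host, port, model_name):
    # TensorFlow Serving exposes its REST predict endpoint under
    # /v1/models/<model_name>:predict on the REST API port.
    return "http://{}:{}/v1/models/{}:predict".format(host, port, model_name)

# The REST API accepts a JSON body with an "instances" list (row format).
payload = json.dumps({"instances": [1.0, 2.0, 5.0]})

url = predict_url("localhost", 8501, "half_plus_two")
print(url)      # http://localhost:8501/v1/models/half_plus_two:predict
print(payload)  # {"instances": [1.0, 2.0, 5.0]}
```

In practice this request would be sent with `curl -d "$payload" -X POST "$url"` (or any HTTP client) against a running serving container.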

tensorflow_serving/g3doc/overview.md (+1 −1)

```diff
@@ -121,7 +121,7 @@ TensorFlow Serving Managers provide a simple, narrow interface --
 
 ### Core
 
-**TensorFlow Serving Core** manages (via standard TensorFlow Serving APIs) the
+Using the standard TensorFlow Serving APis, *TensorFlow Serving Core* manages the
 following aspects of servables:
 
 * lifecycle
```

tensorflow_serving/g3doc/setup.md (+1 −1)

```diff
@@ -4,7 +4,7 @@
 
 ### Installing using Docker
 
-The easiest and most straight-forward way of using TensorFlow Serving is via
+The easiest and most straight-forward way of using TensorFlow Serving is with
 [Docker images](docker.md). We highly recommend this route unless you have
 specific needs that are not addressed by running in a container.
 
```
