Improve docker image UX and fix MNIST serving example #3382

Open · wants to merge 4 commits into base: master
1 change: 1 addition & 0 deletions docker/config.properties
@@ -3,6 +3,7 @@ management_address=http://0.0.0.0:8081
 metrics_address=http://0.0.0.0:8082
 grpc_inference_address=0.0.0.0
 grpc_management_address=0.0.0.0
+enable_envvars_config=true
 number_of_netty_threads=32
 job_queue_size=1000
 model_store=/home/model-server/model-store
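The added `enable_envvars_config=true` line is what lets the Docker.md examples below configure TorchServe through `TS_*` environment variables: with it enabled, a config property can be overridden by an environment variable named `TS_` plus the upper-cased property name. A minimal sketch of that assumed mapping (the `prop_to_env` helper is hypothetical, for illustration only):

```shell
# Hypothetical helper illustrating the assumed mapping: with
# enable_envvars_config=true, a property such as job_queue_size can be
# overridden at container start via the TS_JOB_QUEUE_SIZE variable.
prop_to_env() {
  printf 'TS_%s\n' "$(printf '%s' "$1" | tr '[:lower:]' '[:upper:]')"
}

prop_to_env load_models      # TS_LOAD_MODELS
prop_to_env job_queue_size   # TS_JOB_QUEUE_SIZE
```

For example, `docker run -e TS_JOB_QUEUE_SIZE=500 ...` would then override the `job_queue_size=1000` line shipped in the image's config.properties.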
8 changes: 4 additions & 4 deletions docker/dockerd-entrypoint.sh
@@ -4,10 +4,10 @@ set -e

 if [[ "$1" = "serve" ]]; then
     shift 1
-    torchserve --start --ts-config /home/model-server/config.properties --disable-token-auth
+    torchserve --foreground --ts-config /home/model-server/config.properties --disable-token-auth "$@"
 else
     eval "$@"
-fi

-# prevent docker exit
-tail -f /dev/null
+    # prevent docker exit
+    tail -f /dev/null
+fi
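The switch from `--start` (which daemonizes, so the container previously had to be kept alive with `tail -f /dev/null`) to `--foreground` makes torchserve the blocking container process, and the appended `"$@"` forwards any extra flags given after `serve`. A rough sketch of the entrypoint's dispatch flow, with `echo` standing in for the real torchserve invocation (the `dispatch` helper is illustrative only):

```shell
# Illustrative sketch of the entrypoint dispatch; echo stands in for
# the real torchserve call so the flow can be exercised in isolation.
dispatch() {
  if [ "$1" = "serve" ]; then
    shift 1
    # --foreground keeps the server as the blocking container process;
    # "$@" forwards any extra flags appended to the docker command.
    echo "torchserve --foreground $*"
  else
    # Any other command line is executed verbatim.
    eval "$@"
  fi
}

dispatch serve --models mnist=mnist.mar
```

With the pass-through in place, `docker run ... pytorch/torchserve serve --models mnist=mnist.mar` would hand the extra flags straight to torchserve.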
10 changes: 9 additions & 1 deletion examples/image_classifier/mnist/Docker.md
@@ -28,7 +28,7 @@ Run the commands given in following steps from the parent directory of the root
### Start a docker container with torchserve

 ```bash
-docker run --rm -it -p 127.0.0.1:8080:8080 -p 127.0.0.1:8081:8081 -p 127.0.0.1:8082:8082 -v $(pwd)/model_store:/home/model-server/model-store pytorch/torchserve:latest-cpu
+docker run --rm -it -p 127.0.0.1:8080:8080 -p 127.0.0.1:8081:8081 -p 127.0.0.1:8082:8082 -e TS_ENABLE_MODEL_API=true -v $(pwd)/model_store:/home/model-server/model-store pytorch/torchserve:latest-cpu
 ```

### Register the model on TorchServe using the above model archive file
@@ -45,6 +45,14 @@ Run the commands given in following steps from the parent directory of the root
 }
 ```

+An alternative to registering models manually is to specify the models TorchServe should load at startup via the [`load_models`](https://pytorch.org/serve/configuration.html#load-models-at-startup) property. It can be set with the `TS_LOAD_MODELS=mnist.mar` environment variable, which removes the need for both the `TS_ENABLE_MODEL_API` environment variable and the `curl` call above:
+
+```bash
+docker run --rm -it -p 127.0.0.1:8080:8080 -p 127.0.0.1:8081:8081 -p 127.0.0.1:8082:8082 -e TS_LOAD_MODELS=mnist.mar -v $(pwd)/model_store:/home/model-server/model-store pytorch/torchserve:latest-cpu
+```
+
+Note that this approach does not allow specifying the initial number of workers.
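Since `load_models` cannot set worker counts, an assumed follow-up (not part of this PR) is to scale workers after startup through the management API on port 8081, e.g. `PUT /models/mnist?min_worker=2`. A small sketch that builds such a request URL (`scale_url` is a hypothetical helper):

```shell
# Hypothetical helper building a management-API scaling URL; TorchServe's
# management API accepts PUT /models/{name}?min_worker=N to change the
# worker count of a registered model after startup.
scale_url() {
  printf 'http://127.0.0.1:8081/models/%s?min_worker=%s\n' "$1" "$2"
}

scale_url mnist 2   # http://127.0.0.1:8081/models/mnist?min_worker=2
```

The resulting URL would be passed to `curl -X PUT` once the container is up and the model is loaded.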

### Run digit recognition inference outside the container

```bash
...
```