diff --git a/docs/my-website/docs/proxy/quick_start.md b/docs/my-website/docs/proxy/quick_start.md
index f6513e2b8..4f508ee59 100644
--- a/docs/my-website/docs/proxy/quick_start.md
+++ b/docs/my-website/docs/proxy/quick_start.md
@@ -370,12 +370,12 @@ See the latest available ghcr docker image here:
 https://github.com/berriai/litellm/pkgs/container/litellm
 
 ```shell
-docker pull ghcr.io/berriai/litellm:main-v1.16.13
+docker pull ghcr.io/berriai/litellm:main-latest
 ```
 
 ### Run the Docker Image
 ```shell
-docker run ghcr.io/berriai/litellm:main-v1.16.13
+docker run ghcr.io/berriai/litellm:main-latest
 ```
 
 #### Run the Docker Image with LiteLLM CLI args
@@ -384,12 +384,12 @@ See all supported CLI args [here](https://docs.litellm.ai/docs/proxy/cli):
 
 Here's how you can run the docker image and pass your config to `litellm`
 ```shell
-docker run ghcr.io/berriai/litellm:main-v1.16.13 --config your_config.yaml
+docker run ghcr.io/berriai/litellm:main-latest --config your_config.yaml
 ```
 
 Here's how you can run the docker image and start litellm on port 8002 with `num_workers=8`
 ```shell
-docker run ghcr.io/berriai/litellm:main-v1.16.13 --port 8002 --num_workers 8
+docker run ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
 ```
 
 #### Run the Docker Image using docker compose