diff --git a/docs/my-website/docs/proxy/deploy.md b/docs/my-website/docs/proxy/deploy.md
index 8767417f5..47d089ab4 100644
--- a/docs/my-website/docs/proxy/deploy.md
+++ b/docs/my-website/docs/proxy/deploy.md
@@ -27,7 +27,7 @@ docker-compose up
-
+
 ### Step 1. CREATE config.yaml
@@ -98,7 +98,13 @@ docker run ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
 ```
+
+s/o [Nicholas Cecere](https://www.linkedin.com/in/nicholas-cecere-24243549/) for his LiteLLM User Management Terraform
+
+👉 [Go here for Terraform](https://github.com/ncecere/terraform-litellm-user-mgmt)
+
+
 ```shell
@@ -380,6 +386,7 @@ kubectl port-forward service/litellm-service 4000:4000
 Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
+
@@ -425,7 +432,6 @@ If you need to set your litellm proxy config.yaml, you can find this in [values.
-
 :::info
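For context on the patched page: the "Step 1. CREATE config.yaml" heading it touches refers to the LiteLLM proxy's model config. A minimal sketch of such a file might look like the fragment below (the model name and env-var reference are illustrative placeholders, not part of this patch):

```yaml
# config.yaml — minimal LiteLLM proxy model list (illustrative sketch)
model_list:
  - model_name: gpt-4o            # alias clients will request
    litellm_params:
      model: openai/gpt-4o        # provider/model routed to
      api_key: os.environ/OPENAI_API_KEY
```

The `docker run` invocation shown in the diff (`--port 8002 --num_workers 8`) would then be pointed at a file like this via the container's config mount.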