diff --git a/deploy/charts/litellm/README.md b/deploy/charts/litellm/README.md
index 817781ed0..e7dbe5f82 100644
--- a/deploy/charts/litellm/README.md
+++ b/deploy/charts/litellm/README.md
@@ -2,7 +2,7 @@
 
 ## Prerequisites
 
-- Kubernetes 1.23+
+- Kubernetes 1.21+
 - Helm 3.8.0+
 
 If `db.deployStandalone` is used:
diff --git a/docs/my-website/docs/proxy/deploy.md b/docs/my-website/docs/proxy/deploy.md
index 175806d27..d8bbe43ae 100644
--- a/docs/my-website/docs/proxy/deploy.md
+++ b/docs/my-website/docs/proxy/deploy.md
@@ -150,17 +150,20 @@ To avoid issues with predictability, difficulties in rollback, and inconsistent
 
 ## Deploy with Database
 
+### Docker, Kubernetes, Helm Chart
+
+
+
+
+
 We maintain a [seperate Dockerfile](https://github.com/BerriAI/litellm/pkgs/container/litellm-database) for reducing build time when running LiteLLM proxy with a connected Postgres Database
 
 
-
-
-
-```
+```shell
 docker pull docker pull ghcr.io/berriai/litellm-database:main-latest
 ```
 
-```
+```shell
 docker run --name litellm-proxy \
 -e DATABASE_URL=postgresql://<user>:<password>@<host>:<port>/<dbname> \
 -p 4000:4000 \
@@ -233,6 +236,8 @@ Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
 
 
 
+Use this to deploy LiteLLM with a Helm chart. See [the LiteLLM Helm Chart](https://github.com/BerriAI/litellm/tree/main/deploy/charts/litellm)
+
 #### Step 1. Clone the repository
 
 ```bash
@@ -241,6 +246,8 @@ git clone https://github.com/BerriAI/litellm.git
 
 #### Step 2. Deploy with Helm
 
+Run the following command from the root of your `litellm` repo:
+
 ```bash
 helm install \
 --set masterkey=SuPeRsEcReT \