forked from phoenix/litellm-mirror

docs(deploy.md): add community terraform module

commit c66c06d3d1
parent 285f35cf49

1 changed file with 8 additions and 2 deletions
@@ -27,7 +27,7 @@ docker-compose up

<Tabs>

-<TabItem value="basic" label="Basic">
+<TabItem value="basic" label="Basic (No DB)">

### Step 1. CREATE config.yaml

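The step above only names the config file; as context (not part of this commit), a minimal sketch of what that config.yaml typically holds for the proxy might look like the following, where the model entry and the `os.environ/OPENAI_API_KEY` reference are illustrative:

```shell
# Sketch only, not part of this commit: write a minimal proxy config.yaml.
# The model entry and env-var reference are illustrative placeholders.
cat > config.yaml <<'EOF'
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY
EOF
```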
@@ -98,7 +98,13 @@ docker run ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
```

</TabItem>

+<TabItem value="terraform" label="Terraform">
+
+s/o [Nicholas Cecere](https://www.linkedin.com/in/nicholas-cecere-24243549/) for his LiteLLM User Management Terraform
+
+👉 [Go here for Terraform](https://github.com/ncecere/terraform-litellm-user-mgmt)
+
+</TabItem>

<TabItem value="base-image" label="use litellm as a base image">

```shell
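The hunk header above quotes the section's `docker run` invocation. As context only (not part of the diff), a sketch of running that image against the config.yaml from Step 1 might look like this; the volume mount, port mapping, and `--config` flag are assumptions here, while `--num_workers` comes from the header above:

```shell
# Sketch only, not part of this commit: run the image named in the hunk header.
# The volume mount, port mapping, and --config flag are assumptions.
docker run \
  -v $(pwd)/config.yaml:/app/config.yaml \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml --port 4000 --num_workers 8
```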
@@ -380,6 +386,7 @@ kubectl port-forward service/litellm-service 4000:4000
Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.

</TabItem>
+
<TabItem value="helm-deploy" label="Helm">

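Since this hunk ends with the proxy reachable on `http://0.0.0.0:4000` after the `kubectl port-forward`, a quick smoke test against the OpenAI-compatible endpoint could look like the sketch below (not part of this commit; the model name and bearer token are placeholders):

```shell
# Sketch only, not part of this commit: smoke-test the forwarded proxy.
# The model name and Authorization token are placeholders.
curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hello"}]}'
```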
@@ -425,7 +432,6 @@ If you need to set your litellm proxy config.yaml, you can find this in [values.

</TabItem>
-
<TabItem value="helm-oci" label="Helm OCI Registry (GHCR)">

:::info
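For the "Helm OCI Registry (GHCR)" tab this hunk touches, installing a chart directly from an OCI registry follows Helm's `oci://` syntax; the chart URL and release name below are assumptions for illustration, not values taken from this commit:

```shell
# Sketch only, not part of this commit: install a chart from an OCI registry.
# The chart URL and release name are assumptions.
helm install litellm oci://ghcr.io/berriai/litellm-helm
```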