forked from phoenix/litellm-mirror

Merge pull request #2019 from BerriAI/litellm_use_litellm_ghcr_as_base_image

[Docs] Show how to use litellm GHCR as a base image

Commit b91f1786c0

3 changed files with 55 additions and 2 deletions
deploy/Dockerfile.ghcr_base (new file, 17 additions)

```diff
@@ -0,0 +1,17 @@
+# Use the provided base image
+FROM ghcr.io/berriai/litellm:main-latest
+
+# Set the working directory to /app
+WORKDIR /app
+
+# Copy the configuration file into the container at /app
+COPY config.yaml .
+
+# Make sure your entrypoint.sh is executable
+RUN chmod +x entrypoint.sh
+
+# Expose the necessary port
+EXPOSE 4000/tcp
+
+# Override the CMD instruction with your desired command and arguments
+CMD ["--port", "4000", "--config", "config.yaml", "--detailed_debug", "--run_gunicorn"]
```
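To consume this Dockerfile, a build-and-run sketch might look like the following (the image tag `litellm-custom` is illustrative, and `config.yaml` is assumed to exist in the build context; neither is part of the commit):

```shell
# Build a custom image on top of the GHCR base (tag name is an assumption)
docker build -f deploy/Dockerfile.ghcr_base -t litellm-custom .

# Run it, publishing the exposed port 4000 on the host
docker run -p 4000:4000 litellm-custom
```

Note that the `CMD` in the Dockerfile supplies only flags, which suggests the base image's `ENTRYPOINT` invokes litellm itself; the flags are therefore appended to that entrypoint at run time.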
|
Compose file hunk (the commit's 2 deletions, dropping the `litellm-ui` service):

```diff
@@ -8,8 +8,6 @@ services:
       - "4000:4000"
     environment:
       - AZURE_API_KEY=sk-123
-  litellm-ui:
-    image: ghcr.io/berriai/litellm-ui:main-latest
 
 
 
```
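For context, a minimal compose service keeping the keys that survive this hunk might look like the following sketch (the service name and image reference are illustrative; the diff does not show them):

```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest  # assumed image; not named in the hunk
    ports:
      - "4000:4000"
    environment:
      - AZURE_API_KEY=sk-123
```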
|
Docs hunks (adding the base-image tab to the Quick Start):

````diff
@@ -7,6 +7,10 @@ You can find the Dockerfile to build litellm proxy [here](https://github.com/Ber
 
 ## Quick Start
 
+<Tabs>
+
+<TabItem value="basic" label="Basic">
+
 See the latest available ghcr docker image here:
 https://github.com/berriai/litellm/pkgs/container/litellm
@@ -18,6 +22,12 @@ docker pull ghcr.io/berriai/litellm:main-latest
 docker run ghcr.io/berriai/litellm:main-latest
 ```
 
+</TabItem>
+
+
+
+<TabItem value="cli" label="With CLI Args">
+
 ### Run with LiteLLM CLI args
 
 See all supported CLI args [here](https://docs.litellm.ai/docs/proxy/cli):
@@ -32,6 +42,34 @@ Here's how you can run the docker image and start litellm on port 8002 with `num
 docker run ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
 ```
 
+</TabItem>
+
+<TabItem value="base-image" label="use litellm as a base image">
+
+```shell
+# Use the provided base image
+FROM ghcr.io/berriai/litellm:main-latest
+
+# Set the working directory to /app
+WORKDIR /app
+
+# Copy the configuration file into the container at /app
+COPY config.yaml .
+
+# Make sure your entrypoint.sh is executable
+RUN chmod +x entrypoint.sh
+
+# Expose the necessary port
+EXPOSE 4000/tcp
+
+# Override the CMD instruction with your desired command and arguments
+CMD ["--port", "4000", "--config", "config.yaml", "--detailed_debug", "--run_gunicorn"]
+```
+
+</TabItem>
+
+</Tabs>
+
 ## Deploy with Database
 
 We maintain a [separate Dockerfile](https://github.com/BerriAI/litellm/pkgs/container/litellm-database) for reducing build time when running LiteLLM proxy with a connected Postgres database.
````
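If the database-enabled image follows the same tagging scheme as the main proxy image, pulling it might look like this (the exact tag is an assumption, not stated in the diff):

```shell
# Assumed tag, mirroring the main image's main-latest convention
docker pull ghcr.io/berriai/litellm-database:main-latest
```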