(docs) use litellm ghcr as base image
parent 8d3897592e
commit 6e1708f595
2 changed files with 55 additions and 0 deletions
deploy/Dockerfile.ghcr_base (Normal file, 17 lines added)

@@ -0,0 +1,17 @@
# Use the provided base image
FROM ghcr.io/berriai/litellm:main-latest

# Set the working directory to /app
WORKDIR /app

# Copy the configuration file into the container at /app
COPY config.yaml .

# Make sure your entrypoint.sh is executable
RUN chmod +x entrypoint.sh

# Expose the necessary port
EXPOSE 4000/tcp

# Override the CMD instruction with your desired command and arguments
CMD ["--port", "4000", "--config", "config.yaml", "--detailed_debug", "--run_gunicorn"]
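For reference, a minimal build sketch for this file, run from the repository root; the image tag is just an example name, and the `COPY` instruction above requires a `config.yaml` in the build context:

```shell
# Hypothetical build command; "litellm-ghcr-base" is an illustrative tag.
# A config.yaml must exist in the build context because of the COPY above.
docker build -f deploy/Dockerfile.ghcr_base -t litellm-ghcr-base .
```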

@@ -7,6 +7,10 @@ You can find the Dockerfile to build litellm proxy [here](https://github.com/Ber
## Quick Start

<Tabs>

<TabItem value="basic" label="Basic">

See the latest available ghcr docker image here:
https://github.com/berriai/litellm/pkgs/container/litellm

@@ -18,6 +22,12 @@ docker pull ghcr.io/berriai/litellm:main-latest
docker run ghcr.io/berriai/litellm:main-latest
```
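A minimal sketch (not part of this diff) of reaching the proxy from the host by publishing a port and mounting a local config; the `--port` and `--config` flags come from the CLI examples in this doc, while the host paths are illustrative:

```shell
# Publish port 4000 and mount a local config.yaml into /app (the working
# directory used by the base-image example below); paths are illustrative.
docker run -p 4000:4000 \
  -v $(pwd)/config.yaml:/app/config.yaml \
  ghcr.io/berriai/litellm:main-latest --port 4000 --config /app/config.yaml
```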

</TabItem>

<TabItem value="cli" label="With CLI Args">

### Run with LiteLLM CLI args

See all supported CLI args [here](https://docs.litellm.ai/docs/proxy/cli):

@@ -32,6 +42,34 @@ Here's how you can run the docker image and start litellm on port 8002 with `num
docker run ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
```
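To reach port 8002 from the host you would also publish it; the same command with only the standard Docker `-p` flag added:

```shell
# Same command as above, with the container port published to the host.
docker run -p 8002:8002 ghcr.io/berriai/litellm:main-latest --port 8002 --num_workers 8
```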

</TabItem>

<TabItem value="base-image" label="use litellm as a base image">

```shell
# Use the provided base image
FROM ghcr.io/berriai/litellm:main-latest

# Set the working directory to /app
WORKDIR /app

# Copy the configuration file into the container at /app
COPY config.yaml .

# Make sure your entrypoint.sh is executable
RUN chmod +x entrypoint.sh

# Expose the necessary port
EXPOSE 4000/tcp

# Override the CMD instruction with your desired command and arguments
CMD ["--port", "4000", "--config", "config.yaml", "--detailed_debug", "--run_gunicorn"]
```
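If you save the snippet above as a `Dockerfile` next to your `config.yaml`, a build-and-run sketch could look like this (the image tag is just an example name):

```shell
# Build a custom image from the Dockerfile above, then run it,
# publishing the port used by EXPOSE/CMD (4000).
docker build -t my-litellm-proxy .
docker run -p 4000:4000 my-litellm-proxy
```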

</TabItem>

</Tabs>

## Deploy with Database

We maintain a [separate Dockerfile](https://github.com/BerriAI/litellm/pkgs/container/litellm-database) for reducing build time when running LiteLLM proxy with a connected Postgres database.
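Based on the package URL above, the image presumably lives at `ghcr.io/berriai/litellm-database`; a hedged pull-and-run sketch, assuming it follows the same `main-latest` tag as the main image and reads its connection string from a `DATABASE_URL` environment variable (both assumptions, not stated in this diff):

```shell
# Assumed image name and tag, inferred from the package URL above.
docker pull ghcr.io/berriai/litellm-database:main-latest

# Assumed DATABASE_URL environment variable with a placeholder connection string.
docker run -p 4000:4000 \
  -e DATABASE_URL="postgresql://user:password@host:5432/dbname" \
  ghcr.io/berriai/litellm-database:main-latest
```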