forked from phoenix/litellm-mirror
docs(openai-proxy-readme): add docker package to readme
parent 82d6b1c55f
commit c99fa955f4
2 changed files with 6 additions and 1 deletion
@@ -13,6 +13,11 @@ A simple, fast, and lightweight **OpenAI-compatible server** to call 100+ LLM APIs
 
 ## Usage
 
+```shell
+docker run -e PORT=8000 -p 8000:8000 ghcr.io/berriai/litellm:latest
+```
+
+### Running Locally
 ```shell
 $ git clone https://github.com/BerriAI/litellm.git
 ```
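For context, the Docker command added above starts the proxy on port 8000. A quick smoke test against it might look like the sketch below; the `/chat/completions` route and the example model name are assumptions based on the README's "OpenAI-compatible server" description, not part of this diff.

```shell
# Hypothetical smoke test: send one chat request to the locally running proxy.
# The route and payload follow the OpenAI chat-completions format; adjust the
# model name to whichever provider/model the proxy is configured for.
curl http://localhost:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello from the proxy"}]
      }'
```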
@@ -3,4 +3,4 @@ fastapi
 uvicorn
 boto3
 litellm
-dotenv
+python-dotenv
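The second hunk is evidently the proxy's requirements file: the dependency `dotenv` is replaced with `python-dotenv`, the PyPI distribution that provides the importable `dotenv` module used for loading `.env` files. A minimal way to verify that assumption:

```shell
# python-dotenv installs the importable `dotenv` module; this confirms the
# import resolves after installing the renamed dependency.
pip install python-dotenv
python -c "from dotenv import load_dotenv; load_dotenv(); print('dotenv OK')"
```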