Mirror of https://github.com/BerriAI/litellm.git, synced 2025-04-24 18:24:20 +00:00
Commit 5d0f9fd749 — "new v litellm for render"
Parent: 25bd80a5aa
4 changed files with 16 additions and 10 deletions
BIN .DS_Store (vendored)
Binary file not shown.
@@ -73,14 +73,7 @@ workflows:
   version: 2
   build_and_test:
     jobs:
-      - local_testing:
-          filters:
-            paths:
-              ignore:
-                - "README.md"
-                - "docs"
-                - "cookbook"
       - local_testing
       - publish_to_pypi:
           requires:
             - local_testing
@@ -1,17 +1,26 @@
<<<<<<< HEAD
# Proxy Server for Azure, Llama2, OpenAI, Claude, Hugging Face, Replicate Models
[](https://pypi.org/project/litellm/)
[](https://pypi.org/project/litellm/0.1.1/)

[](https://github.com/BerriAI/litellm)
=======
# Proxy Server for Chat API
>>>>>>> d1ff082 (new v litellm for render)

[](https://discord.gg/wuPM9dRgDw)
This repository contains a proxy server that interacts with OpenAI's Chat API and other similar APIs to facilitate chat-based language models. The server allows you to easily integrate chat completion capabilities into your applications. The server is built using Python and the Flask framework.

<<<<<<< HEAD
# Proxy Server for Chat API

This repository contains a proxy server that interacts with OpenAI's Chat API and other similar APIs to facilitate chat-based language models. The server allows you to easily integrate chat completion capabilities into your applications. The server is built using Python and the Flask framework.

## Installation

=======
## Installation

>>>>>>> d1ff082 (new v litellm for render)
To set up and run the proxy server locally, follow these steps:

1. Clone this repository to your local machine:
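The README text in this hunk describes a proxy that forwards OpenAI-style chat completion requests. As an illustrative sketch only (not part of the commit — `build_chat_request` is a hypothetical helper), a request body in that format can be assembled like this:

```python
import json

def build_chat_request(model: str, user_message: str) -> dict:
    # Minimal OpenAI-style chat completion payload; a proxy like the one
    # described above forwards JSON bodies of this shape upstream.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("chat-bison", "Hello!")
print(json.dumps(payload))
```

The `messages` list carries the conversation; additional turns would be appended as further `{"role": ..., "content": ...}` entries.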
@@ -90,4 +99,8 @@ google/palm-2-chat-bison
 Vertex Models:
 chat-bison
 chat-bison@001
+<<<<<<< HEAD
 Refer to the model endpoint compatibility table for more details.
+=======
+Refer to the model endpoint compatibility table for more details.
+>>>>>>> d1ff082 (new v litellm for render)
@@ -1,4 +1,4 @@
 flask
 flask_cors
 waitress
-litellm
+litellm==0.1.381
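The hunk above replaces an unpinned `litellm` entry with an exact `==` pin. A small sketch (the `parse_requirement` helper is hypothetical, not from the repo) of how such a line splits into package name and pinned version:

```python
def parse_requirement(line: str):
    # Split a requirements.txt line on "==", the exact-pin operator.
    # Returns (name, version); version is None when the line is unpinned.
    name, _, ver = line.strip().partition("==")
    return name, (ver or None)

print(parse_requirement("litellm==0.1.381"))  # ('litellm', '0.1.381')
print(parse_requirement("flask"))             # ('flask', None)
```

Pinning an exact version like this makes deployments (e.g. on Render, per the commit message) reproducible, at the cost of not picking up fixes automatically.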