forked from phoenix/litellm-mirror
new v litellm for render
parent 25bd80a5aa · commit 5d0f9fd749

4 changed files with 16 additions and 10 deletions
.DS_Store (BIN, vendored)
Binary file not shown.
.circleci/config.yml

@@ -73,14 +73,7 @@ workflows:
   version: 2
   build_and_test:
     jobs:
-      - local_testing:
-          filters:
-            paths:
-              ignore:
-                - "README.md"
-                - "docs"
-                - "cookbook"
-
+      - local_testing
       - publish_to_pypi:
           requires:
             - local_testing
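The effect of this hunk is that the local_testing job loses its path filters, so CI now runs on every commit instead of skipping documentation-only changes. After the change, the workflows block presumably reads as follows — a reconstruction from the diff alone, not the full config file:

```yaml
# Sketch reconstructed from the hunk above; surrounding keys are not shown.
workflows:
  version: 2
  build_and_test:
    jobs:
      # local_testing now runs unconditionally; the path-ignore filters
      # ("README.md", "docs", "cookbook") were removed in this commit.
      - local_testing
      - publish_to_pypi:
          requires:
            - local_testing
```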
README.md

@@ -1,17 +1,26 @@
+<<<<<<< HEAD
 # Proxy Server for Azure, Llama2, OpenAI, Claude, Hugging Face, Replicate Models
 [](https://pypi.org/project/litellm/)
 [](https://pypi.org/project/litellm/0.1.1/)

 [](https://github.com/BerriAI/litellm)
+=======
+# Proxy Server for Chat API
+>>>>>>> d1ff082 (new v litellm for render)

-[](https://discord.gg/wuPM9dRgDw)
+This repository contains a proxy server that interacts with OpenAI's Chat API and other similar APIs to facilitate chat-based language models. The server allows you to easily integrate chat completion capabilities into your applications. The server is built using Python and the Flask framework.

+<<<<<<< HEAD
 # Proxy Server for Chat API

 This repository contains a proxy server that interacts with OpenAI's Chat API and other similar APIs to facilitate chat-based language models. The server allows you to easily integrate chat completion capabilities into your applications. The server is built using Python and the Flask framework.

 ## Installation

+=======
+## Installation
+
+>>>>>>> d1ff082 (new v litellm for render)
 To set up and run the proxy server locally, follow these steps:

 1. Clone this repository to your local machine:
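The README paragraph above describes the server only in outline. As a concrete illustration, a minimal Flask proxy along those lines could look like the sketch below; the route name and defaults are assumptions, not the repository's actual code, and it presumes litellm's completion() call:

```python
# Minimal sketch of a Flask chat proxy -- illustrative only, not the repo's code.
# Assumes litellm.completion() returns an OpenAI-style response dict.
from flask import Flask, jsonify, request
import litellm

app = Flask(__name__)

@app.route("/chat/completions", methods=["POST"])  # hypothetical route name
def chat_completions():
    payload = request.get_json(force=True)
    # litellm routes the call to the matching provider based on the model name
    # (OpenAI, Azure, Claude, Hugging Face, Replicate, Vertex, ...).
    response = litellm.completion(
        model=payload.get("model", "gpt-3.5-turbo"),
        messages=payload["messages"],
    )
    return jsonify(response)

if __name__ == "__main__":
    # waitress (also in requirements.txt) would serve this in production;
    # Flask's built-in server is fine for a local sketch.
    app.run(host="0.0.0.0", port=5000)
```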
@@ -90,4 +99,8 @@ google/palm-2-chat-bison
 Vertex Models:
 chat-bison
 chat-bison@001
+<<<<<<< HEAD
 Refer to the model endpoint compatibility table for more details.
+=======
+Refer to the model endpoint compatibility table for more details.
+>>>>>>> d1ff082 (new v litellm for render)
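For the Vertex models listed in this hunk, a call through litellm would presumably look like the following — a sketch; actual availability depends on your Vertex AI credentials and the litellm version:

```python
# Hypothetical call to one of the listed Vertex chat models via litellm.
import litellm

response = litellm.completion(
    model="chat-bison",  # or "chat-bison@001", per the list above
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response)
```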
requirements.txt

@@ -1,4 +1,4 @@
 flask
 flask_cors
 waitress
-litellm
+litellm==0.1.381
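The only change here pins litellm to an exact release instead of floating to the latest version, which keeps the Render deployment reproducible. A quick post-install check, using only the standard library (an illustrative snippet, not part of the repository):

```python
# Confirm the pinned dependency resolved as expected.
from importlib.metadata import version

assert version("litellm") == "0.1.381", f"unexpected litellm version: {version('litellm')}"
print("litellm", version("litellm"))
```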