diff --git a/openai-proxy/README.md b/openai-proxy/README.md
index 84270fced..b243ada0a 100644
--- a/openai-proxy/README.md
+++ b/openai-proxy/README.md
@@ -2,10 +2,6 @@
 A simple, fast, and lightweight **OpenAI-compatible server** to call 100+ LLM APIs.
 
-## Endpoints:
-- `/chat/completions` - chat completions endpoint to call 100+ LLMs
-- `/models` - available models on server
-
@@ -28,6 +24,10 @@
 $ cd ./litellm/openai-proxy
 $ uvicorn main:app --host 0.0.0.0 --port 8000
 ```
+## Endpoints:
+- `/chat/completions` - chat completions endpoint to call 100+ LLMs
+- `/models` - available models on server
+
 ## replace openai base
 ```python
 import openai
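For context on the "replace openai base" section this diff touches, here is a minimal sketch of the usage pattern being documented: pointing the pre-1.0 `openai` Python SDK at the proxy started above (port 8000, per the `uvicorn` command in the diff). The model name and placeholder API key are illustrative assumptions, not taken from the README.

```python
import openai

# Point the SDK at the local proxy instead of api.openai.com
# (assumes the server was started with: uvicorn main:app --host 0.0.0.0 --port 8000)
openai.api_base = "http://0.0.0.0:8000"
openai.api_key = "not-needed"  # placeholder; the proxy handles provider API keys

# Calls the proxy's /chat/completions endpoint; the model name below is an
# assumed example and is routed to whichever of the 100+ supported LLMs it maps to.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the proxy!"}],
)
print(response["choices"][0]["message"]["content"])
```

The `/models` endpoint can be queried the same way (e.g. `GET http://0.0.0.0:8000/models`) to list the models the server exposes.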