diff --git a/docs/my-website/docs/proxy_api.md b/docs/my-website/docs/proxy_api.md
index 4535c8f3db..3f9fdbf631 100644
--- a/docs/my-website/docs/proxy_api.md
+++ b/docs/my-website/docs/proxy_api.md
@@ -55,19 +55,21 @@ print(response)
 
 **Note**: You will need to clone and modify the Github repo, until [this PR is merged.](https://github.com/KillianLucas/open-interpreter/pull/288)
 
-In `interpreter.py` set,
-```python
-os.environ["OPENAI_API_KEY"] = "openinterpreter-key"
-litellm.api_base = "https://proxy.litellm.ai"
 ```
+git clone https://github.com/krrishdholakia/open-interpreter-litellm-fork
+```
+To run it do:
+```
+poetry build
 
-and change the model on [this line](https://github.com/KillianLucas/open-interpreter/blob/f803d0d7a545edabd541943145a2a60beaf604e4/interpreter/interpreter.py#L342C10-L342C10), to:
-```python
-# call gpt-4
-self.model = "openai/gpt-4" # 👈 always add 'openai/' in front of the model name
+# call gpt-4 - always add 'litellm_proxy/' in front of the model name
+poetry run interpreter --model litellm_proxy/gpt-4
 
-# call togethercomputer/llama-2-70b-chat
-self.model = "openai/togethercomputer/llama-2-70b-chat" # 👈 always add 'openai/' in front of the model name
+# call llama-70b - always add 'litellm_proxy/' in front of the model name
+poetry run interpreter --model litellm_proxy/togethercomputer/llama-2-70b-chat
+
+# call claude-2 - always add 'litellm_proxy/' in front of the model name
+poetry run interpreter --model litellm_proxy/claude-2
 ```
 And that's it!
 
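
For readers skimming this diff: the removed lines routed Open Interpreter through the hosted proxy by exporting `OPENAI_API_KEY` and pointing `litellm.api_base` at `https://proxy.litellm.ai`, with an `openai/` prefix on the model name. The sketch below shows that same routing as a standalone call, purely for illustration; the placeholder key, proxy URL, and `openai/gpt-4` model string are taken from the removed lines and are assumptions about the proxy setup, not part of this PR.

```python
# Minimal sketch of the routing the removed docs lines configured inside interpreter.py.
# Placeholder key and proxy base URL come from the old docs; swap in real values as needed.
import os

import litellm

os.environ["OPENAI_API_KEY"] = "openinterpreter-key"  # placeholder key from the old docs

response = litellm.completion(
    model="openai/gpt-4",  # 'openai/' prefix: treat the endpoint as OpenAI-compatible
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
    api_base="https://proxy.litellm.ai",  # proxy base URL from the removed docs lines
)
print(response)
```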