From 43ecfc868cead6eb28df2fe789663299afd49b83 Mon Sep 17 00:00:00 2001
From: ishaan-jaff
Date: Sat, 9 Sep 2023 14:16:26 -0700
Subject: [PATCH] update ollama docs

---
 docs/my-website/docs/providers/ollama.md | 31 +++++++++++++++++++++++
 1 file changed, 31 insertions(+)

diff --git a/docs/my-website/docs/providers/ollama.md b/docs/my-website/docs/providers/ollama.md
index 8c1fd5ebd..f9d27c8ed 100644
--- a/docs/my-website/docs/providers/ollama.md
+++ b/docs/my-website/docs/providers/ollama.md
@@ -1,6 +1,37 @@
 # Ollama
 LiteLLM supports all models from [Ollama](https://github.com/jmorganca/ollama)
 
+## Pre-requisites
+Ensure your Ollama server is running
+
+## Example usage
+```python
+from litellm import completion
+
+response = completion(
+    model="llama2",
+    messages=[{"content": "respond in 20 words. who are you?", "role": "user"}],
+    api_base="http://localhost:11434",
+    custom_llm_provider="ollama"
+)
+print(response)
+```
+
+## Example usage - Streaming
+```python
+from litellm import completion
+
+response = completion(
+    model="llama2",
+    messages=[{"content": "respond in 20 words. who are you?", "role": "user"}],
+    api_base="http://localhost:11434",
+    custom_llm_provider="ollama",
+    stream=True
+)
+for chunk in response:
+    print(chunk['choices'][0]['delta'])
+```
+
 ### Ollama Models
 Ollama supported models: https://github.com/jmorganca/ollama
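
The streaming loop the patch adds reads each partial message from `chunk['choices'][0]['delta']`; those deltas can be joined into the full reply text. A minimal sketch, using a stubbed chunk stream with the same dict shape in place of a live `completion(..., stream=True)` call against an Ollama server (the stub's text is illustrative, not real model output):

```python
# Sketch: assembling streamed delta chunks into a full reply.
# fake_stream() stands in for completion(..., stream=True); each chunk
# mirrors the shape used in the docs: chunk['choices'][0]['delta'].

def fake_stream():
    # Illustrative placeholder text, not real Ollama output.
    for piece in ["I am ", "an AI ", "assistant."]:
        yield {"choices": [{"delta": {"content": piece}}]}

def collect(stream):
    """Concatenate the 'content' field of every delta chunk."""
    parts = []
    for chunk in stream:
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

print(collect(fake_stream()))  # I am an AI assistant.
```

The same `collect` loop works unchanged on a real streaming response, since it only touches the chunk dictionary shape shown in the patch.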