mirror of https://github.com/BerriAI/litellm.git
synced 2025-04-26 03:04:13 +00:00

docs(proxy_server.md): add docker image details to docs

This commit is contained in:
parent 24ea5e00b7
commit ca7e2f6a05

2 changed files with 21 additions and 9 deletions
|
@@ -240,7 +240,7 @@ task = Task(agent, name="my-llm-task")
 task.run()
 ```
 
-Credits [@pchalasani](https://github.com/pchalasani) for this tutorial.
+Credits [@pchalasani](https://github.com/pchalasani) and [Langroid](https://github.com/langroid/langroid) for this tutorial.
 </TabItem>
 </Tabs>
 
@@ -323,6 +323,18 @@ This will return your logs from `~/.ollama/logs/server.log`.
 ### Deploy Proxy
 
 <Tabs>
+<TabItem value="docker" label="Ollama/OpenAI Docker">
+Use this to deploy local models with Ollama that's OpenAI-compatible.
+
+It works for models like Mistral, Llama2, CodeLlama, etc. (any model supported by [Ollama](https://ollama.ai/library))
+
+**usage**
+```shell
+docker run --name ollama litellm/ollama
+```
+
+More details 👉 https://hub.docker.com/r/litellm/ollama
+</TabItem>
 <TabItem value="self-hosted" label="Self-Hosted">
 
 **Step 1: Clone the repo**
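The `docker run` command in this hunk starts the proxy container; a quick way to smoke-test it afterwards is an OpenAI-style chat completion request. A minimal sketch, assuming the container's API is reachable at `http://localhost:8000` and serves a `llama2` model (the port and model name are assumptions, not taken from this commit):

```python
import json
import urllib.request


def build_payload(model, prompt):
    # OpenAI-compatible chat completion body, as expected by an
    # OpenAI-compatible proxy endpoint
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def query_proxy(base_url, model, prompt):
    # POST to the proxy's /chat/completions endpoint (network call;
    # run only once the `litellm/ollama` container is up)
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the container running, `query_proxy("http://localhost:8000", "llama2", "Hello!")` would return the proxy's JSON response; both the base URL and the exact response shape depend on the deployment.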
@@ -7,14 +7,14 @@ load_dotenv()
 from importlib import resources
 import shutil, random
 list_of_messages = [
-    "The thing I wish you improved is...:",
-    "A feature I really want is...:",
-    "The worst thing about this product is...:",
-    "This product would be better if...:",
-    "I don't like how this works...:",
-    "It would help me if you could add...:",
-    "This feature doesn't meet my needs because...:",
-    "I get frustrated when the product...:",
+    "'The thing I wish you improved is...'",
+    "'A feature I really want is...'",
+    "'The worst thing about this product is...'",
+    "'This product would be better if...'",
+    "'I don't like how this works...'",
+    "'It would help me if you could add...'",
+    "'This feature doesn't meet my needs because...'",
+    "'I get frustrated when the product...'",
 ]
 
 def generate_feedback_box():
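The body of `generate_feedback_box` is outside this hunk; for context, a minimal sketch of how the re-quoted prompts might be used. Only `list_of_messages` and the function name come from the diff — the box rendering below is a hypothetical illustration:

```python
import random

# Prompts as they appear after this commit (wrapped in single quotes)
list_of_messages = [
    "'The thing I wish you improved is...'",
    "'A feature I really want is...'",
    "'The worst thing about this product is...'",
    "'This product would be better if...'",
    "'I don't like how this works...'",
    "'It would help me if you could add...'",
    "'This feature doesn't meet my needs because...'",
    "'I get frustrated when the product...'",
]


def generate_feedback_box():
    # Pick one prompt at random and frame it in a simple ASCII box
    # (hypothetical rendering; the real implementation may differ)
    message = random.choice(list_of_messages)
    width = len(message) + 4
    print("#" * width)
    print("# " + message + " #")
    print("#" * width)
```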