Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-26 11:14:04 +00:00)
docs(proxy_server.md): add docker image details to docs
This commit is contained in:
parent b98b813c42
commit 3cc23048d5

2 changed files with 21 additions and 9 deletions
@@ -240,7 +240,7 @@ task = Task(agent, name="my-llm-task")
 task.run()
 ```

-Credits [@pchalasani](https://github.com/pchalasani) for this tutorial.
+Credits [@pchalasani](https://github.com/pchalasani) and [Langroid](https://github.com/langroid/langroid) for this tutorial.
 </TabItem>
 </Tabs>

@@ -323,6 +323,18 @@ This will return your logs from `~/.ollama/logs/server.log`.
### Deploy Proxy

<Tabs>
<TabItem value="docker" label="Ollama/OpenAI Docker">
Use this to deploy local models with Ollama behind an OpenAI-compatible endpoint.

It works for models like Mistral, Llama2, CodeLlama, etc. (any model supported by [Ollama](https://ollama.ai/library)).

**usage**
```shell
docker run --name ollama litellm/ollama
```

More details 👉 https://hub.docker.com/r/litellm/ollama
</TabItem>
<TabItem value="self-hosted" label="Self-Hosted">

**Step 1: Clone the repo**

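Once the Ollama container from the Docker tab above is running, a quick way to sanity-check the deployment is to send it a chat request. The sketch below is illustrative only: the published port (8000), the `/chat/completions` route, and the `ollama/llama2` model name are assumptions rather than details taken from the docs in this diff, so check the linked Docker Hub page for the image's actual settings. You may also need to publish the container's port yourself, e.g. `docker run -p 8000:8000 --name ollama litellm/ollama`.

```python
# Minimal smoke test for the deployed proxy (illustrative sketch).
# Assumptions, not taken from the docs above: the container serves an
# OpenAI-compatible route at http://localhost:8000/chat/completions and a
# model named "ollama/llama2" is available.
import requests

response = requests.post(
    "http://localhost:8000/chat/completions",  # assumed host, port, and route
    json={
        "model": "ollama/llama2",  # assumed model name
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```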
@@ -7,14 +7,14 @@ load_dotenv()
 from importlib import resources
 import shutil, random
 list_of_messages = [
-    "The thing I wish you improved is...:",
-    "A feature I really want is...:",
-    "The worst thing about this product is...:",
-    "This product would be better if...:",
-    "I don't like how this works...:",
-    "It would help me if you could add...:",
-    "This feature doesn't meet my needs because...:",
-    "I get frustrated when the product...:",
+    "'The thing I wish you improved is...'",
+    "'A feature I really want is...'",
+    "'The worst thing about this product is...'",
+    "'This product would be better if...'",
+    "'I don't like how this works...'",
+    "'It would help me if you could add...'",
+    "'This feature doesn't meet my needs because...'",
+    "'I get frustrated when the product...'",
 ]

 def generate_feedback_box():
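The second file in the diff imports `shutil` and `random` and references `generate_feedback_box()`, whose body lies outside the hunk. As a rough illustration only (not the repository's actual implementation), a feedback box built on those two imports might pick one prompt at random and size its frame to the terminal:

```python
import random
import shutil

# Subset of the prompts from the diff above; the real list is longer.
list_of_messages = [
    "'The thing I wish you improved is...'",
    "'A feature I really want is...'",
    "'The worst thing about this product is...'",
]

def generate_feedback_box():
    # Illustrative sketch: frame a random prompt to fit the current terminal width.
    width = min(shutil.get_terminal_size().columns - 4, 60)
    message = random.choice(list_of_messages)[:width]
    print("+" + "-" * (width + 2) + "+")
    print("| " + message.ljust(width) + " |")
    print("+" + "-" * (width + 2) + "+")

if __name__ == "__main__":
    generate_feedback_box()
```

Cropping the message with `[:width]` keeps the frame intact on narrow terminals.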