forked from phoenix/litellm-mirror
docs(proxy_server.md): add logs, save keys, model fallbacks, config file template to proxy server docs
This commit is contained in:
parent
342925814b
commit
fc757dc1b4
3 changed files with 194 additions and 51 deletions
@@ -3,6 +3,7 @@ import Image from '@theme/IdealImage';
# Customize Prompt Templates on OpenAI-Compatible server
**You will learn:** How to set a custom prompt template on our OpenAI-compatible server.
**How?** We will modify the prompt template for CodeLlama.
## Step 1: Start OpenAI Compatible server
Let's spin up a local OpenAI-compatible server to call a deployed `codellama/CodeLlama-34b-Instruct-hf` model using Hugging Face's [Text-Generation-Inference (TGI)](https://github.com/huggingface/text-generation-inference) format.
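As a minimal sketch, this is roughly what starting the proxy looks like, assuming the `litellm` package is installed and you have a TGI deployment of the model; the `--api_base` URL below is a placeholder for your own endpoint, not a real address:

```shell
# Install the LiteLLM proxy CLI
pip install litellm

# Start a local OpenAI-compatible server pointed at a deployed TGI endpoint.
# The api_base URL is a placeholder -- replace it with your TGI deployment's address.
litellm --model huggingface/codellama/CodeLlama-34b-Instruct-hf \
  --api_base https://my-tgi-endpoint.example.com
```

Once the server is running, any OpenAI-compatible client can be pointed at the local proxy's base URL instead of api.openai.com.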