forked from phoenix/litellm-mirror
* fix(pattern_match_deployments.py): default to user input if unable to map based on wildcards
* test: fix test
* test: reset test name
* test: update conftest to reload proxy server module between tests
* ci(config.yml): move langfuse out of local_testing to reduce ci/cd time
* ci(config.yml): clean up langfuse ci/cd tests
* fix: update test to not use global proxy_server app module
* ci: move caching to a separate test pipeline to speed up ci pipeline
* test: update conftest to check if proxy_server attr exists before reloading
* build(conftest.py): don't block on inability to reload proxy_server
* ci(config.yml): update caching unit test filter to work on 'cache' keyword as well
* fix(encrypt_decrypt_utils.py): use function to get salt key
* test: mark flaky test
* test: handle anthropic overloaded errors
* refactor: create separate ci/cd pipeline for proxy unit tests to make ci/cd faster
* ci(config.yml): add litellm_proxy_unit_testing to build_and_test jobs
* ci(config.yml): generate prisma binaries for proxy unit tests
* test: readd vertex_key.json
* ci(config.yml): remove `-s` from proxy_unit_test cmd to speed up test
* ci: remove any 'debug' logging flag to speed up ci pipeline
* test: fix test
* test(test_braintrust.py): rerun
* test: add delay for braintrust test
28 lines · No EOL · 886 B · YAML
litellm_settings:
  drop_params: True

# Model-specific settings
model_list: # use the same model_name for using the litellm router. LiteLLM will use the router between gpt-3.5-turbo
  - model_name: gpt-3.5-turbo # litellm will
    litellm_params:
      model: gpt-3.5-turbo
      api_key: sk-uj6F
      tpm: 20000 # [OPTIONAL] REPLACE with your openai tpm
      rpm: 3 # [OPTIONAL] REPLACE with your openai rpm
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: sk-Imn
      tpm: 20000 # [OPTIONAL] REPLACE with your openai tpm
      rpm: 3 # [OPTIONAL] REPLACE with your openai rpm
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openrouter/gpt-3.5-turbo
  - model_name: mistral-7b-instruct
    litellm_params:
      model: mistralai/mistral-7b-instruct

environment_variables:
  REDIS_HOST: localhost
  REDIS_PASSWORD:
  REDIS_PORT:
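The three `gpt-3.5-turbo` entries above share one `model_name`, so the proxy's router spreads requests across those deployments. As a rough illustration of that idea only (this is a minimal sketch, not litellm's actual routing implementation; the class and data below are hypothetical), a simple round-robin selector over such a `model_list` could look like:

```python
import itertools
from collections import defaultdict

# Deployments mirroring the model_list above (keys abbreviated).
model_list = [
    {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}},
    {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}},
    {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "openrouter/gpt-3.5-turbo"}},
    {"model_name": "mistral-7b-instruct", "litellm_params": {"model": "mistralai/mistral-7b-instruct"}},
]

class SimpleRouter:
    """Round-robin over deployments that share a model_name (illustrative only)."""

    def __init__(self, model_list):
        groups = defaultdict(list)
        for entry in model_list:
            groups[entry["model_name"]].append(entry)
        # One infinite cycle per model_name group.
        self._cycles = {name: itertools.cycle(ds) for name, ds in groups.items()}

    def get_deployment(self, model_name):
        # Return the next deployment in round-robin order for this model_name.
        return next(self._cycles[model_name])

router = SimpleRouter(model_list)
picks = [router.get_deployment("gpt-3.5-turbo")["litellm_params"]["model"] for _ in range(3)]
print(picks)  # cycles through the three gpt-3.5-turbo deployments
```

In the real proxy, the router additionally honors the per-deployment `tpm`/`rpm` limits set in `litellm_params`; this sketch ignores them and only shows the grouping-by-`model_name` idea.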