| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `_experimental` | ui new build | 2025-03-14 21:46:40 -07:00 |
| `analytics_endpoints` | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| `anthropic_endpoints` | get_custom_headers | 2025-03-12 18:57:41 -07:00 |
| `auth` | feat(proxy_server.py): support retrieving models for a team, if user is a member - via /models?team_id | 2025-03-14 19:34:06 -07:00 |
| `batches_endpoints` | use correct get custom headers | 2025-03-12 17:16:51 -07:00 |
| `common_utils` | feat(internal_user_endpoints.py): emit audit log on /user/new event | 2025-03-13 16:47:58 -07:00 |
| `config_management_endpoints` | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| `credential_endpoints` | fix: fix linting error | 2025-03-14 14:17:28 -07:00 |
| `db` | (Feat) - Show Error Logs on LiteLLM UI (#8904) | 2025-02-28 20:10:09 -08:00 |
| `example_config_yaml` | fix async_moderation_hook | 2025-03-12 18:45:54 -07:00 |
| `fine_tuning_endpoints` | use correct get custom headers | 2025-03-12 17:16:51 -07:00 |
| `guardrails` | fix: fix linting errors | 2025-03-13 19:40:18 -07:00 |
| `health_endpoints` | fix code quality | 2025-03-14 21:06:28 -07:00 |
| `hooks` | fix ProxyUpdateSpend | 2025-03-17 22:17:56 -07:00 |
| `management_endpoints` | fix(team_endpoints.py): fix linting error | 2025-03-14 20:51:21 -07:00 |
| `management_helpers` | feat(model_management_endpoints.py): emit audit logs on model delete | 2025-03-13 18:48:38 -07:00 |
| `openai_files_endpoints` | use correct get custom headers | 2025-03-12 17:16:51 -07:00 |
| `pass_through_endpoints` | use correct get custom headers | 2025-03-12 17:16:51 -07:00 |
| `rerank_endpoints` | use correct get custom headers | 2025-03-12 17:16:51 -07:00 |
| `response_api_endpoints` | Add stubbed routes to pass initial auth tests | 2025-03-13 16:43:25 -07:00 |
| `spend_tracking` | Fix batches api cost tracking + Log batch models in spend logs / standard logging payload (#9077) | 2025-03-08 11:47:25 -08:00 |
| `types_utils` | build: merge commit 1b15568af7 | 2025-02-17 21:56:00 -08:00 |
| `ui_crud_endpoints` | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| `vertex_ai_endpoints` | vertex ai anthropic thinking param support (#8853) | 2025-02-26 21:37:18 -08:00 |
| `.gitignore` | fix(gitmodules): remapping to new proxy | 2023-10-12 21:23:53 -07:00 |
| `__init__.py` | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| `_logging.py` | fix(_logging.py): fix timestamp format for json logs | 2024-06-20 15:20:21 -07:00 |
| `_new_new_secret_config.yaml` | Ollama ssl verify = False + Spend Logs reliability fixes (#7931) | 2025-01-23 23:05:41 -08:00 |
| `_new_secret_config.yaml` | fix(redis_cache.py): add 5s default timeout | 2025-03-17 14:27:36 -07:00 |
| `_super_secret_config.yaml` | docs(enterprise.md): cleanup docs | 2024-07-15 14:52:08 -07:00 |
| `_types.py` | Merge pull request #9220 from BerriAI/litellm_qa_responses_api | 2025-03-13 21:36:59 -07:00 |
| `cached_logo.jpg` | Litellm dev 01 23 2025 p2 (#7962) | 2025-01-23 21:02:15 -08:00 |
| `caching_routes.py` | (Bug fix) - Cache Health not working when configured with prometheus service logger (#8687) | 2025-02-20 13:41:56 -08:00 |
| `common_request_processing.py` | responses_api | 2025-03-12 20:38:05 -07:00 |
| `custom_sso.py` | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| `custom_validate.py` | build: merge commit 1b15568af7 | 2025-02-17 21:56:00 -08:00 |
| `enterprise` | feat(llama_guard.py): add llama guard support for content moderation + new async_moderation_hook endpoint | 2024-02-16 18:45:25 -08:00 |
| `health_check.py` | fix endpoint_data | 2025-03-14 17:21:01 -07:00 |
| `lambda.py` | Add mangum. | 2023-11-23 00:04:47 -05:00 |
| `litellm_pre_call_utils.py` | LITELLM_METADATA_ROUTES | 2025-03-12 18:20:07 -07:00 |
| `llamaguard_prompt.txt` | feat(llama_guard.py): allow user to define custom unsafe content categories | 2024-02-17 17:42:47 -08:00 |
| `logo.jpg` | (feat) admin ui custom branding | 2024-02-21 17:34:42 -08:00 |
| `model_config.yaml` | Revert "Revert "(feat) Allow using include to include external YAML files in a config.yaml (#6922)"" | 2024-11-27 16:08:59 -08:00 |
| `openapi.json` | (feat) add swagger.json for litellm proxy | 2023-10-13 20:41:04 -07:00 |
| `post_call_rules.py` | (docs) add example post call rules to proxy | 2024-01-15 20:58:50 -08:00 |
| `prisma_migration.py` | LiteLLM Contributor PRs (02/18/2025). (#8643) | 2025-02-19 21:52:46 -08:00 |
| `proxy_cli.py` | (Bug fix) - running litellm proxy on wndows (#8735) | 2025-02-25 15:19:19 -08:00 |
| `proxy_config.yaml` | working responses_api | 2025-03-12 17:26:12 -07:00 |
| `proxy_server.py` | expose flag to disable_spend_updates | 2025-03-17 20:45:49 -07:00 |
| `README.md` | [Feat-Proxy] Allow using custom sso handler (#5809) | 2024-09-20 19:14:33 -07:00 |
| `route_llm_request.py` | undo changes to route llm request | 2025-03-14 21:05:51 -07:00 |
| `schema.prisma` | feat(endpoints.py): support writing credentials to db | 2025-03-10 18:27:43 -07:00 |
| `start.sh` | fix(factory.py): fixing llama-2 non-chat models prompt templating | 2023-11-07 21:33:54 -08:00 |
| `utils.py` | expose flag to disable_spend_updates | 2025-03-17 20:45:49 -07:00 |