| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `adapters` | feat(proxy_server.py): working /v1/messages endpoint | 2024-07-10 18:15:38 -07:00 |
| `assistants` | add async assistants delete support | 2024-07-10 11:14:40 -07:00 |
| `batches` | fix(batches/main.py): fix linting error | 2024-07-19 18:26:13 -07:00 |
| `deprecated_litellm_server` | | |
| `files` | fix(files/main.py): fix linting error | 2024-07-19 15:50:25 -07:00 |
| `integrations` | langsmith - support logging tags | 2024-07-24 07:08:40 -07:00 |
| `litellm_core_utils` | Merge branch 'main' into litellm_braintrust_integration | 2024-07-22 22:40:39 -07:00 |
| `llms` | Merge branch 'main' into bedrock-llama3.1-405b | 2024-07-25 19:29:10 -07:00 |
| `proxy` | Merge branch 'main' into litellm_redis_team_object | 2024-07-25 19:31:52 -07:00 |
| `router_strategy` | control using enable_tag_filtering | 2024-07-18 22:40:51 -07:00 |
| `router_utils` | Revert "[Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI " | 2024-07-20 19:04:22 -07:00 |
| `tests` | Merge branch 'main' into litellm_redis_team_object | 2024-07-25 19:31:52 -07:00 |
| `types` | feat(custom_llm.py): initial working commit for writing your own custom LLM handler | 2024-07-25 15:33:05 -07:00 |
| `__init__.py` | feat(utils.py): support sync streaming for custom llm provider | 2024-07-25 16:47:32 -07:00 |
| `_logging.py` | fix(_logging.py): fix timestamp format for json logs | 2024-06-20 15:20:21 -07:00 |
| `_redis.py` | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| `_service_logger.py` | fix(_service_logging.py): only trigger otel if in service_callback | 2024-07-03 09:48:38 -07:00 |
| `_version.py` | | |
| `budget_manager.py` | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| `caching.py` | add doc string to explain what delete cache does | 2024-07-13 12:25:31 -07:00 |
| `cost.json` | | |
| `cost_calculator.py` | fix(litellm_logging.py): log response_cost=0 for failed calls | 2024-07-15 19:25:56 -07:00 |
| `exceptions.py` | feat use UnsupportedParamsError as litellm error type | 2024-07-24 12:19:10 -07:00 |
| `main.py` | fix(custom_llm.py): pass input params to custom llm | 2024-07-25 19:03:52 -07:00 |
| `model_prices_and_context_window_backup.json` | Merge branch 'main' into bedrock-llama3.1-405b | 2024-07-25 19:29:10 -07:00 |
| `py.typed` | feature - Types for mypy - #360 | 2024-05-30 14:14:41 -04:00 |
| `requirements.txt` | | |
| `router.py` | fix(router.py): add support for diskcache to router | 2024-07-25 14:30:46 -07:00 |
| `scheduler.py` | feat(scheduler.py): support redis caching for req. prioritization | 2024-06-06 14:19:21 -07:00 |
| `timeout.py` | | |
| `utils.py` | Merge branch 'main' into bedrock-llama3.1-405b | 2024-07-25 19:29:10 -07:00 |