| Name | Last commit message | Last commit date |
| --- | --- | --- |
| __pycache__ | style(proxy_cli.py): adding feedback box | 2023-10-10 13:49:54 -07:00 |
| llms | fix(init.py): expose complete client session | 2023-10-10 15:16:10 -07:00 |
| proxy | refactor(proxy_cli): adding additional tracing | 2023-10-10 18:12:31 -07:00 |
| __init__.py | fix(init.py): expose complete client session | 2023-10-10 15:16:10 -07:00 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | remove budget manager | 2023-09-30 11:42:56 -07:00 |
| caching.py | add hosted api.litellm.ai for caching | 2023-10-02 10:27:18 -07:00 |
| config.json | new config.json | 2023-09-01 14:16:12 -07:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| gpt_cache.py | fix caching | 2023-08-28 14:53:41 -07:00 |
| testing.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |