| File | Last commit | Date |
| --- | --- | --- |
| llms | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00 |
| proxy | fix(proxy/): remove cloned repo | 2023-10-12 21:46:18 -07:00 |
| __init__.py | fix(init.py): fix linting errors | 2023-10-12 21:31:53 -07:00 |
| _version.py | formatting improvements | 2023-08-28 09:20:50 -07:00 |
| budget_manager.py | remove budget manager | 2023-09-30 11:42:56 -07:00 |
| caching.py | add hosted api.litellm.ai for caching | 2023-10-02 10:27:18 -07:00 |
| config.json | new config.json | 2023-09-01 14:16:12 -07:00 |
| cost.json | store llm costs in budget manager | 2023-09-09 19:11:35 -07:00 |
| exceptions.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |
| gpt_cache.py | fix caching | 2023-08-28 14:53:41 -07:00 |
| main.py | (fix) Ollama use new streaming format | 2023-10-11 17:00:39 -07:00 |
| testing.py | add contributor message to code | 2023-09-25 10:00:10 -07:00 |