litellm-mirror/litellm/proxy/tests
Latest commit: 49e8cdbff9 by Krrish Dholakia (2024-03-26 08:08:15 -07:00)
fix(router.py): check for context window error when handling 400 status code errors; this was causing proxy context window fallbacks to not work as expected
Name                              Last commit                                                                           Date
llama_index_data                  (test) llama index VectorStoreIndex                                                   2024-02-09 16:49:03 -08:00
bursty_load_test_completion.py    refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
error_log.txt                     (test) load test embedding: proxy                                                     2023-11-24 17:14:44 -08:00
large_text.py                     fix(router.py): check for context window error when handling 400 status code errors  2024-03-26 08:08:15 -07:00
load_test_completion.py           (fix) add some better load testing                                                    2024-03-22 19:48:54 -07:00
load_test_embedding.py            refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
load_test_embedding_100.py        refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
load_test_embedding_proxy.py      refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
load_test_q.py                    refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
request_log.txt                   (test) load test embedding: proxy                                                     2023-11-24 17:14:44 -08:00
test_async.py                     refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30
test_langchain_request.py         (test) proxy - log metadata to langfuse                                               2024-01-01 11:54:16 +05:30
test_llamaindex.py                (test) llama index VectorStoreIndex                                                   2024-02-09 16:49:03 -08:00
test_openai_exception_request.py  (test) proxy - add openai exception mapping error                                     2024-01-15 09:56:20 -08:00
test_openai_js.js                 (test) large request                                                                  2024-02-12 21:49:47 -08:00
test_openai_request.py            (docs) also test gpt-4 vision enhancements                                            2024-01-17 18:46:41 -08:00
test_q.py                         refactor: add black formatting                                                        2023-12-25 14:11:20 +05:30