litellm-mirror/tests/openai_misc_endpoints_tests (latest commit: 2025-02-27 21:20:25 -08:00)
| File | Last commit | Date |
| --- | --- | --- |
| input.jsonl | (fix) LiteLLM Proxy fix GET /files/{file_id:path}/content" endpoint (#7342) | 2024-12-20 21:27:45 -08:00 |
| input_azure.jsonl | (feat) /batches Add support for using /batches endpoints in OAI format (#7402) | 2024-12-24 16:58:05 -08:00 |
| openai_batch_completions.jsonl | (fix) LiteLLM Proxy fix GET /files/{file_id:path}/content" endpoint (#7342) | 2024-12-20 21:27:45 -08:00 |
| openai_fine_tuning.jsonl | (feat) /batches Add support for using /batches endpoints in OAI format (#7402) | 2024-12-24 16:58:05 -08:00 |
| out.jsonl | (feat) /batches Add support for using /batches endpoints in OAI format (#7402) | 2024-12-24 16:58:05 -08:00 |
| out_azure.jsonl | (feat) /batches Add support for using /batches endpoints in OAI format (#7402) | 2024-12-24 16:58:05 -08:00 |
| test_openai_batches_endpoint.py | test_e2e_batches_files | 2024-12-28 19:54:04 -08:00 |
| test_openai_files_endpoints.py | (fix) LiteLLM Proxy fix GET /files/{file_id:path}/content" endpoint (#7342) | 2024-12-20 21:27:45 -08:00 |
| test_openai_fine_tuning.py | test_openai_fine_tuning | 2025-02-17 21:29:59 -08:00 |
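The `.jsonl` fixtures above feed the OpenAI-format `/batches` endpoints: each line of a batch input file is one JSON request object. A minimal sketch of building such a line, following OpenAI's documented Batch API input format (the `custom_id`, model name, and message content here are illustrative, not taken from the repo's fixture files):

```python
import json

# One request line for an OpenAI-format batch input file (e.g. input.jsonl).
# Each line is a standalone JSON object with a unique custom_id, an HTTP
# method, the target endpoint, and the request body.
request = {
    "custom_id": "request-1",                 # illustrative id
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-4o-mini",               # illustrative model name
        "messages": [{"role": "user", "content": "Hello!"}],
    },
}

# A full input file is just newline-delimited objects like this one.
line = json.dumps(request)
print(line)
```

The matching output files (`out.jsonl`, `out_azure.jsonl`) hold one response object per input line, keyed back by the same `custom_id`.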