Commit graph

14 commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Krrish Dholakia | c69193c321 | fix: move to using pydantic obj for setting values | 2024-07-11 13:18:36 -07:00 |
| Krrish Dholakia | 5f93cae3ff | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| Krrish Dholakia | b10f03706d | fix(utils.py): fix streaming to not return usage dict (Fixes https://github.com/BerriAI/litellm/issues/3237) | 2024-04-24 08:06:07 -07:00 |
| Krrish Dholakia | 79978c44ba | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| maqsoodshaik | 3c5b002b90 | this commit fixes #883 | 2023-11-23 12:45:38 +01:00 |
| ishaan-jaff | 7bc28f3b1c | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| Krrish Dholakia | e5929f2f7e | fix(azure.py-+-proxy_server.py): fix function calling response object + support router on proxy | 2023-11-15 13:15:16 -08:00 |
| Krrish Dholakia | 4b74ddcb17 | refactor: fixing linting issues | 2023-11-11 18:52:28 -08:00 |
| ishaan-jaff | 485a7ff136 | (feat) use usage class for model responses for cohere, hf, tg ai, cohere | 2023-10-27 09:58:47 -07:00 |
| Krrish Dholakia | e5279ef99d | style: fix linting errors | 2023-10-16 17:35:08 -07:00 |
| ishaan-jaff | 6e0aa579eb | raise vllm error | 2023-09-08 15:27:01 -07:00 |
| Krrish Dholakia | e6a65695eb | fix linting issues | 2023-09-06 20:43:59 -07:00 |
| Krrish Dholakia | 14fa57c185 | batch completions for vllm now works too | 2023-09-06 19:26:19 -07:00 |
| Krrish Dholakia | 7290a972e5 | adding support for vllm | 2023-09-06 18:07:44 -07:00 |