Commit graph

34 commits

Author SHA1 Message Date
Krrish Dholakia
3560f0ef2c refactor: move all testing to top-level of repo
Closes https://github.com/BerriAI/litellm/issues/486
2024-09-28 21:08:14 -07:00
Krish Dholakia
0295a22561 LiteLLM Minor Fixes and Improvements (09/10/2024) (#5618)
* fix(cost_calculator.py): move to debug for noisy warning message on cost calculation error

Fixes https://github.com/BerriAI/litellm/issues/5610

* fix(databricks/cost_calculator.py): Handles model name issues for databricks models

* fix(main.py): fix stream chunk builder for multiple tool calls

Fixes https://github.com/BerriAI/litellm/issues/5591

* fix: correctly set user_alias when passed in

Fixes https://github.com/BerriAI/litellm/issues/5612

* fix(types/utils.py): allow passing role for message object

https://github.com/BerriAI/litellm/issues/5621

* fix(litellm_logging.py): Fix langfuse logging across multiple projects

Fixes issue where langfuse logger was re-using the old logging object

* feat(proxy/_types.py): support adding key-based tags for tag-based routing

Enables tag-based routing at the key level

* fix(proxy/_types.py): fix inheritance

* test(test_key_generate_prisma.py): fix test

* test: fix test

* fix(litellm_logging.py): return used callback object
2024-09-11 11:30:29 -07:00
Krrish Dholakia
068ee12c30 fix(main.py): safely fail stream_chunk_builder calls 2024-08-10 10:22:26 -07:00
Joe Cheng
1fbfc09b44 Add unit test 2024-08-02 20:51:08 -07:00
Krrish Dholakia
27e9f96380 fix(main.py): fix stream_chunk_builder usage calc
Closes https://github.com/BerriAI/litellm/issues/4496
2024-07-06 14:52:59 -07:00
ishaan-jaff
f3d25d2c27 (test) hidden params in stream_chunk builder 2024-01-13 11:10:23 -08:00
Krrish Dholakia
61f2fe5837 fix(main.py): fix streaming completion token counting error 2024-01-10 23:44:35 +05:30
Krrish Dholakia
3080f27b54 fix(utils.py): raise correct error for azure content blocked error 2024-01-10 23:31:51 +05:30
ishaan-jaff
174248fc71 (test) add back test for counting stream completion tokens 2024-01-06 16:08:32 +05:30
Krrish Dholakia
04c04d62e3 test(test_stream_chunk_builder.py): remove completion assert, the test is for prompt tokens 2024-01-06 14:12:44 +05:30
ishaan-jaff
2bea0c742e (test) completion tokens counting + azure stream 2024-01-03 12:06:39 +05:30
ishaan-jaff
e6a7212d10 (fix) counting streaming prompt tokens - azure 2023-12-29 16:13:52 +05:30
ishaan-jaff
73f60b7315 (test) stream chunk builder - azure prompt tokens 2023-12-29 15:45:41 +05:30
Krrish Dholakia
4905929de3 refactor: add black formatting 2023-12-25 14:11:20 +05:30
Krrish Dholakia
728b879c33 fix(utils.py): fix azure streaming bug 2023-12-04 12:38:22 -08:00
Krrish Dholakia
01c7e18f31 fix(utils.py): include system fingerprint in streaming response object 2023-11-30 08:45:52 -08:00
Krrish Dholakia
e8331a4647 fix(utils.py): azure tool calling streaming 2023-11-27 19:07:38 -08:00
Krrish Dholakia
4cdd930fa2 fix(stream_chunk_builder): adding support for tool calling in completion counting 2023-11-27 18:39:47 -08:00
Krrish Dholakia
c75e90663c test(test_stream_chunk_builder.py): fix setting api key 2023-11-24 11:47:48 -08:00
Krrish Dholakia
4a5dae3941 fix(main.py): fix streaming_chunk_builder to return usage 2023-11-24 11:27:04 -08:00
Krrish Dholakia
c053782d96 refactor(openai.py): support aiohttp streaming 2023-11-09 16:15:30 -08:00
Krrish Dholakia
86ef2a02f7 fix(azure.py): adding support for aiohttp calls on azure + openai 2023-11-09 10:40:33 -08:00
Krrish Dholakia
9bfbdc18fb feat(utils.py): enable returning complete response when stream=true 2023-11-09 09:17:51 -08:00
ishaan-jaff
d492bca05e (test) test_stream_chunk_builder 2023-11-01 14:54:00 -07:00
ishaan-jaff
70885bdba6 (test) stream chunk builder 2023-11-01 08:38:19 -07:00
Krrish Dholakia
9cda24e1b2 fix(utils): adds complete streaming response to success handler 2023-10-07 15:42:00 -07:00
Krrish Dholakia
5a19ee1a71 fix get optional params 2023-10-02 12:02:53 -07:00
WilliamEspegren
404af1be0f if "function_call" find name 2023-09-17 19:02:50 +02:00
WilliamEspegren
5544a9251f rebuild chunks to openAI response
Rebuilds the chunks, but does not include the "usage"
2023-09-17 13:07:54 +02:00
WilliamEspegren
e07b3f5bfe append chunk to chunks 2023-09-17 10:29:09 +02:00
WilliamEspegren
fe37f97423 add empty chunks list 2023-09-17 10:28:37 +02:00
WilliamEspegren
a39a204457 import stream_chunk_builder 2023-09-17 10:28:01 +02:00
WilliamEspegren
71afdbc0e5 add max_tokens 2023-09-17 10:27:32 +02:00
WilliamEspegren
43a18b9528 add stream_chunk_builder function 2023-09-17 10:26:34 +02:00