Krrish Dholakia | bbfed59e9c | refactor: trigger new bump | 2024-01-25 16:06:01 -08:00
Krrish Dholakia | d88e190304 | fix(main.py): fix logging event loop for async logging but sync streaming | 2024-01-25 15:59:53 -08:00
Krrish Dholakia | dbc929fddb | fix(main.py): allow vertex ai project and location to be set in completion() call | 2024-01-25 15:00:51 -08:00
Krrish Dholakia | 806eef02dd | fix(main.py): fix order of assembly for streaming chunks | 2024-01-25 14:51:08 -08:00
Krrish Dholakia | 402235dc5d | fix(utils.py): fix sagemaker async logging for sync streaming (https://github.com/BerriAI/litellm/issues/1592) | 2024-01-25 12:49:45 -08:00
Krish Dholakia | f1d309d700 | Merge branch 'main' into litellm_global_spend_updates | 2024-01-24 20:20:15 -08:00
Krrish Dholakia | 327ceb33b7 | fix(ollama_chat.py): fix default token counting for ollama chat | 2024-01-24 20:09:17 -08:00
Krrish Dholakia | d536374be0 | fix(proxy_server.py): track cost for global proxy | 2024-01-24 16:06:10 -08:00
Krish Dholakia | 89e420b243 | Merge branch 'main' into litellm_reset_key_budget | 2024-01-23 18:10:32 -08:00
Krrish Dholakia | 503ce7020b | test(test_keys.py): use correct model name for token counting | 2024-01-23 17:46:14 -08:00
Krish Dholakia | 2ba8863f75 | Merge pull request #1574 from BerriAI/litellm_fix_streaming_spend_tracking: [WIP] fix(utils.py): fix proxy streaming spend tracking | 2024-01-23 17:07:40 -08:00
Krrish Dholakia | a5e53271d3 | fix(utils.py): fix double hashing issue on spend logs, streaming usage metadata logging issue for spend logs | 2024-01-23 16:14:01 -08:00
Krrish Dholakia | 344e232549 | fix(utils.py): fix proxy streaming spend tracking | 2024-01-23 15:59:03 -08:00
Krrish Dholakia | 88486a3123 | fix(utils.py): fix streaming cost tracking | 2024-01-23 14:39:45 -08:00
ishaan-jaff | 6e51c213e2 | (fix) same response_id across chunk | 2024-01-23 12:56:03 -08:00
ishaan-jaff | 28f6a69dbf | (fix) sagemaker streaming support | 2024-01-23 12:31:16 -08:00
Krrish Dholakia | e04a4a7439 | fix(utils.py): fix content policy violation check for streaming | 2024-01-23 06:55:04 -08:00
Ishaan Jaff | 97dd61a6cb | Merge pull request #1561 from BerriAI/litellm_sagemaker_streaming: [Feat] Add REAL Sagemaker streaming | 2024-01-22 22:10:20 -08:00
ishaan-jaff | 09dd1ed68b | v0 sagemaker_stream | 2024-01-22 21:53:16 -08:00
Krrish Dholakia | 29fe97b6a9 | fix(router.py): fix order of dereferenced dictionaries | 2024-01-22 21:42:25 -08:00
Krish Dholakia | 3eaae0e73c | Merge pull request #1557 from BerriAI/litellm_emit_spend_logs: feat(utils.py): emit response cost as part of logs | 2024-01-22 21:02:40 -08:00
Krrish Dholakia | 579dfc3013 | test: fix tests | 2024-01-22 20:20:17 -08:00
Krrish Dholakia | db2b7bfd4e | fix(openai.py): fix linting issue | 2024-01-22 18:20:15 -08:00
Krish Dholakia | 8647f2a665 | Merge pull request #1556 from BerriAI/litellm_importlib_issue: fix(utils.py): move from pkg_resources to importlib | 2024-01-22 15:56:07 -08:00
Krrish Dholakia | 2ea18785ca | feat(utils.py): emit response cost as part of logs | 2024-01-22 15:53:04 -08:00
Krrish Dholakia | 737a5a7b38 | fix(utils.py): fix debug log | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 70b0d0307c | refactor(utils.py): fix linting errors | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | e423aeff85 | fix: support streaming custom cost completion tracking | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 85b9ad7def | fix(main.py): support custom pricing for embedding calls | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 480c3d3991 | feat(utils.py): support custom cost tracking per second (https://github.com/BerriAI/litellm/issues/1374) | 2024-01-22 15:15:34 -08:00
Krrish Dholakia | 78308ddf91 | fix(utils.py): move from pkg_resources to importlib | 2024-01-22 15:05:09 -08:00
Krrish Dholakia | b55dd5aa57 | fix(utils.py): fix async/sync streaming logging | 2024-01-22 13:54:51 -08:00
Krrish Dholakia | f7b7dd0b6f | fix(utils.py): fix callback logging | 2024-01-21 00:56:30 -08:00
Krrish Dholakia | 59483da18b | fix: fix proxy logging | 2024-01-20 18:22:45 -08:00
Krrish Dholakia | 2acdcc6671 | fix: support info level logging on pkg + proxy | 2024-01-20 17:45:47 -08:00
Krrish Dholakia | 2ecc2f12cd | fix(gemini.py): support streaming | 2024-01-19 20:21:34 -08:00
Krrish Dholakia | b726b04309 | fix(utils.py): revert exception mapping change | 2024-01-19 17:39:35 -08:00
Krrish Dholakia | e957f41ab7 | fix(utils.py): add metadata to logging obj on setup, if exists | 2024-01-19 17:29:47 -08:00
ishaan-jaff | 1d60edad57 | (fix) async langfuse logger | 2024-01-19 10:44:51 -08:00
ishaan-jaff | af808d4927 | (fix) use asyncio run_in_executor | 2024-01-19 09:52:51 -08:00
ishaan-jaff | 8a7b01bfa2 | (v0) fix | 2024-01-19 08:51:14 -08:00
ishaan-jaff | d8d1cea69f | (feat) support user param for all providers | 2024-01-18 17:45:59 -08:00
ishaan-jaff | 054d3a549e | (feat) return Azure enahncements used | 2024-01-17 18:46:41 -08:00
Krrish Dholakia | d4404fb61e | fix(utils.py): fix if check | 2024-01-17 17:17:58 -08:00
Krrish Dholakia | 01a5e80df5 | fix(utils.py): allow dynamically setting boto3 init and switching between bedrock and openai | 2024-01-17 15:56:30 -08:00
Krrish Dholakia | cc89aa7456 | fix(bedrock.py): add support for sts based boto3 initialization (https://github.com/BerriAI/litellm/issues/1476) | 2024-01-17 12:08:59 -08:00
Krrish Dholakia | 2180977acf | fix(utils.py): mistral optional param mapping | 2024-01-17 09:44:21 -08:00
ishaan-jaff | 83861730b5 | (feat) improve bedrock, sagemaker exception mapping | 2024-01-15 21:22:22 -08:00
ishaan-jaff | 874358c398 | (fix) post_call rules | 2024-01-15 20:56:25 -08:00
ishaan-jaff | b1c93fdc52 | (feat) post call rules - fail with error message | 2024-01-15 17:13:13 -08:00