Krrish Dholakia
ab747c8fe9
fix(utils.py): fix replicate completion cost calculation
2024-05-17 22:18:57 -07:00
Ishaan Jaff
cdfa9c9232
fix - cooldown based on exception header
2024-05-17 18:52:45 -07:00
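A minimal sketch of the idea behind the cooldown fix above, assuming the provider's rate-limit exception carries a `Retry-After` response header; the function and constant names are illustrative, not litellm's actual implementation:

```python
# Sketch: derive a deployment cooldown from an exception's response headers.
# DEFAULT_COOLDOWN_SECONDS and cooldown_from_headers are hypothetical names.
from typing import Mapping, Optional

DEFAULT_COOLDOWN_SECONDS = 60.0

def cooldown_from_headers(headers: Optional[Mapping[str, str]]) -> float:
    """Prefer the provider's Retry-After hint; otherwise use a fixed default."""
    if not headers:
        return DEFAULT_COOLDOWN_SECONDS
    retry_after = headers.get("Retry-After") or headers.get("retry-after")
    try:
        return max(0.0, float(retry_after))
    except (TypeError, ValueError):
        return DEFAULT_COOLDOWN_SECONDS
```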
Krish Dholakia
60615f46c4
Merge branch 'main' into litellm_bedrock_anthropic_fix
2024-05-17 17:47:32 -07:00
Krrish Dholakia
b20f4f65b4
fix(bedrock_httpx.py): raise better timeout exception
2024-05-17 17:16:36 -07:00
Krrish Dholakia
54b4e24427
fix(utils.py): exception map bedrock error
2024-05-17 16:18:25 -07:00
Krrish Dholakia
9ab2389b7e
feat(proxy_server.py): enable custom branding + routes on openapi docs
...
Allows user to add their branding + show only openai routes on docs
2024-05-17 15:21:29 -07:00
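As a rough illustration of how the branding/routes feature above can work in a FastAPI app (a generic sketch under assumed names such as `DOCS_TITLE` and `OPENAI_PATHS`, not the proxy's actual code), the OpenAPI schema can be regenerated with custom branding and filtered down to the OpenAI-compatible paths:

```python
# Sketch: rebrand the OpenAPI docs and expose only selected routes.
from fastapi import FastAPI
from fastapi.openapi.utils import get_openapi

app = FastAPI()
DOCS_TITLE = "My Company LLM Gateway"          # assumed branding value
OPENAI_PATHS = ("/chat/completions", "/completions", "/embeddings")

def custom_openapi():
    if app.openapi_schema:
        return app.openapi_schema
    schema = get_openapi(title=DOCS_TITLE, version="1.0.0", routes=app.routes)
    # Keep only OpenAI-compatible paths in the rendered docs.
    schema["paths"] = {
        path: spec for path, spec in schema["paths"].items()
        if path.endswith(OPENAI_PATHS)
    }
    app.openapi_schema = schema
    return app.openapi_schema

app.openapi = custom_openapi
```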
Krrish Dholakia
963b207473
fix(utils.py): support openrouter function calling
2024-05-17 08:02:24 -07:00
Mikkel Gravgaard
13b7dba006
Merge branch 'main' into patch-1
2024-05-17 10:26:14 +02:00
Krrish Dholakia
13e4196e3e
fix(bedrock_httpx.py): add async support for bedrock amazon, meta, mistral models
2024-05-16 22:39:25 -07:00
Krrish Dholakia
8409b39f0d
fix(bedrock_httpx.py): move bedrock ai21 calls to being async
2024-05-16 22:21:30 -07:00
Krrish Dholakia
118fc4ffac
fix(bedrock_httpx.py): move anthropic bedrock calls to httpx
...
Fixing https://github.com/BerriAI/litellm/issues/2921
2024-05-16 21:51:55 -07:00
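The gist of moving a provider call onto `httpx` is replacing a blocking request with an `httpx.AsyncClient` call that can be awaited. A hedged sketch only; the URL, headers, and payload are placeholders and do not reflect Bedrock's actual SigV4-signed request format:

```python
# Sketch: an async POST with httpx; URL, headers, and payload are placeholders.
import httpx

async def call_model(url: str, payload: dict, headers: dict) -> dict:
    async with httpx.AsyncClient(timeout=600.0) as client:
        response = await client.post(url, json=payload, headers=headers)
        response.raise_for_status()  # surfaces timeouts / HTTP errors to the caller
        return response.json()

# import asyncio
# asyncio.run(call_model("https://example.invalid/invoke", {"prompt": "hi"}, {}))
```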
lj
eee9be353a
Removed config dict type definition
2024-05-17 10:39:00 +08:00
Ishaan Jaff
b722dfd0ce
Merge pull request #3705 from BerriAI/litellm_add_cost_tracking_for_ft_models
...
[FEAT] add cost tracking for Fine Tuned OpenAI `ft:davinci-002` and `ft:babbage-002`
2024-05-16 17:37:35 -07:00
Ishaan Jaff
b9cda67bb8
fix add cost tracking for OpenAI ft models
2024-05-16 17:31:19 -07:00
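The idea behind the two cost-tracking commits above can be sketched as: a fine-tuned model name like `ft:davinci-002:my-org::abc123` is resolved to pricing keyed by its fine-tuned base model. This is an illustrative helper with placeholder prices, not litellm's actual lookup:

```python
# Sketch: resolve a fine-tuned OpenAI model name to a pricing key.
PRICING_PER_1K_TOKENS = {
    "ft:davinci-002": {"input": 0.0, "output": 0.0},   # placeholder numbers
    "ft:babbage-002": {"input": 0.0, "output": 0.0},   # placeholder numbers
}

def pricing_key_for(model: str) -> str:
    """'ft:davinci-002:my-org::abc123' -> 'ft:davinci-002'."""
    if model.startswith("ft:"):
        return ":".join(model.split(":")[:2])
    return model

assert pricing_key_for("ft:davinci-002:acme::abc123") == "ft:davinci-002"
```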
Krrish Dholakia
e41897808d
fix(replicate.py): move replicate calls to being completely async
...
Closes https://github.com/BerriAI/litellm/issues/3128
2024-05-16 17:24:08 -07:00
Krrish Dholakia
782b44818c
fix(utils.py): allow passing in custom pricing to completion_cost as params
2024-05-16 16:24:44 -07:00
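Conceptually, custom pricing reduces to multiplying token counts by caller-supplied per-token rates instead of the built-in model map. A standalone sketch of that arithmetic, not the `completion_cost` signature itself:

```python
# Sketch: cost from token counts and caller-supplied per-token prices.
def cost_with_custom_pricing(
    prompt_tokens: int,
    completion_tokens: int,
    input_cost_per_token: float,
    output_cost_per_token: float,
) -> float:
    return (
        prompt_tokens * input_cost_per_token
        + completion_tokens * output_cost_per_token
    )

# e.g. 1,000 prompt + 200 completion tokens at $0.5 / $1.5 per million tokens:
print(cost_with_custom_pricing(1000, 200, 0.5e-6, 1.5e-6))  # ≈ 0.0008
```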
Krrish Dholakia
dc00b4a7ed
fix(utils.py): update completion_cost docstring
2024-05-16 15:47:40 -07:00
Krrish Dholakia
1f99fa7411
fix(utils.py): fix logging level of error message
2024-05-16 15:44:08 -07:00
Ishaan Jaff
3a52073b48
Merge pull request #3682 from BerriAI/litellm_token_counter_endpoint
...
[Feat] `token_counter` endpoint
2024-05-16 13:39:23 -07:00
Ishaan Jaff
a3763c8608
feat - try using hf tokenizer
2024-05-16 10:59:29 -07:00
Krrish Dholakia
d167a9ea99
feat(lago.py): adding support for usage-based billing with lago
...
Closes https://github.com/BerriAI/litellm/issues/3639
2024-05-16 10:54:18 -07:00
lj
3620c6fc1a
Update model config in utils.py
2024-05-16 16:39:37 +08:00
Ishaan Jaff
c6e91daad7
Merge pull request #3543 from kmheckel/main
...
Updated Ollama cost models to include LLaMa3 and Mistral/Mixtral Instruct series
2024-05-15 20:50:50 -07:00
Ishaan Jaff
5597efa587
Merge pull request #3662 from BerriAI/litellm_feat_predibase_exceptions
...
[Fix] Mask API Keys from Predibase AuthenticationErrors
2024-05-15 20:45:40 -07:00
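Masking credentials from provider error messages generally comes down to redacting anything that looks like a key before the exception text is surfaced. An illustrative sketch; the regex and helper name are assumptions, not the actual litellm code:

```python
# Sketch: redact likely API keys/tokens from an error string before logging it.
import re

# Hypothetical pattern: long alphanumeric/underscore/dash runs after a key-ish label.
_SECRET_PATTERN = re.compile(r"(?i)(api[_-]?key|token|bearer)\s*[:=]?\s*([A-Za-z0-9_\-]{8,})")

def mask_secrets(message: str) -> str:
    return _SECRET_PATTERN.sub(lambda m: f"{m.group(1)}=****", message)

print(mask_secrets("AuthenticationError: api_key=pb_1234567890abcdef is invalid"))
# -> "AuthenticationError: api_key=**** is invalid"
```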
Krish Dholakia
d294e26fdb
Merge pull request #3660 from BerriAI/litellm_proxy_ui_general_settings
...
feat(proxy_server.py): Enabling Admin to control general settings on proxy ui
2024-05-15 20:36:42 -07:00
Krrish Dholakia
3f339cb694
fix(parallel_request_limiter.py): fix max parallel request limiter on retries
2024-05-15 20:16:11 -07:00
Ishaan Jaff
d399e09655
fix utils.py
2024-05-15 19:54:52 -07:00
Ishaan Jaff
9782d7060e
feat - predibase exceptions
2024-05-15 16:52:33 -07:00
Ishaan Jaff
b645d8dcf9
fix - show litellm_debug_info
2024-05-15 13:07:04 -07:00
Krrish Dholakia
c098ad0a60
fix(alerting.py): fix datetime comparison logic
2024-05-14 22:10:09 -07:00
Krrish Dholakia
7ab7e81aed
fix(utils.py): default claude-3 to tiktoken (0.8s faster than hf tokenizer)
2024-05-14 18:37:14 -07:00
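The trade-off described above (tiktoken loading much faster than a Hugging Face tokenizer) can be pictured with a small counting helper that falls back to a generic encoding for non-OpenAI models. A hedged sketch, not litellm's `_select_tokenizer`:

```python
# Sketch: count tokens with tiktoken, approximating unknown models with cl100k_base.
import tiktoken

def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # Unknown model (e.g. a claude-3 alias): approximate with cl100k_base.
        encoding = tiktoken.get_encoding("cl100k_base")
    return len(encoding.encode(text))

print(count_tokens("hello world"))  # small integer token count
```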
Krish Dholakia
8e59bbd573
Revert "Logfire Integration"
2024-05-14 17:38:47 -07:00
Krrish Dholakia
c6ca76e265
fix(utils.py): fix pydantic v1 error
2024-05-14 17:17:20 -07:00
Krrish Dholakia
d220cd8626
fix(utils.py): add lru-cache logic to _select_tokenizer
...
speed up tokenizer load times
2024-05-14 16:39:50 -07:00
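Caching the tokenizer selection means the relatively expensive tokenizer construction happens once per model instead of once per request. A minimal sketch with `functools.lru_cache`; the helper below is illustrative, not the actual `_select_tokenizer`:

```python
# Sketch: memoize tokenizer construction so repeated calls reuse the same object.
from functools import lru_cache
import tiktoken

@lru_cache(maxsize=16)
def select_tokenizer(model: str):
    # Building an encoding is the slow part; lru_cache makes it a one-time cost per model.
    try:
        return tiktoken.encoding_for_model(model)
    except KeyError:
        return tiktoken.get_encoding("cl100k_base")

enc = select_tokenizer("gpt-4")        # constructed on first call
enc_again = select_tokenizer("gpt-4")  # served from the cache
assert enc is enc_again
```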
alisalim17
2ddcb9bfd9
Merge remote-tracking branch 'upstream/main'
2024-05-14 22:32:57 +04:00
Krrish Dholakia
6c7c9c4aef
fix(utils.py): fix python 3.8 linting error
2024-05-14 11:25:36 -07:00
alisalim17
25518fb995
refactor: logging class to use continue instead of break for streaming logging
2024-05-14 21:21:21 +04:00
alisalim17
f2a4398a39
chore: fix typo
2024-05-14 21:18:00 +04:00
alisalim17
bac6b3beb7
Merge remote-tracking branch 'upstream/main'
2024-05-14 18:42:20 +04:00
Krish Dholakia
bef8a7f1d9
Merge pull request #3600 from msabramo/msabramo/fix-pydantic-warnings
...
Update pydantic code to fix warnings
2024-05-13 22:00:39 -07:00
Krish Dholakia
3645c89fb5
Merge pull request #3602 from msabramo/msabramo/fix_pkg_resources_warning
...
Fix `pkg_resources` warning
2024-05-13 21:59:52 -07:00
Krrish Dholakia
f8e1b1db2e
refactor(utils.py): trigger local_testing
2024-05-13 18:18:22 -07:00
Krrish Dholakia
ace5ce0b78
fix(utils.py): fix watsonx exception mapping
2024-05-13 18:13:13 -07:00
Krrish Dholakia
bf8d3be791
fix(utils.py): watsonx ai exception mapping fix
2024-05-13 17:11:33 -07:00
Krrish Dholakia
ca641d0a24
fix(utils.py): handle api assistant returning 'null' role
...
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
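The fix above boils down to defaulting a missing or `null` role on a returned message to `"assistant"`. A hedged sketch of that normalization:

```python
# Sketch: normalize a provider message whose "role" came back as null/None.
def normalize_message(message: dict) -> dict:
    if not message.get("role"):
        message["role"] = "assistant"
    return message

print(normalize_message({"role": None, "content": "Hello!"}))
# -> {'role': 'assistant', 'content': 'Hello!'}
```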
Krrish Dholakia
8d94665842
fix(utils.py): fix custom pricing when litellm model != response obj model name
2024-05-13 15:25:35 -07:00
Krrish Dholakia
96336cdd49
fix(openai.py): create MistralConfig with response_format mapping for mistral api
2024-05-13 13:29:58 -07:00
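A rough sketch of what a provider config with a `response_format` mapping can look like; the class below is a hypothetical stand-in and does not mirror litellm's actual `MistralConfig`:

```python
# Sketch: map OpenAI-style optional params onto a provider request body.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProviderConfig:  # hypothetical stand-in for a per-provider config class
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None
    response_format: Optional[dict] = None  # e.g. {"type": "json_object"}

    def to_request_params(self) -> dict:
        params = {
            "temperature": self.temperature,
            "max_tokens": self.max_tokens,
            "response_format": self.response_format,
        }
        return {k: v for k, v in params.items() if v is not None}

print(ProviderConfig(response_format={"type": "json_object"}).to_request_params())
# -> {'response_format': {'type': 'json_object'}}
```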
Krrish Dholakia
a907247033
fix(utils.py): fix vertex ai function calling + streaming
...
Completes https://github.com/BerriAI/litellm/issues/3147
2024-05-13 12:32:39 -07:00
Marc Abramowitz
b61cc97771
Merge branch 'msabramo/pydantic_replace_root_validator_with_model_validator' into msabramo/fix-pydantic-warnings
2024-05-13 11:25:55 -07:00
Marc Abramowitz
09829c7c78
Fix pkg_resources warning
...
by trying to use `importlib.resources` first and falling back to
`pkg_resources` if that fails.
With this and the changes in GH-3600 and GH-3601, the tests pass with **zero
warnings**!! 🎉 🎉
```shell
abramowi at marcs-mbp-3 in ~/Code/OpenSource/litellm (msabramo/fix-pydantic-warnings●●)
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py
====================================== test session starts ======================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, mock-3.14.0
collected 12 items
litellm/tests/test_proxy_server.py s..........s [100%]
================================= 10 passed, 2 skipped in 9.24s =================================
```
2024-05-12 12:46:24 -07:00
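The fallback described in this commit can be sketched roughly as follows; the package and file names are placeholders, not necessarily the resources litellm actually loads:

```python
# Sketch: read a packaged data file via importlib.resources, with a
# pkg_resources fallback for older environments. Names are placeholders.
def read_packaged_text(package: str, filename: str) -> str:
    try:
        from importlib.resources import files
        return files(package).joinpath(filename).read_text(encoding="utf-8")
    except Exception:
        import pkg_resources  # deprecated; importing it emits the warning this commit avoids
        return pkg_resources.resource_string(package, filename).decode("utf-8")

# e.g. read_packaged_text("some_package", "data.json")
```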