40bb165108 | wslee | 2024-07-25 11:14:38 +09:00 | support dynamic api base
dd10da4d46 | wslee | 2024-07-25 11:14:35 +09:00 | add support for friendli dedicated endpoint
8ea4b73c27 | Ishaan Jaff | 2024-07-24 12:20:14 -07:00 | add UnsupportedParamsError to litellm exceptions
7df94100e8 | Krrish Dholakia | 2024-07-23 17:36:07 -07:00 | build(model_prices_and_context_window.json): add model pricing for vertex ai llama 3.1 api
83ef52e180 | Krrish Dholakia | 2024-07-23 17:07:30 -07:00 | feat(vertex_ai_llama.py): vertex ai llama3.1 api support
    Initial working commit for vertex ai llama 3.1 api support
f64a3309d1 | Krrish Dholakia | 2024-07-23 11:58:58 -07:00 | fix(utils.py): support raw response headers for streaming requests
dcb974dd1e | Krrish Dholakia | 2024-07-23 11:30:52 -07:00 | feat(utils.py): support passing openai response headers to client, if enabled
    Allows openai/openai-compatible provider response headers to be sent to client, if 'return_response_headers' is enabled
1355932bf4 | Ishaan Jaff | 2024-07-23 10:30:26 -07:00 | Merge pull request #3905 from giritatavarty-8451/litellm_triton_chatcompletion_support
    Litellm triton chatcompletion support - Resubmit of #3895
8f9638f2c1 | Ishaan Jaff | 2024-07-22 16:03:15 -07:00 | fix: raise correct provider on content policy violation
15c109f023 | Ishaan Jaff | 2024-07-22 15:43:43 -07:00 | fix checking if _known_custom_logger_compatible_callbacks
f10af7596c | Krrish Dholakia | 2024-07-20 19:12:58 -07:00 | fix(utils.py): allow dropping extra_body in additional_drop_params
    Fixes https://github.com/BerriAI/litellm/issues/4769
5e4d291244 | Ishaan Jaff | 2024-07-20 17:31:16 -07:00 | rename to _response_headers
46cf4f69ae | Ishaan Jaff | 2024-07-20 14:59:08 -07:00 | return response headers in response
3053f52c43 | Krish Dholakia | 2024-07-19 21:07:06 -07:00 | Merge pull request #4801 from BerriAI/litellm_dynamic_params_oai_compatible_endpoints
    fix(utils.py): support dynamic params for openai-compatible providers
95a0f6839f | Krrish Dholakia | 2024-07-19 19:39:00 -07:00 | fix(utils.py): fix token_counter to handle empty tool calls in messages
    Fixes https://github.com/BerriAI/litellm/pull/4749
e45956d77e | Krrish Dholakia | 2024-07-19 19:36:31 -07:00 | fix(utils.py): fix get_llm_provider to support dynamic params for openai-compatible providers
e2d275f1b7 | Krrish Dholakia | 2024-07-19 19:30:41 -07:00 | fix(utils.py): add exception mapping for bedrock image internal server error
d779253949 | Sophia Loris | 2024-07-19 09:45:53 -05:00 | resolve merge conflicts
d5c65c6be2 | Sophia Loris | 2024-07-19 09:35:27 -05:00 | Add support for Triton streaming & triton async completions
b23a633cf1 | Krrish Dholakia | 2024-07-18 18:04:59 -07:00 | fix(utils.py): fix status code in exception mapping
57f6923ab6 | Krish Dholakia | 2024-07-17 22:24:35 -07:00 | Merge pull request #4729 from vingiarrusso/vgiarrusso/guardrails
    Add enabled_roles to Guardrails configuration, Update Lakera guardrail moderation hook
ee53b9093b | Ishaan Jaff | 2024-07-17 16:54:40 -07:00 | Merge pull request #4758 from BerriAI/litellm_langsmith_async_support
    [Feat] Use Async Httpx client for langsmith logging
5f04f7b7c1 | Ishaan Jaff | 2024-07-17 16:04:45 -07:00 | fix langsmith logging for streaming
9c00fb64c4 | Ishaan Jaff | 2024-07-17 15:35:13 -07:00 | use langsmith as a custom callback class
a176feeacc | Krrish Dholakia | 2024-07-17 12:09:08 -07:00 | fix(utils.py): return optional params from groq
4cf293395b | Krrish Dholakia | 2024-07-16 20:57:34 -07:00 | fix(utils.py): fix linting error
155ba055ee | Krrish Dholakia | 2024-07-16 19:17:45 -07:00 | fix(utils.py): fix get_api_base to use vertexai_anthropic
3981be6a99 | Ishaan Jaff | 2024-07-16 17:00:32 -07:00 | fix install on python 3.8
95af5c260e | Ishaan Jaff | 2024-07-16 16:56:15 -07:00 | fix installing on python3.8
b83f47e941 | Vinnie Giarrusso | 2024-07-16 12:19:31 -07:00 | refactor a bit
0fb88e527c | Krish Dholakia | 2024-07-16 07:21:31 -07:00 | Merge pull request #4716 from pamelafox/countfuncs
    Add token counting for OpenAI tools/tool_choice
6ff863ee00 | Vinnie Giarrusso | 2024-07-16 01:52:08 -07:00 | Add enabled_roles to Guardrails configuration, Update Lakera guardrail moderation hook
a15ba2592a | Krrish Dholakia | 2024-07-15 20:00:44 -07:00 | fix(utils.py): allow passing dynamic api base for openai-compatible endpoints
959c627dd3 | Krrish Dholakia | 2024-07-15 19:25:56 -07:00 | fix(litellm_logging.py): log response_cost=0 for failed calls
    Fixes https://github.com/BerriAI/litellm/issues/4604
9cc2daeec9 | Krrish Dholakia | 2024-07-15 18:18:50 -07:00 | fix(utils.py): update get_model_info docstring
    Fixes https://github.com/BerriAI/litellm/issues/4711
d0fe1a8906 | Pamela Fox | 2024-07-15 11:12:42 -07:00 | Docstring
8d01f91056 | Pamela Fox | 2024-07-15 11:11:21 -07:00 | Less changes
a2188a869e | Pamela Fox | 2024-07-15 11:09:45 -07:00 | Less changes
d43dbc756b | Pamela Fox | 2024-07-15 11:07:52 -07:00 | Count tokens for tools
b1be355d42 | Krrish Dholakia | 2024-07-13 16:34:31 -07:00 | build(model_prices_and_context_window.json): add azure ai jamba instruct pricing + token details
    Adds jamba instruct, mistral, llama3 pricing + token info for azure_ai
23cccba070 | Ishaan Jaff | 2024-07-13 09:54:32 -07:00 | fix str from BadRequestError
cff66d6151 | Krrish Dholakia | 2024-07-11 22:12:33 -07:00 | fix(proxy_server.py): fix linting errors
8dbf0a634a | Ishaan Jaff | 2024-07-11 21:14:25 -07:00 | fix supports vision test
b2e46086dd | Krrish Dholakia | 2024-07-11 21:01:12 -07:00 | fix(utils.py): fix recreating model response object when stream usage is true
8bf50ac5db | Ishaan Jaff | 2024-07-11 15:03:37 -07:00 | Merge pull request #4661 from BerriAI/litellm_fix_mh
    [Fix] Model Hub - Show supports vision correctly
341f88d191 | Ishaan Jaff | 2024-07-11 12:59:42 -07:00 | fix supports vision
1ba3fcc3fb | Krrish Dholakia | 2024-07-11 12:02:23 -07:00 | feat(utils.py): accept 'api_key' as param for validate_environment
    Closes https://github.com/BerriAI/litellm/issues/4375
1019355527 | Krrish Dholakia | 2024-07-10 21:56:47 -07:00 | fix(types/utils.py): fix streaming function name
feb42c91a6 | Yulong Liu | 2024-07-08 17:03:07 -07:00 | remove print
cb025a7f26 | Yulong Liu | 2024-07-08 17:01:15 -07:00 | Merge branch 'main' into empower-functions-v1