Commit graph

115 commits

Author SHA1 Message Date
Krrish Dholakia
a7779796ef fix(anthropic.py): support openai system message being a list 2024-07-23 21:45:56 -07:00
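A minimal sketch of what this commit enables: an OpenAI-style system message whose content is a list of text parts, sent to an Anthropic model through litellm. The model name and prompt are placeholders, not values from the commit.

```python
# Illustrative: system message given as a list of content parts rather than a string.
import litellm

response = litellm.completion(
    model="claude-3-haiku-20240307",  # placeholder Anthropic model
    messages=[
        {
            "role": "system",
            # list-form content, as produced by some OpenAI-compatible clients
            "content": [{"type": "text", "text": "You are a terse assistant."}],
        },
        {"role": "user", "content": "Say hello."},
    ],
)
print(response.choices[0].message.content)
```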
Ishaan Jaff
344010e127 Pass litellm proxy specific metadata 2024-07-23 15:31:30 -07:00
Krrish Dholakia
48dd21cc88 fix(anthropic.py): fix streaming client 2024-07-19 18:55:00 -07:00
Krrish Dholakia
fd9880ebbe feat(vertex_ai_anthropic.py): support response_schema for vertex ai anthropic calls
allows passing response_schema for anthropic calls. supports schema validation.
2024-07-18 22:40:35 -07:00
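A hedged sketch of the feature described above. The exact shape of the `response_format` argument (the `response_schema` and `enforce_validation` keys) and the model name are assumptions for illustration, not taken from the commit itself.

```python
# Assumed usage: passing a JSON schema for a Vertex AI Anthropic call via litellm.
import litellm

schema = {
    "type": "object",
    "properties": {"city": {"type": "string"}, "population": {"type": "integer"}},
    "required": ["city", "population"],
}

response = litellm.completion(
    model="vertex_ai/claude-3-5-sonnet@20240620",  # placeholder Vertex model id
    messages=[{"role": "user", "content": "Give me a city and its population as JSON."}],
    response_format={
        "type": "json_object",
        "response_schema": schema,   # assumed key, per the commit description
        "enforce_validation": True,  # assumed flag for the schema validation mentioned above
    },
)
print(response.choices[0].message.content)
```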
Ishaan Jaff
a72cff8ad6 anthropic - raise Authentication error when no api key provided 2024-07-16 20:42:43 -07:00
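A minimal sketch of the behavior this commit describes: with no Anthropic key available, the call fails fast with litellm's `AuthenticationError`. Model and prompt are placeholders.

```python
# Illustrative: missing ANTHROPIC_API_KEY surfaces as an AuthenticationError.
import os
import litellm

os.environ.pop("ANTHROPIC_API_KEY", None)  # make sure no key is present

try:
    litellm.completion(
        model="claude-3-haiku-20240307",
        messages=[{"role": "user", "content": "hi"}],
    )
except litellm.AuthenticationError as err:
    print("caught:", err)
```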
Krrish Dholakia
c69193c321 fix: move to using pydantic obj for setting values 2024-07-11 13:18:36 -07:00
Krrish Dholakia
c46e3ce590 fix: fix linting error 2024-07-10 22:14:23 -07:00
Krrish Dholakia
48be4ce805 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Krrish Dholakia
4ba30abb63 feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
2024-07-10 18:15:38 -07:00
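A hedged sketch of calling the proxy's `/v1/messages` route added in the two entries above, using an Anthropic-format request body. The proxy URL, key, and model name are placeholders and would depend on the local config.yaml.

```python
# Illustrative: POST an Anthropic-style request to a locally running litellm proxy.
import httpx

resp = httpx.post(
    "http://localhost:4000/v1/messages",      # placeholder proxy address
    headers={
        "x-api-key": "sk-1234",                # placeholder proxy key
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-haiku-20240307",    # or a model alias from config.yaml
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.json())
```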
Krrish Dholakia
01a335b4c3 feat(anthropic_adapter.py): support for translating anthropic params to openai format 2024-07-10 00:32:28 -07:00
Ishaan Jaff
5906b60c9a fix - raise and report Anthropic streaming errors 2024-07-05 15:31:06 -07:00
Igor Drozdov
97ee9dcf8b fix(anthropic.py): add index to streaming tool use 2024-07-05 12:23:58 +02:00
Krrish Dholakia
00497b408d fix(anthropic.py): fix anthropic tool calling + streaming
Fixes https://github.com/BerriAI/litellm/issues/4537
2024-07-04 16:30:24 -07:00
Krrish Dholakia
dca952f117 fix: linting fixes 2024-07-03 21:55:00 -07:00
Krish Dholakia
06c6c65d2a Merge branch 'main' into litellm_anthropic_tool_calling_streaming_fix 2024-07-03 20:43:51 -07:00
Krrish Dholakia
eae049d059 fix(anthropic.py): support *real* anthropic tool calling + streaming
Parses each chunk and translates to openai format
2024-07-03 19:48:35 -07:00
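An illustrative sketch of the streaming tool-calling path described above: each streamed chunk is translated to OpenAI format, so the tool-call arguments arrive as deltas. The tool definition and model are placeholders.

```python
# Illustrative: stream an Anthropic tool call and read OpenAI-format deltas.
import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

stream = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    tool_calls = getattr(delta, "tool_calls", None)
    if tool_calls:
        # partial tool-call arguments, already in OpenAI format
        print(tool_calls[0].function.arguments or "", end="")
```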
Krish Dholakia
4e1b247c1d Revert "fix(vertex_anthropic.py): Vertex Anthropic tool calling - native params " 2024-07-03 17:55:37 -07:00
Krrish Dholakia
46800ba20a fix(vertex_anthropic.py): Updates the vertex anthropic endpoint to do tool calling with the anthropic api params 2024-07-03 15:28:31 -07:00
Krrish Dholakia
a765bae2b6 fix(http_handler.py): raise more detailed http status errors 2024-06-28 15:12:38 -07:00
Krish Dholakia
fa2d8bc794 Merge pull request #4216 from BerriAI/litellm_refactor_logging
refactor(utils.py): Cut down utils.py to <10k lines.
2024-06-15 15:19:42 -07:00
Krrish Dholakia
9d7f5d503c refactor(utils.py): refactor Logging to its own class. Cut down utils.py to <10k lines.
Easier debugging

 Reference: https://github.com/BerriAI/litellm/issues/4206
2024-06-15 10:57:20 -07:00
Ishaan Jaff
113b7e34ce refactor to use _get_async_httpx_client 2024-06-14 21:30:42 -07:00
Ishaan Jaff
d19efd62cf fix async client 2024-06-14 21:12:32 -07:00
Ishaan Jaff
56cd405972 cache anthropic httpx client 2024-06-14 20:55:40 -07:00
Krrish Dholakia
8b1b0f6f70 fix(anthropic.py): fix anthropic async streaming
pass the 'stream' param to the httpx call

Addresses https://github.com/BerriAI/litellm/issues/3728#issuecomment-2143985104
2024-06-02 16:01:44 -07:00
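A minimal sketch of the async streaming path this fix concerns, using `litellm.acompletion` with `stream=True`. Model and prompt are placeholders.

```python
# Illustrative: async streaming from an Anthropic model via litellm.
import asyncio
import litellm

async def main():
    stream = await litellm.acompletion(
        model="claude-3-haiku-20240307",
        messages=[{"role": "user", "content": "Count to three."}],
        stream=True,
    )
    async for chunk in stream:
        content = chunk.choices[0].delta.content
        if content:
            print(content, end="")

asyncio.run(main())
```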
Krrish Dholakia
ccee9e4eb1 fix(anthropic.py): fix parallel streaming on anthropic.py
prevent parallel requests from cancelling each other

Fixes https://github.com/BerriAI/litellm/issues/3881
2024-05-28 16:29:09 -07:00
Krrish Dholakia
4795c56f84 feat(anthropic.py): support anthropic 'tool_choice' param
Closes https://github.com/BerriAI/litellm/issues/3752
2024-05-21 17:50:44 -07:00
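A hedged sketch of the `tool_choice` support added above: the OpenAI-style `tool_choice` argument is accepted for an Anthropic model and translated by litellm. Tool definition and model are placeholders.

```python
# Illustrative: force a specific tool with OpenAI-style tool_choice on an Anthropic model.
import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_time",
        "description": "Get the current time in a timezone",
        "parameters": {
            "type": "object",
            "properties": {"tz": {"type": "string"}},
            "required": ["tz"],
        },
    },
}]

response = litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "What time is it in UTC?"}],
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "get_time"}},
)
print(response.choices[0].message.tool_calls)
```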
Krrish Dholakia
8405fee205 fix(anthropic.py): bump default anthropic api version for tool use 2024-05-17 00:41:11 -07:00
Ishaan Jaff
32815b06cb feat: Anthropic allow users to set anthropic-beta in headers 2024-05-16 14:40:31 -07:00
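A hedged sketch of setting an `anthropic-beta` header, assuming it is forwarded via litellm's `extra_headers` argument; the specific beta flag value is a placeholder.

```python
# Illustrative: pass an anthropic-beta header through to the Anthropic API.
import litellm

response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "hi"}],
    extra_headers={"anthropic-beta": "tools-2024-04-04"},  # placeholder beta flag
)
```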
Krrish Dholakia
9b10ba649f fix(anthropic.py): fix tool calling + streaming issue 2024-05-11 20:15:36 -07:00
Krrish Dholakia
2b1c22f088 fix(anthropic.py): compatibility fix 2024-05-11 19:51:29 -07:00
Krrish Dholakia
cfab989abf fix(anthropic.py): fix version compatibility 2024-05-11 19:46:26 -07:00
Krrish Dholakia
6018c8ab77 fix(anthropic.py): fix linting error 2024-05-11 19:42:14 -07:00
Krrish Dholakia
bd0c3a81cb fix(bedrock_httpx.py): working async bedrock command r calls 2024-05-11 16:45:20 -07:00
Krrish Dholakia
5f93cae3ff feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
Krrish Dholakia
344353d363 fix(anthropic.py): remove raise error on 'empty content'
Fixes https://github.com/BerriAI/litellm/issues/3453
2024-05-06 11:42:09 -07:00
Krrish Dholakia
80a1344c55 fix(utils.py): anthropic error handling 2024-05-06 07:25:12 -07:00
Krrish Dholakia
4efaacc5ce fix(anthropic.py): handle whitespace characters for anthropic calls 2024-05-03 17:31:34 -07:00
Krrish Dholakia
d06702cbf4 fix(anthropic.py): drop unsupported non-whitespace character value when calling anthropic with stop sequences
Fixes https://github.com/BerriAI/litellm/issues/3286
2024-05-03 16:59:49 -07:00
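An illustrative sketch of the stop-sequence handling these two fixes describe: as I read the commits, stop values Anthropic rejects (such as whitespace-only sequences) are dropped rather than failing the call. Model and stop values are placeholders.

```python
# Illustrative: stop sequences on an Anthropic call; the whitespace-only entry
# would be filtered out by litellm rather than rejected by the API.
import litellm

response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "Write one sentence, then say END."}],
    stop=["\n\n", "END"],
)
```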
Josh Mandel
7115f74ca6 fix: Stream completion responses from anthropic. (Fix 3129) 2024-04-19 16:13:19 -05:00
Ishaan Jaff
833a64455f ci/cd run async handler 2024-04-06 19:16:27 -07:00
Ishaan Jaff
1dc5b01e01 fix - use anthropic class for clients 2024-04-06 18:19:28 -07:00
Ishaan Jaff
3c10bfb497 async streaming anthropic 2024-04-06 17:53:06 -07:00
Ishaan Jaff
32c3aab34e feat - make anthropic async 2024-04-06 15:50:13 -07:00
Krish Dholakia
9912a80190 Merge pull request #2855 from Caixiaopig/fix_update_default_claude3_maxtokens
Updating the default Anthropic Official Claude 3 max_tokens to 4096
2024-04-06 08:39:55 -07:00
Caixiaopig
aa2a1389da Updating the default Anthropic Official Claude 3 max_tokens to 4096
fix bug
2024-04-05 09:45:57 -05:00
Zihao Li
017abaa452 Clean up imports of XML processing functions 2024-04-05 22:36:18 +08:00
Zihao Li
4d0975bf4e Move tool definitions from system prompt to parameter and refactor tool calling parse 2024-04-05 16:01:40 +08:00
Caixiaopig
7eb9abba2b Updating the default Anthropic Claude 3 max_tokens to 4096
The default value of max_tokens used to be 256. If the client does not set a larger value, the model's output may be truncated, so the default has been changed to 4096, which is also the maximum output value documented in the official API reference.
see: https://docs.anthropic.com/claude/reference/messages_post
2024-04-05 14:44:40 +08:00
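A minimal sketch of what the default change above means in practice: omit `max_tokens` and litellm fills in the new 4096 default for Claude 3 models, or set it explicitly per call. Model and prompt are placeholders.

```python
# Illustrative: default vs. explicit max_tokens after this change.
import litellm

# relies on the library default (bumped from 256 to 4096 per the commit above)
litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Summarize the Iliad."}],
)

# or override explicitly
litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Summarize the Iliad."}],
    max_tokens=1024,
)
```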
Krrish Dholakia
69f27aa25c fix(factory.py): parse list in xml tool calling response (anthropic)
improves tool calling output parsing to check whether the response contains a list. Also returns the raw response back to the user via `response._hidden_params["original_response"]`, so the user can see exactly what anthropic returned
2024-03-29 11:51:26 -07:00
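A sketch based on the commit message above: after a tool-calling completion, the raw Anthropic payload remains available under `response._hidden_params["original_response"]`. The model, tool, and prompt are placeholders.

```python
# Illustrative: inspect the raw Anthropic response kept alongside the OpenAI-format one.
import litellm

response = litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "What's 2+2? Use the calculator tool."}],
    tools=[{
        "type": "function",
        "function": {
            "name": "calculator",
            "description": "Evaluate a math expression",
            "parameters": {
                "type": "object",
                "properties": {"expression": {"type": "string"}},
                "required": ["expression"],
            },
        },
    }],
)

# exactly what Anthropic returned, before translation to OpenAI format
print(response._hidden_params["original_response"])
```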