Commit graph

15260 commits

Krrish Dholakia
0356decdec bump: version 1.42.4 → 1.42.5 2024-07-27 09:16:08 -07:00
Krish Dholakia
9bdcef238b
Merge pull request #4907 from BerriAI/litellm_proxy_get_secret
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 22:17:11 -07:00
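The fix above concerns how the proxy resolves `environment_variables` through `get_secret`. litellm's config convention references env vars as `os.environ/NAME`; the helper below is an illustrative sketch of that resolution, not litellm's actual implementation:

```python
import os

def get_secret(secret_name: str) -> str:
    """Resolve a value that may be an 'os.environ/NAME' reference
    (illustrative sketch of the convention, not litellm's code)."""
    prefix = "os.environ/"
    if secret_name.startswith(prefix):
        env_key = secret_name[len(prefix):]
        value = os.environ.get(env_key)
        if value is None:
            raise KeyError(f"environment variable {env_key!r} is not set")
        return value
    # plain literal value: return as-is
    return secret_name

os.environ["MY_API_KEY"] = "sk-test"
print(get_secret("os.environ/MY_API_KEY"))  # resolves through the environment
```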
Krish Dholakia
f9c2fec1a6
Merge pull request #4918 from BerriAI/litellm_ollama_tool_calling
feat(ollama_chat.py): support ollama tool calling
2024-07-26 22:16:58 -07:00
Krrish Dholakia
77fe8f57cf docs(ollama.md): add ollama tool calling to docs 2024-07-26 22:12:52 -07:00
Krrish Dholakia
b25d4a8cb3 feat(ollama_chat.py): support ollama tool calling
Closes https://github.com/BerriAI/litellm/issues/4812
2024-07-26 21:51:54 -07:00
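The ollama tool-calling feature above accepts OpenAI-style tool schemas. A minimal sketch of the payload shape (the model name and tool definition are illustrative; the actual call needs a local Ollama server, so it is only shown commented out):

```python
# OpenAI-style tool schema consumed by ollama_chat tool calling (illustrative).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]

# With a local Ollama server running, the call would look like (not executed here):
# import litellm
# resp = litellm.completion(
#     model="ollama_chat/llama3.1",
#     messages=[{"role": "user", "content": "What's the weather in Boston?"}],
#     tools=tools,
# )
```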
Ishaan Jaff
1506e74332 bump: version 1.42.3 → 1.42.4 2024-07-26 21:37:52 -07:00
Ishaan Jaff
36187102d1
Merge pull request #4917 from BerriAI/litellm_link_tomodel_cost
[Feat] Link to https://models.litellm.ai/ on Swagger docs and docs
2024-07-26 21:37:04 -07:00
Ishaan Jaff
f03769e2a4 docs fix link https://models.litellm.ai/ 2024-07-26 21:35:54 -07:00
Ishaan Jaff
2501b4eccd feat link to model cost map on swagger 2024-07-26 21:34:42 -07:00
Krrish Dholakia
a264d1ca8c feat(vertex_httpx.py): support logging citation metadata
Closes https://github.com/BerriAI/litellm/issues/3230
2024-07-26 20:54:59 -07:00
Krrish Dholakia
fe7f78fbf6 feat(vertex_httpx.py): support logging vertex ai safety results to langfuse
Closes https://github.com/BerriAI/litellm/issues/3230
2024-07-26 20:50:43 -07:00
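The two vertex_httpx.py commits above log safety results and citation metadata to Langfuse. A sketch of pulling those fields off a Vertex AI candidate (field names follow Vertex AI's REST response shape; the helper itself is illustrative, not litellm's implementation):

```python
def extract_logging_metadata(candidate: dict) -> dict:
    """Collect safety ratings and citation metadata from a Vertex AI
    candidate dict so a logger (e.g. Langfuse) can record them."""
    return {
        "safety_ratings": candidate.get("safetyRatings", []),
        "citation_metadata": candidate.get("citationMetadata", {}),
    }

# Mocked candidate mirroring the Vertex AI response shape:
candidate = {
    "content": {"parts": [{"text": "..."}]},
    "safetyRatings": [
        {"category": "HARM_CATEGORY_HARASSMENT", "probability": "NEGLIGIBLE"}
    ],
    "citationMetadata": {"citations": [{"uri": "https://example.com"}]},
}
meta = extract_logging_metadata(candidate)
```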
Ishaan Jaff
a7f964b869
Merge pull request #4913 from BerriAI/litellm_fix_error_limit
[Proxy-Fix] - raise more descriptive errors when crossing tpm / rpm limits on keys, user, global limits
2024-07-26 20:25:28 -07:00
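The proxy fix above makes tpm/rpm limit errors more descriptive. A sketch of the kind of message involved (the function and message format are illustrative, not the proxy's actual code):

```python
def check_limit(entity: str, metric: str, current: int, limit: int) -> None:
    """Raise a descriptive error when a tpm/rpm limit is crossed,
    naming the entity, the metric, and both numbers (illustrative)."""
    if current >= limit:
        raise Exception(
            f"{metric} limit exceeded for {entity}: "
            f"current usage {current} >= limit {limit}"
        )

# Crossing the limit surfaces all the context in one message:
try:
    check_limit("key sk-1234", "rpm", current=61, limit=60)
except Exception as err:
    print(err)
```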
Ishaan Jaff
3c463ccbe6
Merge pull request #4914 from BerriAI/litellm_fix_batches
[Proxy-Fix + Test] - /batches endpoint
2024-07-26 20:12:03 -07:00
Krrish Dholakia
fe0b55f2ca fix(utils.py): fix cache hits for streaming
Fixes https://github.com/BerriAI/litellm/issues/4109
2024-07-26 19:04:08 -07:00
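The cache-hit fix above concerns serving cached responses to streaming callers. One way to picture the problem: a cache stores the full completion, but a streaming caller expects an iterator of chunks. A minimal sketch (assuming a simple re-chunking strategy; not litellm's implementation):

```python
def stream_from_cache(cached_text: str, chunk_size: int = 8):
    """Yield a cached completion text in chunks so a cache hit
    behaves like a live streamed response (illustrative sketch)."""
    for i in range(0, len(cached_text), chunk_size):
        yield cached_text[i : i + chunk_size]

chunks = list(stream_from_cache("Hello from the cache!", chunk_size=5))
# Reassembling the chunks recovers the cached completion unchanged.
assert "".join(chunks) == "Hello from the cache!"
```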
Ishaan Jaff
f8b9c7128e docs batches 2024-07-26 18:51:13 -07:00
Ishaan Jaff
90648bee60 docs batches API 2024-07-26 18:50:44 -07:00
Ishaan Jaff
dd37d1d032 use correct link on http://localhost:4000 2024-07-26 18:42:45 -07:00
Ishaan Jaff
f4048bc890 docs batches api 2024-07-26 18:41:53 -07:00
Ishaan Jaff
812dd5e162 test get batches by id 2024-07-26 18:40:10 -07:00
Ishaan Jaff
2541d5f625 add verbose_logger.debug to retrieve batch 2024-07-26 18:26:39 -07:00
Ishaan Jaff
f627fa9b40 fix for GET /v1/batches{batch_id:path} 2024-07-26 18:23:15 -07:00
Ishaan Jaff
12729ceece test - batches endpoint 2024-07-26 18:09:49 -07:00
Ishaan Jaff
56ce7e892d fix batches inserting metadata 2024-07-26 18:08:54 -07:00
Ishaan Jaff
159a880dcc fix /v1/batches POST 2024-07-26 18:06:00 -07:00
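The run of commits above fixes and tests the proxy's OpenAI-compatible `/v1/batches` endpoint. A sketch of a batch create request against it (field names follow OpenAI's Batch API; the file id, key, and metadata values are illustrative, and the live call is only shown commented out):

```python
# Shape of a batch create request against the proxy's /v1/batches endpoint
# (field names per OpenAI's Batch API; values are illustrative).
batch_request = {
    "input_file_id": "file-abc123",
    "endpoint": "/v1/chat/completions",
    "completion_window": "24h",
    "metadata": {"purpose": "nightly-eval"},
}

# With a proxy running on http://localhost:4000, the call would be:
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")
# batch = client.batches.create(**batch_request)
# retrieved = client.batches.retrieve(batch.id)
```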
Ishaan Jaff
c4e4b4675c fix raise better error when crossing tpm / rpm limits 2024-07-26 17:35:08 -07:00
Krish Dholakia
c0717133a9
Merge pull request #4909 from idris/fix-datadog-attributes
Fix Datadog logging attributes
2024-07-26 14:19:10 -07:00
Idris Mokhtarzada
e8d4234dbd
Better JSON serialization for Datadog logs
Dicts are now properly serialized to JSON so that Datadog can parse the child attributes. Also, numbers and nulls are sent as numbers and nulls instead of strings.
2024-07-26 17:02:05 -04:00
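The serialization change described above boils down to emitting typed JSON rather than stringified values. A minimal sketch (the event fields are illustrative, not litellm's actual Datadog payload):

```python
import json

def to_datadog_payload(event: dict) -> str:
    """Serialize a log event so Datadog can parse child attributes:
    nested dicts stay structured JSON, and numbers/nulls keep their
    native JSON types instead of being stringified (illustrative)."""
    return json.dumps(event)

event = {
    "model": "gpt-3.5-turbo",
    "response_time": 412,   # stays a number, not "412"
    "error": None,          # becomes JSON null, not the string "None"
    "usage": {"prompt_tokens": 10, "completion_tokens": 20},
}
payload = to_datadog_payload(event)
```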
Idris Mokhtarzada
a7e877d15f
Use milliseconds for response_time in Datadog logs
Milliseconds are more commonly used and more standard than seconds.
2024-07-26 16:43:21 -04:00
Idris Mokhtarzada
9b89280a90
Use underscores
Datadog does not play nicely with special characters (as in "(seconds)"). It also makes sense to standardize on either underscores or camelCase, but not mix-and-match.
2024-07-26 16:38:54 -04:00
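The two Datadog commits above switch the latency attribute to milliseconds and underscore-only keys. A small sketch of that combination (the attribute name is illustrative):

```python
def response_time_ms(start: float, end: float) -> int:
    """Convert a wall-clock span in seconds to milliseconds, the
    more common unit for latency metrics (illustrative helper)."""
    return int(round((end - start) * 1000))

# Underscore-only keys: Datadog handles "response_time_ms" better
# than a key like "response time (seconds)".
attributes = {"response_time_ms": response_time_ms(100.0, 100.412)}
```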
Krrish Dholakia
9943c6d607 fix(proxy_server.py): fix get secret for environment_variables 2024-07-26 13:33:02 -07:00
Krrish Dholakia
b515d4f441 docs(stream.md): add streaming token usage info to docs
Closes https://github.com/BerriAI/litellm/issues/4904
2024-07-26 10:51:17 -07:00
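The streaming token-usage docs above rely on `stream_options={"include_usage": True}`, after which the final streamed chunk carries a `usage` field. The chunk dicts below are mocked to show the shape; a real call needs litellm and an API key:

```python
# Mocked streamed chunks: only the final chunk carries `usage` when
# stream_options={"include_usage": True} is requested.
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]
usage = next(c["usage"] for c in chunks if c["usage"] is not None)

# A live call would look like (not executed here):
# import litellm
# response = litellm.completion(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": "hi"}],
#     stream=True,
#     stream_options={"include_usage": True},
# )
```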
Krrish Dholakia
9a6ed8cabb fix(bedrock_httpx.py): fix streaming error message
Fixes https://github.com/BerriAI/litellm/issues/4900
2024-07-26 10:42:47 -07:00
Krish Dholakia
67115a56c0
Merge pull request #4869 from maamalama/anthropic-tools
Fixed tool_call for Helicone integration
2024-07-26 10:42:10 -07:00
Krrish Dholakia
7ca29d987d docs(docusaurus.config.js): add llm model cost map to docs 2024-07-26 10:07:47 -07:00
Krrish Dholakia
84482703b8 docs(config.md): update wildcard docs 2024-07-26 08:59:53 -07:00
Krrish Dholakia
1d6c39a607 feat(proxy_server.py): handle pydantic mockselvar error
Fixes https://github.com/BerriAI/litellm/issues/4898#issuecomment-2252105485
2024-07-26 08:38:51 -07:00
Krish Dholakia
35737d04d3
Merge pull request #4893 from yujonglee/canary
[Docs] Better search experience with Canary
2024-07-26 08:34:54 -07:00
yujonglee
10ffb5a960 remove ui shift on reload 2024-07-26 22:13:04 +09:00
yujonglee
8a45abb563 fix import and add fallback 2024-07-26 22:00:48 +09:00
yujonglee
c54f23f936 wrap existing search bar 2024-07-26 21:46:36 +09:00
yujonglee
3967007595 update to latest 2024-07-26 21:06:53 +09:00
Krrish Dholakia
afcad9e12c docs(custom_llm_server.md): cleanup docs 2024-07-25 22:45:03 -07:00
Krrish Dholakia
ce210ddaf6 fix(vertex_ai_llama3.py): Fix llama3 streaming issue
Closes https://github.com/BerriAI/litellm/issues/4885
2024-07-25 22:30:55 -07:00
Krrish Dholakia
0ce5a7962e bump: version 1.42.2 → 1.42.3 2024-07-25 22:18:17 -07:00
Krrish Dholakia
2f773d9cb6 fix(litellm_cost_calc/google.py): support meta llama vertex ai cost tracking 2024-07-25 22:12:07 -07:00
Ishaan Jaff
2626cc6d30 bump: version 1.42.1 → 1.42.2 2024-07-25 20:16:05 -07:00
Ishaan Jaff
24fb6fc28d
Merge pull request #4891 from BerriAI/litellm_proxy_support_all_providers
[Feat] Support /* for multiple providers
2024-07-25 20:15:42 -07:00
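The `/*` feature above lets the proxy route every model from a provider. An illustrative config snippet (see the PR for the exact supported syntax):

```yaml
# Proxy config enabling all models from a provider via wildcard
# (illustrative snippet, not the canonical example from the PR):
model_list:
  - model_name: "anthropic/*"
    litellm_params:
      model: "anthropic/*"
```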
Ishaan Jaff
079a41fbe1
Merge branch 'main' into litellm_proxy_support_all_providers 2024-07-25 20:15:37 -07:00
Ishaan Jaff
4bf9681df4
Update README.md 2024-07-25 20:12:32 -07:00
Ishaan Jaff
4d513e0b5f
Merge pull request #4897 from BerriAI/docs_add_example_using_anthropic_sdk
Docs add example using anthropic sdk with litellm proxy
2024-07-25 20:11:16 -07:00