Ishaan Jaff
daeafc40f9
Merge pull request #4652 from msabramo/shorter-success_callbacks-in-health-readiness-response
...
Shorter success callbacks from `/health/readiness`
2024-07-11 09:57:52 -07:00
Ishaan Jaff
ad7ff0d188
Merge pull request #4656 from BerriAI/litellm_otel_fix
...
[Proxy - OTEL] Fix logging DB, Redis Cache Reads
2024-07-11 09:55:51 -07:00
Ishaan Jaff
dcaffb2a1e
test otel
2024-07-11 09:54:23 -07:00
Krrish Dholakia
7b38278e69
docs(model_management.md): update docs to clarify calling /model/info
2024-07-11 09:47:50 -07:00
Ishaan Jaff
07a54faab1
fix add master key in requests
2024-07-11 09:05:08 -07:00
Ishaan Jaff
97da1252f1
test- otel span recording
2024-07-11 08:47:16 -07:00
Ishaan Jaff
7d4407132a
add otel in callbacks
2024-07-11 07:24:48 -07:00
Ishaan Jaff
90ece85862
fix - otel log db / redis calls
2024-07-11 07:22:45 -07:00
Krish Dholakia
f4d140efec
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
...
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
48be4ce805
feat(proxy_server.py): working /v1/messages with config.yaml
...
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Marc Abramowitz
9d2cfe6933
Shorter success callbacks from /health/readiness
...
Before:
```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
"langfuse",
"<function _PROXY_track_cost_callback at 0x12fc14b80>",
"<bound method SlackAlerting.response_taking_too_long_callback of <litellm.integrations.slack_alerting.SlackAlerting object at 0x12cedb740>>",
"<litellm.proxy.hooks.parallel_request_limiter._PROXY_MaxParallelRequestsHandler object at 0x12cedb8f0>",
"<litellm.proxy.hooks.max_budget_limiter._PROXY_MaxBudgetLimiter object at 0x12cedb830>",
"<litellm.proxy.hooks.cache_control_check._PROXY_CacheControlCheck object at 0x12ca101d0>",
"<litellm._service_logger.ServiceLogging object at 0x13a6d8c50>"
]
```
After:
```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
"langfuse",
"_PROXY_track_cost_callback",
"response_taking_too_long_callback",
"_PROXY_MaxParallelRequestsHandler",
"_PROXY_MaxBudgetLimiter",
"_PROXY_CacheControlCheck",
"ServiceLogging"
]
```
2024-07-10 18:45:42 -07:00
Krrish Dholakia
4ba30abb63
feat(proxy_server.py): working /v1/messages endpoint
...
Works with claude engineer
2024-07-10 18:15:38 -07:00
Marc Abramowitz
2ce0edcca9
Move JSX stuff so first line of file is heading
...
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
b612488b34
add /files/{file_id}/content as openai route
2024-07-10 16:55:09 -07:00
Ishaan Jaff
cc4434192f
fix test routes on litellm proxy
2024-07-10 16:51:47 -07:00
Ishaan Jaff
4675983f42
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
...
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Ishaan Jaff
fd270166fd
add file content
2024-07-10 16:15:43 -07:00
Ishaan Jaff
c22da9ab0d
add file delete path
2024-07-10 16:08:58 -07:00
Marc Abramowitz
be4b7629b5
Proxy: Add x-litellm-call-id response header
...
This returns the value of `logging_obj.litellm_call_id`; one particular use is
correlating the HTTP response for a request with a trace in an LLM logging tool
such as Langfuse or Langsmith.
For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:
```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```
then they can view the trace for this request in Langfuse by visiting
https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e
They can also use this ID to submit scores for this request to the Langfuse
scoring API.
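The correlation described above can be sketched as a small helper that maps a proxy response's headers to a trace link. This is a hedged illustration: `trace_url` is a hypothetical helper, and the Langfuse base URL is a placeholder for your own deployment, matching the example domain above.

```python
# Illustrative sketch: build a Langfuse trace link from the x-litellm-call-id
# response header. `trace_url` is hypothetical, not part of litellm; the base
# URL is a placeholder for your own Langfuse deployment.
from typing import Optional

LANGFUSE_BASE = "https://langfuse.domain.com"


def trace_url(response_headers: dict) -> Optional[str]:
    """Map a proxy response's headers to the corresponding Langfuse trace URL."""
    call_id = response_headers.get("x-litellm-call-id")
    if call_id is None:
        return None
    return f"{LANGFUSE_BASE}/trace/{call_id}"


headers = {"x-litellm-call-id": "ffcb49e7-bd6e-4e56-9c08-a7243802b26e"}
print(trace_url(headers))
# https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

The same call ID could also be passed to Langfuse's scoring API to attach scores to the request's trace.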
2024-07-10 16:05:37 -07:00
Ishaan Jaff
9e71006d07
add /v1/files LIST
2024-07-10 16:04:07 -07:00
Marc Abramowitz
416bca4a3f
Remove unnecessary imports
...
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
0f39e09913
add "/v1/files/{file_id}" as openai route
2024-07-10 14:56:53 -07:00
Ishaan Jaff
f118123ae1
add /files endpoints
2024-07-10 14:55:10 -07:00
Krrish Dholakia
3f4f5ae994
fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key
2024-07-10 12:25:31 -07:00
Ishaan Jaff
3ae7dcccb5
add "/v1/assistants/{assistant_id}", as openai route
2024-07-10 11:42:02 -07:00
Ishaan Jaff
a9e15dad62
feat - add DELETE assistants endpoint
2024-07-10 11:37:37 -07:00
Krrish Dholakia
01a335b4c3
feat(anthropic_adapter.py): support for translating anthropic params to openai format
2024-07-10 00:32:28 -07:00
Krrish Dholakia
d66a48b3d1
style(litellm_license.py): add debug statement for litellm license
2024-07-09 22:43:33 -07:00
Ishaan Jaff
49f8894dcc
fix show exact prisma exception when starting proxy
2024-07-09 18:20:09 -07:00
Ishaan Jaff
eb43343643
ui new build
2024-07-09 16:33:00 -07:00
Ishaan Jaff
a784b54245
feat - add mgtm endpoint routes
2024-07-09 15:29:41 -07:00
Ishaan Jaff
41e7f96c80
ui - add Create, get, delete endpoints for IP Addresses
2024-07-09 15:12:08 -07:00
Krrish Dholakia
789d2dab15
fix(vertex_httpx.py): add sync vertex image gen support
...
Fixes https://github.com/BerriAI/litellm/issues/4623
2024-07-09 13:33:54 -07:00
Ishaan Jaff
6000687601
Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
...
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
a7a6567da4
test /threads endpoint
2024-07-09 12:17:42 -07:00
Ishaan Jaff
6891b29444
fix - use helper to check if a route is openai route
2024-07-09 12:00:07 -07:00
Ishaan Jaff
c8a15ab83e
add helper to check is_openai_route
2024-07-09 11:50:12 -07:00
Ishaan Jaff
b6e9fe384c
fix add assistant settings on config
2024-07-09 10:05:32 -07:00
Ishaan Jaff
bce7b5f8c8
feat - support /create assistants endpoint
2024-07-09 10:03:47 -07:00
Ishaan Jaff
0f43869706
feat - support acreate_assistants endpoint
2024-07-09 09:49:38 -07:00
Krrish Dholakia
932814aa1c
docs(configs.md): add ip address filtering to docs
2024-07-08 21:59:26 -07:00
Krish Dholakia
4bd65aac40
Merge pull request #4615 from BerriAI/litellm_user_api_key_auth
...
Enable `allowed_ip's` for proxy
2024-07-08 17:35:11 -07:00
Krrish Dholakia
aecbc98b9b
fix(proxy_cli.py): bump default azure api version
2024-07-08 16:28:22 -07:00
Ishaan Jaff
519a083e09
ui new build
2024-07-08 16:19:25 -07:00
Krrish Dholakia
0ecf94d32e
fix(proxy_server.py): add license protection for 'allowed_ip' address feature
2024-07-08 16:04:44 -07:00
Krrish Dholakia
f982e93d24
feat(user_api_key_auth.py): allow restricting calls by IP address
...
Allows admin to restrict which IP addresses can make calls to the proxy
2024-07-08 15:58:15 -07:00
Ishaan Jaff
449dd4c78c
Merge pull request #4611 from BerriAI/litellm_fix_assistants_routes
...
[Proxy-Fix]: Add /assistants, /threads as OpenAI routes
2024-07-08 15:16:11 -07:00
Ishaan Jaff
f837e6e8d1
fix routes on assistants endpoints
2024-07-08 15:02:12 -07:00
Andres Guzman
e0ac480584
fix(utils.py): change update to upsert
2024-07-08 15:49:29 -06:00
Ishaan Jaff
170150e02e
use ProxyErrorTypes,
2024-07-08 12:47:53 -07:00