Ishaan Jaff
|
02fe7cfb3a
|
ui new build
|
2024-07-13 14:38:13 -07:00 |
|
Krrish Dholakia
|
e82616342b
|
test(test_pass_through_endpoints.py): add test for rpm limit support
|
2024-07-13 13:49:20 -07:00 |
|
Krrish Dholakia
|
1d6643df22
|
feat(pass_through_endpoint.py): support enforcing key rpm limits on pass through endpoints
Closes https://github.com/BerriAI/litellm/issues/4698
|
2024-07-13 13:29:44 -07:00 |
|
Ishaan Jaff
|
b7387156c8
|
delete updated / deleted values from cache
|
2024-07-13 13:16:57 -07:00 |
|
Ishaan Jaff
|
f2b0929284
|
correctly clear cache when updating a user
|
2024-07-13 12:33:43 -07:00 |
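The two cache commits above fix stale reads by deleting updated/deleted values from the cache. A toy sketch of that read-through-cache-with-eviction pattern (hypothetical names, not the proxy's actual code):

```python
class UserStore:
    """Toy user store with a read-through cache; updates evict the cached entry."""

    def __init__(self):
        self.db = {}     # stand-in for the database
        self.cache = {}  # stand-in for the in-memory/Redis cache

    def get_user(self, user_id):
        if user_id not in self.cache:
            self.cache[user_id] = self.db.get(user_id)  # populate on miss
        return self.cache[user_id]

    def update_user(self, user_id, data):
        self.db[user_id] = data
        # Evict rather than overwrite, so the next read re-fetches from the DB
        self.cache.pop(user_id, None)

store = UserStore()
store.db["user-1"] = {"max_budget": 10}
first = store.get_user("user-1")
store.update_user("user-1", {"max_budget": 20})
second = store.get_user("user-1")
```

Without the `pop` in `update_user`, `second` would still show the pre-update value.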
|
Ishaan Jaff
|
8cce7d2df1
|
use wrapper on /user endpoints
|
2024-07-13 12:29:15 -07:00 |
|
Krrish Dholakia
|
6641683d66
|
feat(guardrails.py): allow setting logging_only in guardrails_config for presidio pii masking integration
|
2024-07-13 12:22:17 -07:00 |
|
Ishaan Jaff
|
2505dcf530
|
correctly flush cache when updating user
|
2024-07-13 12:05:09 -07:00 |
|
Krish Dholakia
|
e628171d82
|
Merge pull request #4686 from BerriAI/litellm_custom_chat_endpoints
docs(pass_through.md): Creating custom chat endpoints on proxy
|
2024-07-13 09:45:17 -07:00 |
|
Ishaan Jaff
|
f373ac90f6
|
Merge pull request #4685 from BerriAI/litellm_return_type_expired_key
[Fix] Proxy Return type=expire_key on expired Key errors
|
2024-07-12 18:52:51 -07:00 |
|
Krrish Dholakia
|
c8a2782df8
|
docs(pass_through.md): add doc on creating custom chat endpoints on proxy
Allows developers to call proxy with anthropic sdk/boto3/etc.
|
2024-07-12 18:48:40 -07:00 |
|
Ishaan Jaff
|
d76a09681f
|
raise ProxyErrorTypes.expired_key on expired key
|
2024-07-12 18:41:39 -07:00 |
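The two commits above tag expired-key failures with a distinct error type so clients can branch on it. A sketch of that pattern, where only the `ProxyErrorTypes.expired_key` name comes from the commit messages and the exception class is illustrative:

```python
from enum import Enum

class ProxyErrorTypes(str, Enum):
    expired_key = "expired_key"
    auth_error = "auth_error"  # illustrative second member

class ProxyException(Exception):
    """Illustrative typed error carrying a machine-readable 'type' field."""

    def __init__(self, message: str, type: ProxyErrorTypes):
        super().__init__(message)
        self.message = message
        self.type = type

def check_key(expired: bool) -> None:
    if expired:
        raise ProxyException(
            message="Authentication Error - Expired Key",
            type=ProxyErrorTypes.expired_key,
        )

try:
    check_key(expired=True)
except ProxyException as e:
    error_type = e.type.value
```

Because the enum mixes in `str`, the type serializes cleanly into a JSON error body.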
|
Ishaan Jaff
|
f32ecd2238
|
raise expired_key error
|
2024-07-12 18:39:00 -07:00 |
|
Ishaan Jaff
|
1adff9cbd6
|
Merge pull request #4684 from BerriAI/litellm_safe_memory_mode
[Feat] Allow safe memory mode
|
2024-07-12 18:32:16 -07:00 |
|
Ishaan Jaff
|
fc7b1d78e2
|
Merge pull request #4682 from BerriAI/litellm_mem_leak_debug
show stack trace of 10 files taking up memory
|
2024-07-12 18:31:41 -07:00 |
|
Ishaan Jaff
|
ad93e940fc
|
Merge pull request #4681 from BerriAI/litellm_mem_usage
[Fix] Reduce Mem Usage - only set ttl for requests to 2 mins
|
2024-07-12 18:31:19 -07:00 |
|
Ishaan Jaff
|
c43948545f
|
feat add safe_memory_mode
|
2024-07-12 18:18:39 -07:00 |
|
Krrish Dholakia
|
d2a0977af7
|
feat(opentelemetry.py): support logging call metadata to otel
|
2024-07-12 15:41:34 -07:00 |
|
Ishaan Jaff
|
946d48d286
|
show stack trace of 10 files taking up memory
|
2024-07-12 15:33:03 -07:00 |
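Python's standard-library `tracemalloc` supports exactly this kind of "top 10 allocation sites" report; a generic sketch of the technique the commit describes (not the project's actual debug code):

```python
import tracemalloc

tracemalloc.start()
data = [b"x" * 1024 for _ in range(100)]  # allocate something to observe

snapshot = tracemalloc.take_snapshot()
# Statistics grouped by file/line, largest first; keep the 10 biggest
top_stats = snapshot.statistics("lineno")[:10]
for stat in top_stats:
    print(stat)  # e.g. "<file>:<line>: size=100 KiB, count=100, ..."

tracemalloc.stop()
```

Running this periodically (or behind a debug endpoint) makes it easy to spot which files accumulate memory over time.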
|
Ishaan Jaff
|
bc7b3f28b9
|
reduce ttl for update_request_status
|
2024-07-12 15:14:54 -07:00 |
|
Krrish Dholakia
|
d4ba87840e
|
fix(proxy_server.py): fix linting errors
|
2024-07-11 22:12:33 -07:00 |
|
Krish Dholakia
|
35a17b7d99
|
Merge pull request #4669 from BerriAI/litellm_logging_only_masking
Flag for PII masking on Logging only
|
2024-07-11 22:03:37 -07:00 |
|
Krish Dholakia
|
8a4c428a7c
|
Merge branch 'main' into litellm_call_id_in_response
|
2024-07-11 21:54:49 -07:00 |
|
Krish Dholakia
|
12e0f50812
|
Merge pull request #4651 from msabramo/docs-logging-cleanup
Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md`
|
2024-07-11 21:52:20 -07:00 |
|
Ishaan Jaff
|
2eef673ca8
|
ui new build
|
2024-07-11 19:13:08 -07:00 |
|
Krrish Dholakia
|
1a57e49e46
|
fix(presidio_pii_masking.py): support logging_only pii masking
|
2024-07-11 18:04:12 -07:00 |
|
Krrish Dholakia
|
abd682323c
|
feat(guardrails): Flag for PII Masking on Logging
Fixes https://github.com/BerriAI/litellm/issues/4580
|
2024-07-11 16:09:34 -07:00 |
|
Ishaan Jaff
|
92228d9104
|
Merge pull request #4647 from msabramo/msabramo/remove-unnecessary-imports
Remove unnecessary imports
|
2024-07-11 15:07:30 -07:00 |
|
Ishaan Jaff
|
bf50c8e087
|
Merge pull request #4661 from BerriAI/litellm_fix_mh
[Fix] Model Hub - Show supports vision correctly
|
2024-07-11 15:03:37 -07:00 |
|
Ishaan Jaff
|
a16cd02cd9
|
fix supports vision
|
2024-07-11 12:59:42 -07:00 |
|
Krrish Dholakia
|
3f965df68b
|
fix(llm_cost_calc/google.py): fix google embedding cost calculation
Fixes https://github.com/BerriAI/litellm/issues/4630
|
2024-07-11 11:55:48 -07:00 |
|
Ishaan Jaff
|
db7d417727
|
Merge pull request #4658 from BerriAI/litellm_check_otel_spans
[Test-Proxy] Otel Traces
|
2024-07-11 10:41:51 -07:00 |
|
Ishaan Jaff
|
daeafc40f9
|
Merge pull request #4652 from msabramo/shorter-success_callbacks-in-health-readiness-response
Shorter success callbacks from `/health/readiness`
|
2024-07-11 09:57:52 -07:00 |
|
Ishaan Jaff
|
ad7ff0d188
|
Merge pull request #4656 from BerriAI/litellm_otel_fix
[Proxy - OTEL] Fix logging DB, Redis Cache Reads
|
2024-07-11 09:55:51 -07:00 |
|
Ishaan Jaff
|
dcaffb2a1e
|
test otel
|
2024-07-11 09:54:23 -07:00 |
|
Krrish Dholakia
|
7b38278e69
|
docs(model_management.md): update docs to clarify calling /model/info
|
2024-07-11 09:47:50 -07:00 |
|
Ishaan Jaff
|
07a54faab1
|
fix add master key in requests
|
2024-07-11 09:05:08 -07:00 |
|
Ishaan Jaff
|
97da1252f1
|
test- otel span recording
|
2024-07-11 08:47:16 -07:00 |
|
Ishaan Jaff
|
7d4407132a
|
add otel in callbacks
|
2024-07-11 07:24:48 -07:00 |
|
Ishaan Jaff
|
90ece85862
|
fix - otel log db / redis calls
|
2024-07-11 07:22:45 -07:00 |
|
Krish Dholakia
|
f4d140efec
|
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
|
2024-07-10 22:41:53 -07:00 |
|
Krrish Dholakia
|
48be4ce805
|
feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
|
2024-07-10 18:53:54 -07:00 |
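The `/v1/messages` work above lets Anthropic-format clients talk to the proxy. A sketch of the request body such a client POSTs to `<proxy_base>/v1/messages` (the payload shape follows Anthropic's Messages API; the model name and key are placeholders):

```python
import json

# Anthropic Messages API request body, as a client would send it to the proxy
payload = {
    "model": "claude-3-opus-20240229",  # placeholder model name
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Hello, world"},
    ],
}
headers = {
    "x-api-key": "sk-1234",              # the proxy/virtual key, not a real Anthropic key
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}
body = json.dumps(payload)
```

Because the wire format matches Anthropic's, existing SDKs only need their base URL pointed at the proxy.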
|
Marc Abramowitz
|
9d2cfe6933
|
Shorter success callbacks from /health/readiness
Before:
```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
"langfuse",
"<function _PROXY_track_cost_callback at 0x12fc14b80>",
"<bound method SlackAlerting.response_taking_too_long_callback of <litellm.integrations.slack_alerting.SlackAlerting object at 0x12cedb740>>",
"<litellm.proxy.hooks.parallel_request_limiter._PROXY_MaxParallelRequestsHandler object at 0x12cedb8f0>",
"<litellm.proxy.hooks.max_budget_limiter._PROXY_MaxBudgetLimiter object at 0x12cedb830>",
"<litellm.proxy.hooks.cache_control_check._PROXY_CacheControlCheck object at 0x12ca101d0>",
"<litellm._service_logger.ServiceLogging object at 0x13a6d8c50>"
]
```
After:
```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
"langfuse",
"_PROXY_track_cost_callback",
"response_taking_too_long_callback",
"_PROXY_MaxParallelRequestsHandler",
"_PROXY_MaxBudgetLimiter",
"_PROXY_CacheControlCheck",
"ServiceLogging"
]
```
|
2024-07-10 18:45:42 -07:00 |
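The before/after output above suggests each callback object is mapped to a short name before being returned. One way to do that in Python (a sketch consistent with the output shown, not necessarily the actual implementation):

```python
def callback_name(cb) -> str:
    """Short, human-readable name for a callback: strings pass through,
    functions and bound methods use __name__, other objects fall back to
    their class name."""
    if isinstance(cb, str):
        return cb
    name = getattr(cb, "__name__", None)
    if name:
        return name
    return cb.__class__.__name__

# Illustrative stand-ins for the callback objects in the output above
def _PROXY_track_cost_callback():
    pass

class _PROXY_MaxBudgetLimiter:
    pass

names = [
    callback_name(c)
    for c in ("langfuse", _PROXY_track_cost_callback, _PROXY_MaxBudgetLimiter())
]
```

This turns `repr()` strings with memory addresses into stable names, which also makes the endpoint's output diff-friendly.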
|
Krrish Dholakia
|
4ba30abb63
|
feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
|
2024-07-10 18:15:38 -07:00 |
|
Marc Abramowitz
|
2ce0edcca9
|
Move JSX stuff so first line of file is heading
This prevents VS Code from displaying a warning about the file not starting with
a heading.
|
2024-07-10 17:02:56 -07:00 |
|
Ishaan Jaff
|
b612488b34
|
add /files/{file_id}/content as openai route
|
2024-07-10 16:55:09 -07:00 |
|
Ishaan Jaff
|
cc4434192f
|
fix test routes on litellm proxy
|
2024-07-10 16:51:47 -07:00 |
|
Ishaan Jaff
|
4675983f42
|
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
|
2024-07-10 16:42:05 -07:00 |
|
Ishaan Jaff
|
fd270166fd
|
add file content
|
2024-07-10 16:15:43 -07:00 |
|
Ishaan Jaff
|
c22da9ab0d
|
add file delete path
|
2024-07-10 16:08:58 -07:00 |
|