Commit graph

15087 commits

Author SHA1 Message Date
Ishaan Jaff
79f409b21e
Merge pull request #4656 from BerriAI/litellm_otel_fix
[Proxy - OTEL] Fix logging DB, Redis Cache Reads
2024-07-11 09:55:51 -07:00
Ishaan Jaff
15d35cd62f test otel 2024-07-11 09:54:23 -07:00
Krrish Dholakia
070ab9f469 docs(model_management.md): update docs to clarify calling /model/info 2024-07-11 09:47:50 -07:00
Krrish Dholakia
8fa2cf15ee fix(watsonx.py): fix watson process response
Fixes https://github.com/BerriAI/litellm/issues/4654
2024-07-11 09:34:46 -07:00
Krrish Dholakia
b5a00722bc docs(assistants.md): add openai-compatible assistants api example to docs 2024-07-11 09:20:55 -07:00
Ishaan Jaff
e2c3b2b694 fix add master key in requests 2024-07-11 09:05:08 -07:00
Krrish Dholakia
57607dfc47 test(test_alangfuse.py): fix test to expect correct response object 2024-07-11 09:00:31 -07:00
Ishaan Jaff
42ae3d494d fix otel_test_config 2024-07-11 08:59:33 -07:00
Ishaan Jaff
5cc725078d ci/cd run otel test 2024-07-11 08:53:38 -07:00
Ishaan Jaff
02ab3cb73d test - otel span recording 2024-07-11 08:47:16 -07:00
Ishaan Jaff
cb6ddaf1f9 test - otel spans 2024-07-11 08:01:18 -07:00
Ishaan Jaff
a5b8db7f3a
Merge pull request #4625 from lowjiansheng/js/update-helm-chart-package
Create helm package and move index.yaml file location
2024-07-11 07:32:44 -07:00
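
A rough sketch of the packaging step described in the entry above; the chart directory and repository URL are assumptions, not values taken from the PR:

```shell
# Build the chart archive (chart directory is an assumed path)
helm package deploy/charts/litellm-helm -d .

# Rebuild index.yaml at its new location; the --url value is an assumed repo URL
helm repo index . --url https://berriai.github.io/litellm
```
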
Ishaan Jaff
19d993e120 add otel in callbacks 2024-07-11 07:24:48 -07:00
Ishaan Jaff
498d7d4228 fix - otel log db / redis calls 2024-07-11 07:22:45 -07:00
Krrish Dholakia
4b0181c79c docs(anthropic_completion.md): add doc on anthropic /v1/messages endpoint support 2024-07-10 22:56:33 -07:00
Krrish Dholakia
03e15d020e bump: version 1.41.16 → 1.41.17 2024-07-10 22:43:26 -07:00
Krish Dholakia
dacce3d78b
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
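
A minimal sketch of calling the new Anthropic-compatible endpoint through the proxy; the base URL, key, and model name below are placeholders, not values from the PR:

```shell
curl -sSL http://0.0.0.0:4000/v1/messages \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "claude-3-haiku-20240307",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```
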
Krrish Dholakia
22bcbac9ea fix: fix linting error 2024-07-10 22:14:23 -07:00
Krrish Dholakia
1019355527 fix(types/utils.py): fix streaming function name 2024-07-10 21:56:47 -07:00
Marc Abramowitz
434953b38c More updates to health.md docs 2024-07-10 19:08:48 -07:00
Marc Abramowitz
6ccd635216 Update doc: health.md 2024-07-10 19:02:48 -07:00
Krrish Dholakia
31829855c0 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Marc Abramowitz
3d86c4f515 Shorter success callbacks from /health/readiness
Before:

```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
  "langfuse",
  "<function _PROXY_track_cost_callback at 0x12fc14b80>",
  "<bound method SlackAlerting.response_taking_too_long_callback of <litellm.integrations.slack_alerting.SlackAlerting object at 0x12cedb740>>",
  "<litellm.proxy.hooks.parallel_request_limiter._PROXY_MaxParallelRequestsHandler object at 0x12cedb8f0>",
  "<litellm.proxy.hooks.max_budget_limiter._PROXY_MaxBudgetLimiter object at 0x12cedb830>",
  "<litellm.proxy.hooks.cache_control_check._PROXY_CacheControlCheck object at 0x12ca101d0>",
  "<litellm._service_logger.ServiceLogging object at 0x13a6d8c50>"
]
```

After:

```shell
$ curl -sSL http://0.0.0.0:4000/health/readiness | jq '.success_callbacks'
[
  "langfuse",
  "_PROXY_track_cost_callback",
  "response_taking_too_long_callback",
  "_PROXY_MaxParallelRequestsHandler",
  "_PROXY_MaxBudgetLimiter",
  "_PROXY_CacheControlCheck",
  "ServiceLogging"
]
```
2024-07-10 18:45:42 -07:00
Ishaan Jaff
b2f4cd9b56 bump: version 1.41.15 → 1.41.16 2024-07-10 18:34:30 -07:00
Krrish Dholakia
2f8dbbeb97 feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
2024-07-10 18:15:38 -07:00
Ishaan Jaff
f49837df19 ci/cd run again 2024-07-10 18:06:09 -07:00
Ishaan Jaff
dc5adbb0ca docs - control allowed ip address 2024-07-10 18:03:54 -07:00
Ishaan Jaff
b4a96e81c6 docs gdpr regions cloud 2024-07-10 18:00:16 -07:00
Ishaan Jaff
da73f3abda docs security cloud litellm 2024-07-10 17:54:54 -07:00
Ishaan Jaff
ca76d2fd72 fix test_completion_bedrock_httpx_models 2024-07-10 17:42:40 -07:00
Ishaan Jaff
9590d63a38 docs - Security Measures 2024-07-10 17:40:30 -07:00
Ishaan Jaff
d0a7983a41 fix try / except langfuse deep copy 2024-07-10 17:22:14 -07:00
Marc Abramowitz
45a9920080 Add/remove blank lines to make MD linter happy
This removed a lot of yellow squigglies in my VS Code.
2024-07-10 17:20:23 -07:00
Ishaan Jaff
7efe9beac5 fix test_bedrock_httpx_streaming 2024-07-10 17:14:53 -07:00
Marc Abramowitz
9dadbe52f7 Shorten title
I think it looks better to have a shorter title in the left sidebar
and then give the longer details in the body of the document.
2024-07-10 17:07:51 -07:00
Marc Abramowitz
3e49cfa6ff Delete manual table of contents
It seems redundant since Docusaurus already generates a table of contents in the
right sidebar.

In fact, the `(BETA) Moderation with Azure Content-Safety` link in the manual
TOC was broken, which shows how easy it is to forget to update the manual TOC
when adding new content or to make a mistake while doing it.
2024-07-10 17:03:47 -07:00
Marc Abramowitz
dd0c07d2a1 Move JSX stuff so first line of file is heading
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
de0dacc42d fix test proxy routes 2024-07-10 16:58:53 -07:00
Ishaan Jaff
7b8670f883 add /files/{file_id}/content as openai route 2024-07-10 16:55:09 -07:00
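
Roughly how the new content route is reachable through the proxy; the file id and key are placeholders:

```shell
curl -sSL http://0.0.0.0:4000/v1/files/file-abc123/content \
  -H "Authorization: Bearer sk-1234"
```
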
Ishaan Jaff
265ec00d0f fix test routes on litellm proxy 2024-07-10 16:51:47 -07:00
Ishaan Jaff
0cb4dabaf0
Merge pull request #4642 from BerriAI/litellm_safe_access_slack
[fix] slack alerting reports - add validation for safe access into attributes
2024-07-10 16:46:08 -07:00
Ishaan Jaff
a313174ecb
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
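
A hedged sketch of the three routes added here, as seen from a client; the base URL, key, and file id are placeholders:

```shell
# LIST files
curl -sSL http://0.0.0.0:4000/v1/files -H "Authorization: Bearer sk-1234"

# GET a single file's metadata (file id is a placeholder)
curl -sSL http://0.0.0.0:4000/v1/files/file-abc123 -H "Authorization: Bearer sk-1234"

# DELETE a file
curl -sSL -X DELETE http://0.0.0.0:4000/v1/files/file-abc123 -H "Authorization: Bearer sk-1234"
```
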
Ishaan Jaff
f34e45db93 test - /file endpoints 2024-07-10 16:36:27 -07:00
Marc Abramowitz
982603714e Add docs 2024-07-10 16:32:05 -07:00
Ishaan Jaff
a5f12aba81 add file content 2024-07-10 16:15:43 -07:00
Ishaan Jaff
cb300d30a9 add file delete path 2024-07-10 16:08:58 -07:00
Marc Abramowitz
3a2cb151aa Proxy: Add x-litellm-call-id response header
This exposes the value of `logging_obj.litellm_call_id`; one particular use of
this is to correlate the HTTP response from a request with a trace in an LLM
logging tool like Langfuse, Langsmith, etc.

For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:

```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e

They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
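
One way to surface the header described in the entry above, assuming a proxy on the usual local port with a placeholder key and model:

```shell
curl -isS http://0.0.0.0:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]}' \
  | grep -i '^x-litellm-call-id'
```
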
Ishaan Jaff
0b39ec6a8e add /v1/files LIST 2024-07-10 16:04:07 -07:00
Ishaan Jaff
a741586519 test openai files endpoints 2024-07-10 15:54:55 -07:00
Ishaan Jaff
ef3bd2df22 support list files on litellm SDK 2024-07-10 15:51:28 -07:00