Ishaan Jaff
8559bcc252
DB Transaction Queue Health Metrics
2025-04-04 21:16:12 -07:00
Ishaan Jaff
1cdee4b331
Merge branch 'main' into litellm_metrics_pod_lock_manager
2025-04-04 16:33:16 -07:00
Krrish Dholakia
bdad9961e3
docs: cleanup
2025-04-03 22:12:51 -07:00
Krrish Dholakia
abea69352a
docs(document_understanding.md): Fix https://github.com/BerriAI/litellm/issues/9704
2025-04-03 22:12:31 -07:00
Ishaan Jaff
44b34299a8
docs db deadlocks
2025-04-02 23:14:55 -07:00
Ishaan Jaff
82b8eb79c2
doc update
2025-04-02 23:11:22 -07:00
Ishaan Jaff
5222cce510
Merge branch 'main' into litellm_metrics_pod_lock_manager
2025-04-02 21:04:44 -07:00
Ishaan Jaff
acf920a41a
Merge branch 'main' into litellm_fix_azure_o_series
2025-04-02 20:58:52 -07:00
Ishaan Jaff
3eb6c1f2f7
Merge pull request #9708 from BerriAI/dependabot/npm_and_yarn/docs/my-website/image-size-1.2.1
...
Bump image-size from 1.1.1 to 1.2.1 in /docs/my-website
2025-04-02 20:58:16 -07:00
Ishaan Jaff
7b768ed909
doc fix sso login url
2025-04-02 18:38:33 -07:00
Ishaan Jaff
68ce0b111e
Setup on LiteLLM config
2025-04-02 13:41:16 -07:00
Ishaan Jaff
6ab1eba7b6
doc High Availability Setup
2025-04-02 13:38:49 -07:00
Ishaan Jaff
b48b8366c2
docs new deadlock fixing architecture
2025-04-02 13:24:53 -07:00
Ishaan Jaff
3f52a4df32
docs allowed openai params
2025-04-02 09:08:11 -07:00
Krish Dholakia
053b0e741f
Add Google AI Studio /v1/files upload API support (#9645)
...
* test: fix import for test
* fix: fix bad error string
* docs: cleanup files docs
* fix(files/main.py): cleanup error string
* style: initial commit with a provider/config pattern for files api
google ai studio files api onboarding
* fix: test
* feat(gemini/files/transformation.py): support gemini files api response transformation
* fix(gemini/files/transformation.py): return file id as gemini uri
allows id to be passed in to chat completion request, just like openai
* feat(llm_http_handler.py): support async route for files api on llm_http_handler
* fix: fix linting errors
* fix: fix model info check
* fix: fix ruff errors
* fix: fix linting errors
* Revert "fix: fix linting errors"
This reverts commit 926a5a527f.
* fix: fix linting errors
* test: fix test
* test: fix tests
2025-04-02 08:56:58 -07:00
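The commit above wires Google AI Studio (Gemini) into LiteLLM's files API and returns the file id as a Gemini URI so it can be referenced in a chat completion. A rough sketch of that flow follows; the function and parameter names (litellm.create_file, custom_llm_provider="gemini", the "file" content part, the example model name) are inferred from the commit messages, not verified against the released API.

```python
# Rough sketch only -- names below (litellm.create_file, custom_llm_provider="gemini",
# the "file" message content part) are inferred from the commit messages above
# and may differ from the actual released API.
import litellm

# Upload a file to Google AI Studio through LiteLLM's files API.
uploaded = litellm.create_file(
    file=open("report.pdf", "rb"),
    purpose="user_data",
    custom_llm_provider="gemini",
)

# Per the commit notes, the returned id is the Gemini file URI, so it can be
# passed into a chat completion request just like an OpenAI file id.
response = litellm.completion(
    model="gemini/gemini-1.5-flash",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize this document."},
                {"type": "file", "file": {"file_id": uploaded.id}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```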
dependabot[bot]
7255c8e94a
Bump image-size from 1.1.1 to 1.2.1 in /docs/my-website
...
Bumps [image-size](https://github.com/image-size/image-size) from 1.1.1 to 1.2.1.
- [Release notes](https://github.com/image-size/image-size/releases)
- [Commits](https://github.com/image-size/image-size/compare/v1.1.1...v1.2.1)
---
updated-dependencies:
- dependency-name: image-size
  dependency-type: indirect
...
Signed-off-by: dependabot[bot] <support@github.com>
2025-04-02 15:10:45 +00:00
Krrish Dholakia
d32cf141f5
docs: update docs
2025-04-02 07:58:45 -07:00
Tomer Bin
0690f7a3cb
Virtual key based policies in Aim Guardrails (#9499)
...
* report key alias to aim
* send litellm version to aim
* Update docs
* blacken
* add docs
* Add info part about virtual keys specific guards
* sort guardrails alphabetically
* fix ruff
2025-04-01 21:57:23 -07:00
Krrish Dholakia
40a792472b
build(enterprise.md): add why enterprise to docs
2025-04-01 11:27:03 -07:00
Krrish Dholakia
b0fa934fe3
docs(anthropic.md): update docs with file message usage
2025-03-31 22:58:51 -07:00
Ishaan Jaff
bc5cc51b9d
Merge pull request #9567 from BerriAI/litellm_anthropic_messages_improvements
...
[Refactor] - Expose litellm.messages.acreate() and litellm.messages.create() to make LLM API calls in Anthropic API spec
2025-03-31 20:50:30 -07:00
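The PR above exposes litellm.messages.acreate() and litellm.messages.create() for calls in the Anthropic API spec. A minimal sketch of what a call might look like, assuming the parameter names follow Anthropic's /v1/messages spec (model, messages, max_tokens); the exact signature is an assumption.

```python
# Minimal sketch of the interface named in the PR title above; the exact
# signature is assumed to follow the Anthropic /v1/messages spec.
import asyncio
import litellm

async def main():
    response = await litellm.messages.acreate(
        model="claude-3-5-sonnet-20241022",
        max_tokens=256,
        messages=[{"role": "user", "content": "Hello, Claude"}],
    )
    print(response)

asyncio.run(main())
```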
Ishaan Jaff
f54105faf8
Merge pull request #9562 from KPCOFGS/main
...
Update all_caches.md
2025-03-31 16:06:44 -07:00
Ishaan Jaff
0719d399a7
Merge pull request #9581 from GabrielLoiseau/main
...
docs(gemini): fix typo
2025-03-31 16:06:10 -07:00
Ishaan Jaff
de9565dccf
Merge pull request #9286 from colesmcintosh/xai-vision-model-docs-update
...
fix(docs): update xAI Grok vision model reference
2025-03-31 15:49:06 -07:00
Ishaan Jaff
b8c0526b98
docs anthropic messages endpoint
2025-03-31 15:28:40 -07:00
Shixian Sheng
63e9ac5d04
Merge branch 'BerriAI:main' into main
2025-03-30 06:53:09 -04:00
Krrish Dholakia
17ad8a0417
docs: cleanup docs
2025-03-30 00:40:23 -07:00
Krrish Dholakia
69db775e73
docs(vertex.md): update docs to show 'file' message usage
2025-03-30 00:28:45 -07:00
Ishaan Jaff
31082344a4
docs release notes
2025-03-29 23:08:20 -07:00
Ishaan Jaff
df01337bd8
docs litellm mcp
2025-03-29 22:28:03 -07:00
Ishaan Jaff
cc80370e0c
docs mcp litellm
2025-03-29 21:59:58 -07:00
Ishaan Jaff
366f3a901c
docs mcp
2025-03-29 21:46:18 -07:00
Ishaan Jaff
db12adb3db
docs mcp tools
2025-03-29 20:42:14 -07:00
Ishaan Jaff
2238a5585b
doc fix mcp
2025-03-29 20:35:53 -07:00
Ishaan Jaff
8c2f1c6142
docs 1.65.0-stable
2025-03-29 20:03:05 -07:00
Ishaan Jaff
7673293c9f
docs update stable release
2025-03-29 20:01:00 -07:00
Ishaan Jaff
9b187d89f2
add litellm model name on SLP
2025-03-29 19:55:41 -07:00
Ishaan Jaff
1f10d985fb
docs updates release notes
2025-03-29 19:51:38 -07:00
Ishaan Jaff
e9b743e3c6
docs release notes
2025-03-29 19:50:13 -07:00
Ishaan Jaff
f3b72858e8
docs release notes
2025-03-29 19:08:23 -07:00
Ishaan Jaff
d04cc6c81e
docs release notes
2025-03-29 19:03:55 -07:00
Ishaan Jaff
46e5ebe3c7
docs add mcp graphic to stable release notes
2025-03-29 19:00:26 -07:00
Ishaan Jaff
22f9a93e25
docs stable release notes
2025-03-29 18:21:42 -07:00
Ishaan Jaff
cb83584c0e
add Complete Git Diff
2025-03-29 18:09:54 -07:00
Ishaan Jaff
8ea6caeff0
docs fix release notes
2025-03-29 18:06:36 -07:00
Tan Yong Sheng
abf06013ec
update docs for openwebui (#9636)
2025-03-29 17:40:27 -07:00
Krish Dholakia
1604f87663
install prisma migration files - connects litellm proxy to litellm's prisma migration files (#9637)
...
* build(README.md): initial commit adding a separate folder for additional proxy files. Meant to reduce size of core package
* build(litellm-proxy-extras/): new pip package for storing migration files
allows litellm proxy to use migration files, without adding them to core repo
* build(litellm-proxy-extras/): cleanup pyproject.toml
* build: move prisma migration files inside new proxy extras package
* build(run_migration.py): update script to write to correct folder
* build(proxy_cli.py): load in migration files from litellm-proxy-extras
Closes https://github.com/BerriAI/litellm/issues/9558
* build: add MIT license to litellm-proxy-extras
* test: update test
* fix: fix schema
* bump: version 0.1.0 → 0.1.1
* build(publish-proxy-extras.sh): add script for publishing new proxy-extras version
* build(liccheck.ini): add litellm-proxy-extras to authorized packages
* fix(litellm-proxy-extras/utils.py): move prisma migrate logic inside extra proxy pkg
easier since migrations folder already there
* build(pre-commit-config.yaml): add litellm_proxy_extras to ci tests
* docs(config_settings.md): document new env var
* build(pyproject.toml): bump relevant files when litellm-proxy-extras version changed
* build(pre-commit-config.yaml): run poetry check on litellm-proxy-extras as well
2025-03-29 15:27:09 -07:00
Krrish Dholakia
2fcfabd66f
docs(bedrock.md): clarify version on docs
2025-03-29 00:31:35 -07:00
Krrish Dholakia
cac60d5091
docs(bedrock.md): add latency optimized inference to docs
2025-03-29 00:31:06 -07:00
Krish Dholakia
5ac61a7572
Add bedrock latency optimized inference support (#9623)
...
* fix(converse_transformation.py): add performanceConfig param support on bedrock
Closes https://github.com/BerriAI/litellm/issues/7606
* fix(converse_transformation.py): refactor to use more flexible single getter for params which are separate config blocks
* test(test_main.py): add e2e mock test for bedrock performance config
* build(model_prices_and_context_window.json): add versioned multimodal embedding
* refactor(multimodal_embeddings/): migrate to config pattern
* feat(vertex_ai/multimodalembeddings): calculate usage for multimodal embedding calls
Enables cost calculation for multimodal embeddings
* feat(vertex_ai/multimodalembeddings): get usage object for embedding calls
ensures accurate cost tracking for vertexai multimodal embedding calls
* fix(embedding_handler.py): remove unused imports
* fix: fix linting errors
* fix: handle response api usage calculation
* test(test_vertex_ai_multimodal_embedding_transformation.py): update tests
* test: mark flaky test
* feat(vertex_ai/multimodal_embeddings/transformation.py): support text+image+video input
* docs(vertex.md): document sending text + image to vertex multimodal embeddings
* test: remove incorrect file
* fix(multimodal_embeddings/transformation.py): fix linting error
* style: remove unused import
2025-03-29 00:23:09 -07:00
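The commit above adds performanceConfig support on Bedrock's Converse API. A hedged sketch of how the setting might be passed through litellm.completion, assuming LiteLLM forwards the Converse performanceConfig block as a provider-specific keyword argument; the example model name is illustrative only.

```python
# Hedged sketch: assumes LiteLLM forwards Bedrock's Converse performanceConfig
# block as a provider-specific keyword argument (per commit #9623 above).
import litellm

response = litellm.completion(
    model="bedrock/anthropic.claude-3-5-haiku-20241022-v1:0",
    messages=[{"role": "user", "content": "Give me a one-line status update."}],
    # Bedrock latency-optimized inference setting.
    performanceConfig={"latency": "optimized"},
)
print(response.choices[0].message.content)
```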