Author | Commit | Message | Date
Ishaan Jaff | a8cf71973d | docs mcp docs update | 2025-03-21 17:21:40 -07:00
Ishaan Jaff | b006e325cc | add experimental mcp client | 2025-03-21 17:07:15 -07:00
Ishaan Jaff | 4be6cccdd3 | doc mcp example | 2025-03-21 17:06:47 -07:00
Ishaan Jaff | 91cbaa48a9 | litellm mcp bridge docs | 2025-03-21 17:00:56 -07:00
Ishaan Jaff | 0c8415e618 | add diagram for litellm mcp integration | 2025-03-21 16:54:55 -07:00
Ishaan Jaff | fa94be9598 | docs litellm mcp bridge | 2025-03-21 16:50:30 -07:00
Ishaan Jaff | 4eb63832aa | clean up | 2025-03-21 14:39:05 -07:00
Ishaan Jaff | 0b021b8334 | call_openai_tool on MCP client | 2025-03-21 14:36:32 -07:00
Ishaan Jaff | bde703b90c | fix beta caps | 2025-03-21 10:53:10 -07:00
Ishaan Jaff | 7d606f0b75 | fix llm responses | 2025-03-21 10:50:55 -07:00
Ishaan Jaff | cd6055cfb0 | transform_mcp_tool_to_openai_tool | 2025-03-21 10:49:06 -07:00
Ishaan Jaff | 6e888474e1 | litellm MCP client 1 | 2025-03-21 10:32:51 -07:00
Ishaan Jaff | 039129676c | change location of MCP client | 2025-03-21 10:30:57 -07:00
Ishaan Jaff | 50ec2bd5c9 | basic MCP client structure | 2025-03-21 10:25:22 -07:00
Ishaan Jaff | ea8ef3a0d6 | simple MCP interface | 2025-03-21 10:11:06 -07:00
Ishaan Jaff | b2852676f6 | Merge pull request #9435 from BerriAI/litellm_mcp_client_interface: [Docs - Draft] LiteLLM x MCP Interface | 2025-03-21 09:38:38 -07:00
Ishaan Jaff | 8d0d85577c | docs litellm mcp | 2025-03-21 09:37:04 -07:00
Ishaan Jaff | 2983c9238e | docs litellm mcp | 2025-03-21 09:34:38 -07:00
Ishaan Jaff | 941fe82e19 | docs add central platform team control on MCP | 2025-03-21 09:33:09 -07:00
Ishaan Jaff | fda69fb190 | doc litellm MCP client | 2025-03-21 09:26:03 -07:00
Ishaan Jaff | 6feb83eb51 | Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro: [Feat] OpenAI o1-pro Responses API streaming support | 2025-03-20 21:54:43 -07:00
Krrish Dholakia | bc71407c83 | docs(release_cycle.md): clarify release cycle for stable releases on docs | 2025-03-20 15:07:14 -07:00
Ishaan Jaff | baf859f580 | test_prepare_fake_stream_request | 2025-03-20 14:50:00 -07:00
Ishaan Jaff | 375b80fc1a | Merge pull request #9422 from BerriAI/dependabot/pip/cookbook/litellm-ollama-docker-image/litellm-1.61.15: build(deps): bump litellm from 1.55.3 to 1.61.15 in /cookbook/litellm-ollama-docker-image | 2025-03-20 14:42:06 -07:00
Krrish Dholakia | f11b57bad4 | fix: fix testing | 2025-03-20 14:37:58 -07:00
dependabot[bot] | c4c3d024e0 | build(deps): bump litellm from 1.55.3 to 1.61.15 in /cookbook/litellm-ollama-docker-image | 2025-03-20 21:03:29 +00:00
Krish Dholakia | bc03378fef | Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking: fix: VoyageAI `prompt_token` always empty | 2025-03-20 14:00:51 -07:00
Krish Dholakia | 350a0df164 | Merge pull request #9290 from jollyolliel/patch-2: Update perplexity.md | 2025-03-20 13:58:45 -07:00
Ishaan Jaff | 3088204ac2 | fix code quality checks | 2025-03-20 13:57:35 -07:00
Krish Dholakia | 9efd541178 | Merge pull request #9325 from Shiti/main: docs(vertexai): fix typo in required env variables | 2025-03-20 13:56:49 -07:00
Krish Dholakia | f5f92bf6ae | Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality: fix: VertexAI outputDimensionality configuration | 2025-03-20 13:55:33 -07:00
Ishaan Jaff | 360931ef1b | supports_native_streaming | 2025-03-20 13:52:30 -07:00
Krrish Dholakia | 803f723fb9 | fix(main.py): fix OR import | 2025-03-20 13:52:28 -07:00
Krish Dholakia | dd135099cc | Merge pull request #9369 from graysonchen/feature/add_openrouter_api_base: feat: Add support for custom OPENROUTER_API_BASE via get_secret in co… | 2025-03-20 13:52:03 -07:00
Krish Dholakia | 97ade8ee75 | Merge pull request #9402 from BerriAI/litellm_openai_file_types: fix(types/utils.py): support openai 'file' message type | 2025-03-20 13:46:44 -07:00
Ishaan Jaff | 0bbaca1a95 | Merge pull request #9332 from SrilakshmiC/patch-1: docs: Include Phoenix Page into sidebar under observability | 2025-03-20 13:35:29 -07:00
Ishaan Jaff | 4be40d1ba0 | supports_native_streaming | 2025-03-20 13:34:57 -07:00
Ishaan Jaff | cfd075d709 | test_openai_o1_pro_response_api_streaming | 2025-03-20 13:04:49 -07:00
Ishaan Jaff | 6d4cf6581d | MockResponsesAPIStreamingIterator | 2025-03-20 12:30:09 -07:00
Ishaan Jaff | 435a89dd79 | transform_responses_api_request | 2025-03-20 12:28:55 -07:00
Ishaan Jaff | 830c51805c | MockResponsesAPIStreamingIterator | 2025-03-20 12:25:58 -07:00
Ishaan Jaff | 69c70ad80a | add AsyncMockResponsesAPIStreamingIterator | 2025-03-20 12:06:36 -07:00
Ishaan Jaff | 6608770e64 | add fake_stream to llm http handler | 2025-03-20 09:55:59 -07:00
Ishaan Jaff | 1567e52185 | add should_fake_stream | 2025-03-20 09:54:26 -07:00
Ishaan Jaff | 14c54e44f7 | bump to openai==1.67.0 | 2025-03-20 09:26:05 -07:00
Ishaan Jaff | da87ec0966 | test_openai_o1_pro_response_api | 2025-03-20 09:18:38 -07:00
Ishaan Jaff | d384e79026 | test_openai_o1_pro_incomplete_response | 2025-03-20 09:14:59 -07:00
Ishaan Jaff | 95f204d05c | bump to openai==1.67.0 | 2025-03-20 09:06:13 -07:00
Krrish Dholakia | 42cfc10313 | docs(response_api.md): update docs to use new o1-pro model example | 2025-03-19 23:28:28 -07:00
Krrish Dholakia | dcbe5df98b | fix(types/utils.py): support openai 'file' message type (closes https://github.com/BerriAI/litellm/issues/9365) | 2025-03-19 23:13:51 -07:00