Author | Hash | Message | Date
Krrish Dholakia | 21eea28723 | docs(vertex.md): add mistral api to docs | 2024-07-27 22:44:15 -07:00
Ishaan Jaff | b2f745f0e2 | Merge pull request #4926 from BerriAI/litellm_check_max_request_size (Proxy Enterprise - security - check max request size) | 2024-07-27 17:02:12 -07:00
Krish Dholakia | f9eca63433 | Merge pull request #4919 from yujonglee/fix-canary-dev (Fix Canary error with `docusaurus start`) | 2024-07-27 16:42:49 -07:00
Krrish Dholakia | a7785c624b | docs(user_keys.md): improve openai migration docs | 2024-07-27 16:29:06 -07:00
Ishaan Jaff | 15d488c25c | docs set max_request_size | 2024-07-27 16:25:57 -07:00
Krrish Dholakia | 1e621f716f | docs(debugging.md): cleanup docs | 2024-07-27 09:28:53 -07:00
yujonglee | 48a1a2ba26 | fix | 2024-07-27 15:05:17 +09:00
Krish Dholakia | f9c2fec1a6 | Merge pull request #4918 from BerriAI/litellm_ollama_tool_calling (feat(ollama_chat.py): support ollama tool calling) | 2024-07-26 22:16:58 -07:00
Krrish Dholakia | 77fe8f57cf | docs(ollama.md): add ollama tool calling to docs | 2024-07-26 22:12:52 -07:00
Ishaan Jaff | f03769e2a4 | docs fix link https://models.litellm.ai/ | 2024-07-26 21:35:54 -07:00
Ishaan Jaff | f8b9c7128e | docs batches | 2024-07-26 18:51:13 -07:00
Ishaan Jaff | 90648bee60 | docs batches API | 2024-07-26 18:50:44 -07:00
Ishaan Jaff | dd37d1d032 | use correct link on http://localhost:4000 | 2024-07-26 18:42:45 -07:00
Ishaan Jaff | f4048bc890 | docs batches api | 2024-07-26 18:41:53 -07:00
Krrish Dholakia | b515d4f441 | docs(stream.md): add streaming token usage info to docs (Closes https://github.com/BerriAI/litellm/issues/4904) | 2024-07-26 10:51:17 -07:00
Krrish Dholakia | 7ca29d987d | docs(docusaurus.config.js): add llm model cost map to docs | 2024-07-26 10:07:47 -07:00
Krrish Dholakia | 84482703b8 | docs(config.md): update wildcard docs | 2024-07-26 08:59:53 -07:00
Krish Dholakia | 35737d04d3 | Merge pull request #4893 from yujonglee/canary ([Docs] Better search experience with Canary) | 2024-07-26 08:34:54 -07:00
yujonglee | 10ffb5a960 | remove ui shift on reload | 2024-07-26 22:13:04 +09:00
yujonglee | 8a45abb563 | fix import and add fallback | 2024-07-26 22:00:48 +09:00
yujonglee | c54f23f936 | wrap existing search bar | 2024-07-26 21:46:36 +09:00
yujonglee | 3967007595 | update to latest | 2024-07-26 21:06:53 +09:00
Krrish Dholakia | afcad9e12c | docs(custom_llm_server.md): cleanup docs | 2024-07-25 22:45:03 -07:00
Ishaan Jaff | 079a41fbe1 | Merge branch 'main' into litellm_proxy_support_all_providers | 2024-07-25 20:15:37 -07:00
yujonglee | b6bcb7eb3c | update lock | 2024-07-26 12:10:05 +09:00
Ishaan Jaff | 9247fc3c64 | deploy link to using litellm | 2024-07-25 20:09:49 -07:00
yujonglee | a540c23730 | improvements | 2024-07-26 12:08:46 +09:00
Ishaan Jaff | c2e309baf3 | docs using litellm proxy | 2024-07-25 20:05:28 -07:00
Ishaan Jaff | 646b2d50f9 | docs - quick start | 2024-07-25 19:52:53 -07:00
Ishaan Jaff | bb6f72b315 | add mistral sdk usage | 2024-07-25 19:47:54 -07:00
Ishaan Jaff | af1cd9e06f | docs on pass through support | 2024-07-25 19:17:20 -07:00
Krish Dholakia | a306b83b2d | Merge pull request #4887 from BerriAI/litellm_custom_llm (feat(custom_llm.py): Support Custom LLM Handlers) | 2024-07-25 19:05:29 -07:00
Ishaan Jaff | ff0f21a1f3 | docs - anthropic | 2024-07-25 19:02:22 -07:00
Ishaan Jaff | 26a9f694e1 | Merge pull request #4890 from BerriAI/docs_set_routing_strategies (docs - add info about routing strategy on load balancing docs) | 2024-07-25 18:55:51 -07:00
yujonglee | 78de9424a7 | customize | 2024-07-26 10:41:29 +09:00
yujonglee | 823d20101d | eject default UI (`npm run swizzle @getcanary/docusaurus-pagefind SearchBar -- --eject --danger`) | 2024-07-26 10:41:29 +09:00
yujonglee | a9a946a660 | install canary (default UI) | 2024-07-26 10:41:29 +09:00
Krrish Dholakia | bd7af04a72 | feat(proxy_server.py): support custom llm handler on proxy | 2024-07-25 17:56:34 -07:00
Krrish Dholakia | a2d07cfe64 | docs(custom_llm_server.md): add calling custom llm server to docs | 2024-07-25 17:41:19 -07:00
Ishaan Jaff | 3814170ae1 | docs - add info about routing strategy on load balancing docs | 2024-07-25 17:41:16 -07:00
Ishaan Jaff | 3573b47098 | docs add example on using text to speech models | 2024-07-25 17:29:28 -07:00
Krrish Dholakia | 397451570e | docs(enterprise.md): cleanup docs | 2024-07-25 10:09:02 -07:00
Krrish Dholakia | d91b01a24b | docs(enterprise.md): cleanup docs | 2024-07-25 10:08:40 -07:00
Krrish Dholakia | 80800b9ec8 | docs(caching.md): update caching docs to include ttl info | 2024-07-25 10:01:47 -07:00
Ishaan Jaff | a92a2ca382 | docs add mistral api large 2 | 2024-07-24 21:35:34 -07:00
Ishaan Jaff | c08d4ca9ec | docs groq models | 2024-07-24 20:49:28 -07:00
wslee | dd10da4d46 | add support for friendli dedicated endpoint | 2024-07-25 11:14:35 +09:00
Ishaan Jaff | fe0b0ddaaa | doc example using litellm proxy with groq | 2024-07-24 14:33:49 -07:00
Ishaan Jaff | e378ab8bc9 | docs - logging langsmith tags | 2024-07-24 07:12:36 -07:00
Krrish Dholakia | d5d2ffffdf | bump: version 1.41.28 → 1.42.0 | 2024-07-23 21:54:06 -07:00