faa0d38087 | Ishaan Jaff | 2024-04-05 15:13:47 -07:00 | Merge pull request #2868 from BerriAI/litellm_add_command_r_on_proxy - Add Azure Command-r-plus on litellm proxy
2174b240d8 | Ishaan Jaff | 2024-04-05 15:13:35 -07:00 | Merge pull request #2861 from BerriAI/litellm_add_azure_command_r_plust - [FEAT] add azure command-r-plus
26f5823c4e | Ishaan Jaff | 2024-04-05 14:55:28 -07:00 | docs - use azure command r
8df13306ea | Ishaan Jaff | 2024-04-05 14:22:29 -07:00 | docs call command-r through proxy
834f363b6a | Krrish Dholakia | 2024-04-05 13:55:05 -07:00 | docs(vertex.md): add safety settings tutorial to docs
22ac95b834 | Ishaan Jaff | 2024-04-05 13:50:56 -07:00 | docs azure_ai command r
6b9c04618e | Ishaan Jaff | 2024-04-05 10:07:43 -07:00 | fix use azure_ai/mistral
ab60d7c8fb | Ishaan Jaff | 2024-04-05 09:24:27 -07:00 | docs azure ai command-r plust
cfe358abaa | Ishaan Jaff | 2024-04-05 09:18:11 -07:00 | simplify calling azure/commmand-r-plus
b25db0443a | Ishaan Jaff | 2024-04-05 09:04:38 -07:00 | docs - using command r on azure
4ce8227e70 | Krish Dholakia | 2024-04-05 07:03:38 -07:00 | Merge pull request #2841 from Manouchehri/nuke-gemini-1.5-pro-vision - Fix: Remove non-existent gemini-1.5-pro-vision model.
b0d80de14d | Krrish Dholakia | 2024-04-04 21:32:44 -07:00 | docs(vertex.md): fix import routes
003cd3b102 | Krrish Dholakia | 2024-04-04 21:28:28 -07:00 | docs(vertex.md): add tutorial for using vertex ai with gcp service account
d313f5bd61 | Ishaan Jaff | 2024-04-04 21:06:42 -07:00 | Merge pull request #2847 from themrzmaster/feat/add_command_r_plus - Add command-r-plus
12e5118367 | Ishaan Jaff | 2024-04-04 14:07:50 -07:00 | Merge pull request #2846 from BerriAI/litellm_docs_delete_cache_keys - docs - `delete` cache keys
4dbb46cf42 | Krrish Dholakia | 2024-04-04 13:49:03 -07:00 | docs(vertex.md): add docs on setting google_application_credentials
be265fbb15 | lucca | 2024-04-04 16:58:51 -03:00 | initial
9e9b617934 | Ishaan Jaff | 2024-04-04 12:20:14 -07:00 | docs - delete cache keys
6044045b91 | David Manouchehri | 2024-04-04 17:33:08 +00:00 | Fix: Remove non-existent gemini-1.5-pro-vision model. (The gemini-1.5-pro model handles both text and vision.)
cbe4aa386b | Krrish Dholakia | 2024-04-03 13:23:30 -07:00 | docs(token_auth.md): update links
06b7d2608e | Krrish Dholakia | 2024-04-03 13:21:25 -07:00 | docs(token_auth.md): update docs
91269257f2 | Ishaan Jaff | 2024-04-01 19:53:34 -07:00 | (docs) openai wildcard models
c52819d47c | Krrish Dholakia | 2024-04-01 18:52:00 -07:00 | fix(proxy_server.py): don't require scope for team-based jwt access (if a team with the client_id exists, the request is allowed; otherwise an error is returned)
cdae08f3c3 | Krrish Dholakia | 2024-04-01 12:09:22 -07:00 | docs(openai.md): fix docs to include example of calling openai on proxy
a917fadf45 | Krrish Dholakia | 2024-04-01 11:21:27 -07:00 | docs(routing.md): refactor docs to show how to use pre-call checks and fallback across model groups
18fec3ad8e | Ishaan Jaff | 2024-04-01 07:10:45 -07:00 | Merge pull request #2779 from DaxServer/update-proxy-dockerfile-branch - fix(docs): Correct Docker pull command in deploy.md
28f6caa04c | DaxServer | 2024-03-31 20:10:00 +02:00 | fix(docs): Correct Docker pull command in deploy.md (removes a duplicated 'docker pull' in the command)
61b6f8be44 | DaxServer | 2024-03-31 19:35:37 +02:00 | docs: Update references to Ollama repository url (from https://github.com/jmorganca/ollama to https://github.com/ollama/ollama)
a7aa6fae64 | Krrish Dholakia | 2024-03-30 20:08:27 -07:00 | docs(deploy.md): fix docs for litlelm-database docker run example
2ca303ec0e | Krish Dholakia | 2024-03-30 11:27:02 -07:00 | Merge pull request #2748 from BerriAI/litellm_anthropic_tool_calling_list_parsing_fix - fix(factory.py): parse list in xml tool calling response (anthropic)
4826018756 | Krrish Dholakia | 2024-03-29 21:54:07 -07:00 | docs(users.md): fix doc for end-user param
1b84dfac91 | Vincelwt | 2024-03-30 13:21:53 +09:00 | Merge branch 'main' into main
e8ead49d29 | Ishaan Jaff | 2024-03-29 16:12:29 -07:00 | Merge pull request #2628 from BerriAI/dependabot/npm_and_yarn/docs/my-website/webpack-dev-middleware-5.3.4 - build(deps): bump webpack-dev-middleware from 5.3.3 to 5.3.4 in /docs/my-website
2974e0da31 | Ishaan Jaff | 2024-03-29 16:12:17 -07:00 | Merge pull request #2689 from BerriAI/dependabot/npm_and_yarn/docs/my-website/express-4.19.2 - build(deps): bump express from 4.18.2 to 4.19.2 in /docs/my-website
a78ed81cd9 | Ishaan Jaff | 2024-03-29 14:38:37 -07:00 | (docs) grafana metrics
24570bc075 | Ishaan Jaff | 2024-03-29 14:25:45 -07:00 | (docs) grafana / prometheus
c2283235a1 | Ishaan Jaff | 2024-03-29 13:36:24 -07:00 | (docs) /metrics endpoint
ffa29ddfef | Ishaan Jaff | 2024-03-29 13:10:26 -07:00 | (docs) cleanup
81b4f47140 | Krrish Dholakia | 2024-03-29 11:58:49 -07:00 | docs: show how tool calling parsing works + how to get raw model response
d547944556 | Krrish Dholakia | 2024-03-29 08:43:17 -07:00 | fix(sagemaker.py): support 'model_id' param for sagemaker (allow passing the inference component param to sagemaker in the same format as for bedrock)
cdb940d504 | Krrish Dholakia | 2024-03-28 23:42:43 -07:00 | docs(prod.md): update prod docs with batch writing info
85a5291142 | Krrish Dholakia | 2024-03-28 19:04:24 -07:00 | docs(prod.md): doc improvements
da7a00d6d2 | Krrish Dholakia | 2024-03-28 18:51:53 -07:00 | docs(prod.md): fix docker run commands
7c44b32cc2 | Krrish Dholakia | 2024-03-28 18:44:35 -07:00 | refactor(proxy/utils.py): add more debug logs
eb318afe52 | Krrish Dholakia | 2024-03-28 18:34:09 -07:00 | docs(prod.md): cleanup doc
ced902f822 | Krrish Dholakia | 2024-03-28 15:35:07 -07:00 | docs(prod.md): improve docs
eb3806feba | Krrish Dholakia | 2024-03-28 15:26:26 -07:00 | docs(prod.md): update docs with litellm spend logs server machine spec
c15df27c1e | Krrish Dholakia | 2024-03-28 15:15:10 -07:00 | docs(prod.md): add litellm spend logs server to docs
934a9ac2b4 | Krish Dholakia | 2024-03-28 14:56:14 -07:00 | Merge pull request #2722 from BerriAI/litellm_db_perf_improvement - feat(proxy/utils.py): enable updating db in a separate server
a09818e72e | Krrish Dholakia | 2024-03-28 13:39:52 -07:00 | build(ghcr_deploy.yml): deploy spend logs server docker image (make it easy for users to deploy a separate spend logs server)