Author | Commit | Message | Date
Ishaan Jaff | 36440f6b4a | Merge pull request #2521 from BerriAI/dependabot/npm_and_yarn/docs/my-website/follow-redirects-1.15.6: build(deps): bump follow-redirects from 1.15.4 to 1.15.6 in /docs/my-website | 2024-03-19 13:13:55 -07:00
Krrish Dholakia | 7c74a0e6e2 | fix(proxy_server.py): expose disable_spend_logs flag in config general settings (writing each spend log adds +300ms latency; see https://github.com/BerriAI/litellm/issues/1714#issuecomment-1924727281) | 2024-03-19 12:08:37 -07:00
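For context on the disable_spend_logs commit above: the flag sits under the proxy's general_settings section (normally written in config.yaml). A minimal sketch of that section follows, shown as a Python dict purely for illustration; the master_key value is a placeholder.

```python
# Sketch only: the LiteLLM proxy reads these keys from config.yaml; a Python dict is used
# here just to illustrate the shape. Values are placeholders.
general_settings = {
    "master_key": "sk-1234",       # placeholder proxy master key
    "disable_spend_logs": True,    # skip writing per-request spend logs (avoids the ~300ms write)
}
```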
Krrish Dholakia | c63259ebcd | docs(deploy.md): update docker start command | 2024-03-19 07:58:08 -07:00
Vincelwt | 1cbfd312fe | Merge branch 'main' into main | 2024-03-19 12:50:04 +09:00
Ishaan Jaff | 038c9d5781 | (docs) litellm + datadog | 2024-03-18 17:06:00 -07:00
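The litellm + datadog docs commit refers to the DataDog logging callback. A minimal hedged sketch, assuming the callback name "datadog" and the DD_API_KEY / DD_SITE environment variables; an OPENAI_API_KEY is also assumed for the test call.

```python
# Sketch: enable the DataDog success callback, then make one call that gets logged.
# All keys below are placeholders; OPENAI_API_KEY is assumed to be set separately.
import os
import litellm

os.environ["DD_API_KEY"] = "your-datadog-api-key"
os.environ["DD_SITE"] = "us5.datadoghq.com"

litellm.success_callback = ["datadog"]  # log successful calls to DataDog

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
print(response.choices[0].message.content)
```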
Ishaan Jaff | 4a4c322278 | (docs) easily find call hooks | 2024-03-18 07:38:29 -07:00
Krrish Dholakia | 136a58b84a | docs(secret.md): add aws secret manager to docs | 2024-03-16 18:47:56 -07:00
Ishaan Jaff | 27cd012c2d | (docs) litellm helm | 2024-03-16 12:14:35 -07:00
Ishaan Jaff | c17e721278 | (docs) update litellm helm docs | 2024-03-16 12:01:38 -07:00
Krrish Dholakia | 2c2db9ce89 | fix(proxy_server.py): bug fix on getting user obj from cache | 2024-03-16 11:07:38 -07:00
Krrish Dholakia | 2d2731c3b5 | docs(caching.md): add batch redis requests to docs | 2024-03-15 23:01:08 -07:00
ishaan-jaff | 92b198f6c5 | (docs) litellm + helm chart | 2024-03-15 17:04:51 -07:00
Ishaan Jaff | 8cfa0b64ce | Merge pull request #2541 from udit-001/docs/chatlitellm-langfuse: docs(langfuse): add chatlitellm section | 2024-03-15 16:32:01 -07:00
ishaan-jaff | 3108c91ebd | (docs) using litellm + helm charts | 2024-03-15 16:20:26 -07:00
ishaan-jaff | 4a33c53619 | (fix) docs litellm helm chart | 2024-03-15 16:07:43 -07:00
Udit | 4a232f4ab3 | docs(langfuse): update langfuse casing | 2024-03-16 03:34:47 +05:30
Udit | b8dbcd7ac3 | docs(langfuse): update section titles | 2024-03-16 03:33:14 +05:30
Udit | 31fb2d0219 | docs(langfuse): fix missing litellm import | 2024-03-16 03:30:01 +05:30
Udit | 1220eb3c7a | docs(langfuse): update chatlitellm section | 2024-03-16 03:28:19 +05:30
Udit | acd56b174c | docs(langfuse): add chatlitellm section | 2024-03-16 03:24:07 +05:30
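The chatlitellm / langfuse docs commits above (including the "fix missing litellm import" one) concern LangChain's ChatLiteLLM wrapper combined with the Langfuse callback. A minimal sketch under those assumptions; the model name is a placeholder, and LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY plus the provider key are expected in the environment.

```python
# Sketch: route a LangChain ChatLiteLLM call through litellm and log it to Langfuse.
# Assumes LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY and OPENAI_API_KEY are set.
import litellm  # the import the "fix missing litellm import" commit refers to
from langchain_community.chat_models import ChatLiteLLM
from langchain_core.messages import HumanMessage

litellm.success_callback = ["langfuse"]  # send successful calls to Langfuse

chat = ChatLiteLLM(model="gpt-3.5-turbo")
print(chat.invoke([HumanMessage(content="Hello from ChatLiteLLM")]))
```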
Krrish Dholakia | e033e84720 | docs(cohere.md): fix model name in cohere docs | 2024-03-15 10:08:07 -07:00
Krish Dholakia | 32ca306123 | Merge pull request #2535 from BerriAI/litellm_fireworks_ai_support: feat(utils.py): add native fireworks ai support | 2024-03-15 10:02:53 -07:00
Krrish Dholakia | 5edf414a5f | docs(audio_transcription.md): add openai sdk usage example to audio transcription docs | 2024-03-15 09:54:03 -07:00
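The audio transcription docs commit adds an OpenAI SDK example; a hedged sketch of that pattern against a LiteLLM proxy is shown below. The base_url, virtual key, and the "whisper" model alias are assumptions, not the exact values from the docs.

```python
# Sketch: OpenAI SDK pointed at a LiteLLM proxy for audio transcription.
# base_url, api_key and the model alias are placeholders.
from openai import OpenAI

client = OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

with open("sample.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper",   # whichever transcription deployment the proxy exposes
        file=audio_file,
    )
print(transcript.text)
```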
Krrish Dholakia | fa0c8b7be6 | docs(fireworks_ai.md): add fireworks ai to docs | 2024-03-15 09:17:15 -07:00
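The fireworks_ai docs commit pairs with the native support merged just above. A minimal sketch; the exact model string and the FIREWORKS_AI_API_KEY variable name are assumptions based on LiteLLM's usual provider-prefix convention.

```python
# Sketch: call a Fireworks AI model through the fireworks_ai/ route.
# The model string and API key are placeholders.
import os
import litellm

os.environ["FIREWORKS_AI_API_KEY"] = "your-fireworks-key"

response = litellm.completion(
    model="fireworks_ai/accounts/fireworks/models/mixtral-8x7b-instruct",
    messages=[{"role": "user", "content": "Hello from Fireworks"}],
)
print(response.choices[0].message.content)
```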
ishaan-jaff | f07a652148 | (docs) load test proxy | 2024-03-15 08:10:45 -07:00
Ishaan Jaff | 4c834714ab | Merge pull request #2529 from snekkenull/main: (feat) add groq/gemma-7b-it | 2024-03-15 07:41:44 -07:00
ishaan-jaff | 91a47dc17a | (docs) how to run a locust load test | 2024-03-15 07:37:50 -07:00
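The locust load test docs commit is about load testing the proxy; a small locustfile sketch follows. The host, virtual key, and model alias are placeholders (run it with something like `locust -f locustfile.py --host http://0.0.0.0:4000`).

```python
# locustfile.py (sketch): hammer the proxy's /chat/completions endpoint.
# Authorization key and model name are placeholders.
from locust import HttpUser, task, between

class ProxyUser(HttpUser):
    wait_time = between(0.5, 1.5)  # seconds between tasks per simulated user

    @task
    def chat_completion(self):
        self.client.post(
            "/chat/completions",
            json={
                "model": "gpt-3.5-turbo",
                "messages": [{"role": "user", "content": "ping"}],
            },
            headers={"Authorization": "Bearer sk-1234"},
        )
```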
USAGI | a9634b717c | Add groq/gemma-7b-it | 2024-03-15 11:50:19 +08:00
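For the groq/gemma-7b-it addition, a minimal call through LiteLLM; GROQ_API_KEY is assumed to be set (the placeholder below stands in for a real key).

```python
# Sketch: call the newly added Groq-hosted Gemma model.
import os
import litellm

os.environ["GROQ_API_KEY"] = "your-groq-key"  # placeholder

response = litellm.completion(
    model="groq/gemma-7b-it",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```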
Krrish Dholakia | 6f1eb038bc | docs(bedrock.md): adding docs for calling bedrock models on proxy via config.yaml | 2024-03-14 14:50:13 -07:00
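The bedrock docs commit is about wiring Bedrock models into the proxy's config.yaml; the same "bedrock/..." model string also works from the SDK, as in this hedged sketch. The model ID, region, and credential variables are assumptions.

```python
# Sketch: call a Bedrock model directly via the SDK (the proxy config uses the same
# "bedrock/..." model string). Credentials and region are placeholders.
import os
import litellm

os.environ["AWS_ACCESS_KEY_ID"] = "..."       # placeholder
os.environ["AWS_SECRET_ACCESS_KEY"] = "..."   # placeholder
os.environ["AWS_REGION_NAME"] = "us-west-2"   # placeholder

response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```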
Krrish Dholakia | cb221cc88b | docs(caching.md): add redis namespaces to docs | 2024-03-14 13:38:33 -07:00
dependabot[bot] | 913f19e046 | build(deps): bump follow-redirects in /docs/my-website (bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.4 to 1.15.6; release notes: https://github.com/follow-redirects/follow-redirects/releases; commits: https://github.com/follow-redirects/follow-redirects/compare/v1.15.4...v1.15.6; updated-dependencies: follow-redirects, indirect; Signed-off-by: dependabot[bot] <support@github.com>) | 2024-03-14 20:02:59 +00:00
Krrish Dholakia | 7876aa2d75 | fix(parallel_request_limiter.py): handle metadata being none | 2024-03-14 10:02:41 -07:00
ishaan-jaff | 0269cddd55 | (feat) add claude-3-haiku | 2024-03-13 20:24:06 -07:00
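For the claude-3-haiku addition, a minimal call; the dated model string and the ANTHROPIC_API_KEY variable are assumptions based on LiteLLM's usual Anthropic naming.

```python
# Sketch: call the newly added Claude 3 Haiku model.
import os
import litellm

os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"  # placeholder

response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```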
Aaron Bach | daca0a93ce | Update docs | 2024-03-13 17:37:59 -06:00
Krrish Dholakia | 16e3aaced5 | docs(enterprise.md): add prompt injection detection to docs | 2024-03-13 12:37:32 -07:00
Krrish Dholakia | dbc7552d15 | docs: refactor team based logging in docs | 2024-03-13 12:26:39 -07:00
Krrish Dholakia | b3493269b3 | fix(proxy_server.py): support checking openai user param | 2024-03-13 12:00:27 -07:00
Krrish Dholakia | 9e692d6cce | docs(sidebar.js): add dall e 3 cost tracking to docs | 2024-03-12 22:26:10 -07:00
Krrish Dholakia | 488c4b9939 | docs(cost_tracking.md): add docs for cost tracking dall e 3 calls | 2024-03-12 21:35:17 -07:00
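The DALL-E 3 cost tracking docs commits suggest a pattern along the lines of the sketch below; whether completion_cost accepts an image-generation response exactly this way is an assumption, so treat it as illustrative only.

```python
# Sketch (assumption-heavy): generate an image, then estimate its cost with
# litellm.completion_cost; the call_type value is assumed here, not confirmed.
import litellm

response = litellm.image_generation(model="dall-e-3", prompt="a lighthouse at dusk")
cost = litellm.completion_cost(completion_response=response, call_type="image_generation")
print(f"estimated cost: ${cost:.4f}")
```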
Ishaan Jaff | 5172fb1de9 | Merge pull request #2474 from BerriAI/litellm_support_command_r: [New-Model] Cohere/command-r | 2024-03-12 11:11:56 -07:00
ishaan-jaff | 8fabaed543 | (docs) cohere-command-r | 2024-03-12 10:34:08 -07:00
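For the Cohere command-r commits, a minimal call; the bare "command-r" model name and the COHERE_API_KEY variable are assumptions.

```python
# Sketch: call the newly supported Cohere Command R model.
import os
import litellm

os.environ["COHERE_API_KEY"] = "your-cohere-key"  # placeholder

response = litellm.completion(
    model="command-r",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```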
ishaan-jaff | ea83c8c9b0 | (docs) using azure_text models | 2024-03-12 09:54:34 -07:00
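The azure_text docs commit covers Azure text-completion deployments addressed with the azure_text/ prefix; a hedged sketch follows, with the deployment name and endpoint values as placeholders.

```python
# Sketch: call an Azure text-completion (instruct) deployment via the azure_text/ route.
# All values below are placeholders.
import os
import litellm

os.environ["AZURE_API_KEY"] = "your-azure-key"
os.environ["AZURE_API_BASE"] = "https://my-endpoint.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"

response = litellm.completion(
    model="azure_text/my-gpt-35-turbo-instruct-deployment",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```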
Krrish Dholakia | 10f5f342bd | docs(virtual_keys.md): cleanup doc | 2024-03-12 07:05:55 -07:00
Krrish Dholakia | 47424b8c90 | docs(routing.md): fix routing example on docs | 2024-03-11 22:17:04 -07:00
ishaan-jaff | e46980c56c | (docs) using litellm router | 2024-03-11 21:18:10 -07:00
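The router docs commits describe LiteLLM's Router, which load-balances across a model_list of deployments. A minimal sketch; the single deployment entry and its key are placeholders.

```python
# Sketch: a Router with one deployment; callers address it by the model_name alias.
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",      # alias used by callers
            "litellm_params": {
                "model": "gpt-3.5-turbo",       # underlying provider model
                "api_key": "sk-...",            # placeholder
            },
        },
    ]
)

response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```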
Vince Loewe | 7c38f992dc | Merge branch 'main' into main | 2024-03-11 12:36:41 +09:00
Ishaan Jaff | a1784284bb | Merge pull request #2416 from BerriAI/litellm_use_consistent_port: (docs) LiteLLM Proxy - use port 4000 in examples | 2024-03-09 16:32:08 -08:00
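The port-4000 docs change standardizes proxy examples on http://0.0.0.0:4000; a sketch of the client side, with the virtual key and model alias as placeholders.

```python
# Sketch: OpenAI SDK pointed at a locally running LiteLLM proxy on port 4000.
from openai import OpenAI

client = OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```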
ishaan-jaff | a004240109 | (docs) deploy litellm | 2024-03-09 13:45:07 -08:00
ishaan-jaff | ba271e3e87 | (docs) deploying litellm | 2024-03-09 13:41:20 -08:00
ishaan-jaff | 7ae7e95da1 | (docs) litellm getting started clarify sdk vs proxy | 2024-03-09 13:04:52 -08:00