Commit graph

1784 commits

Author SHA1 Message Date
Krrish Dholakia
a8d7ce3657 docs(enterprise.md): cleanup docs 2024-05-17 15:34:50 -07:00
Krrish Dholakia
604a867f90 docs(proxy/enterprise.md): update enterprise docs 2024-05-17 15:34:00 -07:00
Krrish Dholakia
b723e608f6 docs(enterprise.md): add swagger - custom routes + branding to docs 2024-05-17 15:31:02 -07:00
Krrish Dholakia
32a04c59cf fix(anthropic.py): bump default anthropic api version for tool use 2024-05-17 00:41:11 -07:00
Krish Dholakia
d9ce94ae23
Merge pull request #3703 from msabramo/msabramo/make-langsmith-integration-work-with-custom-langsmith
Work with custom `LANGSMITH_BASE_URL`
2024-05-16 22:23:09 -07:00
Krrish Dholakia
c5f0918682 docs(billing.md): update billing tutorial to show how to bill internal teams 2024-05-16 17:49:06 -07:00
Marc Abramowitz
bf4f08ac30 Work with custom LANGSMITH_BASE_URL
This allows working with a custom Langsmith base URL. For example,
I can use this to test against a local Langsmith instance running in
Docker on my laptop by adding this to the proxy config:

```yaml
litellm_settings:
  success_callback: ["langsmith"]

environment_variables:
  LANGSMITH_BASE_URL: "http://localhost:1984"
  LANGSMITH_PROJECT: "litellm-proxy"
```
2024-05-16 17:13:01 -07:00
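For comparison, the same override can also be expressed without the proxy. A minimal sketch, assuming the `langsmith` success callback reads the same `LANGSMITH_*` environment variables when litellm is called directly from the Python SDK (the model name below is only a placeholder):

```python
import os

import litellm

# Point the Langsmith logger at a local instance instead of the hosted service.
# These variable names mirror the proxy config shown in the commit message above.
os.environ["LANGSMITH_BASE_URL"] = "http://localhost:1984"
os.environ["LANGSMITH_PROJECT"] = "litellm-proxy"
# os.environ["LANGSMITH_API_KEY"] = "..."  # needed for hosted Langsmith

# Log successful calls to Langsmith via litellm's callback mechanism.
litellm.success_callback = ["langsmith"]

response = litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model; any configured provider works
    messages=[{"role": "user", "content": "ping"}],
)
```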
Ishaan Jaff
7d4f2aaca3 docs - alerting 2024-05-16 16:59:01 -07:00
Ishaan Jaff
53ae4654c4 docs - setting alert types 2024-05-16 16:57:54 -07:00
Ishaan Jaff
97324800ec
Merge pull request #3694 from BerriAI/litellm_allow_setting_anthropic_beta
[Feat] Support Anthropic `tools-2024-05-16` - Set Custom Anthropic Headers
2024-05-16 15:48:26 -07:00
Krrish Dholakia
b696d47442 docs(billing.md): update lago screenshot 2024-05-16 15:30:33 -07:00
Krrish Dholakia
53ddc9fdbe docs(billing.md): improve proxy billing tutorial 2024-05-16 15:27:23 -07:00
Ishaan Jaff
aa0863dd76 docs - Setting anthropic-beta Header in Requests 2024-05-16 14:55:29 -07:00
Krrish Dholakia
a7b9a03991 docs(billing.md): add tutorial on billing with litellm + lago to docs 2024-05-16 14:13:39 -07:00
Krrish Dholakia
3acb31fa49 docs(lago.md): add lago usage-based billing quick-start to docs 2024-05-16 13:24:04 -07:00
Ishaan Jaff
3351c5f11d add gpt-4o to openai vision docs 2024-05-16 12:43:40 -07:00
Ishaan Jaff
881812d5de
Merge pull request #3543 from kmheckel/main
Updated Ollama cost models to include LLaMa3 and Mistral/Mixtral Instruct series
2024-05-15 20:50:50 -07:00
Krrish Dholakia
1a3b001432 docs(langfuse_integration.md): cleanup docs 2024-05-15 07:37:04 -07:00
Krrish Dholakia
8f3bf584be docs(vertex.md): add gemini 1.5 flash to vertex docs 2024-05-14 22:26:56 -07:00
Krrish Dholakia
9eee2f3889 docs(prod.md): add 'disable load_dotenv' tutorial to docs 2024-05-14 19:13:22 -07:00
Krish Dholakia
b04a8d878a
Revert "Logfire Integration" 2024-05-14 17:38:47 -07:00
Ishaan Jaff
c503c471dc fix - cost tracking api 2024-05-14 16:15:40 -07:00
Ishaan Jaff
a714086461 docs - use discord alerting 2024-05-14 14:43:21 -07:00
Ishaan Jaff
aa1615c757
Merge pull request #3626 from BerriAI/litellm_reset_spend_per_team_api_key
feat - reset spend per team, api_key [Only Master Key]
2024-05-14 11:49:07 -07:00
Ishaan Jaff
fe1f5369ab docs - global/spend/reset 2024-05-14 11:47:43 -07:00
alisalim17
765c382b2a Merge remote-tracking branch 'upstream/main' 2024-05-14 22:32:57 +04:00
Ishaan Jaff
0c8f5e5649
Merge pull request #3266 from antonioloison/litellm_add_disk_cache
[Feature] Add cache to disk
2024-05-14 09:24:01 -07:00
Ishaan Jaff
c4dab4e735 docs - alerting test /health/services 2024-05-14 07:47:33 -07:00
alisalim17
18bf68298f Merge remote-tracking branch 'upstream/main' 2024-05-14 18:42:20 +04:00
Krish Dholakia
f282dfc157
Merge branch 'main' into abramowi/customizable-slack-report-frequency 2024-05-13 22:01:21 -07:00
Ishaan Jaff
fde1029423 docs - slack alerting 2024-05-13 21:30:16 -07:00
Krrish Dholakia
22b9933096 docs(routing.md): add default_fallbacks to routing.md docs 2024-05-13 21:28:15 -07:00
Krrish Dholakia
a1536846fe docs(openai.md): add gpt-4o example 2024-05-13 21:19:09 -07:00
Marc Abramowitz
4645451140 Add info about daily reports to alerting.md 2024-05-13 21:05:59 -07:00
Ishaan Jaff
ffcd6b6a63
Merge pull request #3625 from BerriAI/litellm_router_default_fallbacks
Default routing fallbacks
2024-05-13 20:47:54 -07:00
Krrish Dholakia
4ec3b4d9a8 docs(exception_mapping.md): cleanup docs 2024-05-13 18:17:02 -07:00
Krrish Dholakia
228ed25de5 docs(exception_mapping.md): add watsonx exception mapping to docs 2024-05-13 18:16:05 -07:00
Krrish Dholakia
48779cb341 docs(cost_tracking.md): update response object for spend tracking doc 2024-05-13 17:58:33 -07:00
Marc Abramowitz
94f7353596 Document SLACK_DAILY_REPORT_FREQUENCY
in `docs/my-website/docs/proxy/alerting.md`
2024-05-13 17:10:41 -07:00
Ishaan Jaff
ea9b4dc439
Merge pull request #3619 from BerriAI/litellm_show_spend_reports
[Feat] - `/global/spend/report`
2024-05-13 16:06:02 -07:00
Ishaan Jaff
d0a5c9b363 docs - spend per team 2024-05-13 15:48:26 -07:00
Ishaan Jaff
1be6ea0c0d
Merge pull request #3603 from alexanderepstein/langfuse_turn_off_messaging
feat(langfuse.py): Allow for individual call message/response redaction
2024-05-13 15:21:41 -07:00
Krrish Dholakia
7fa203c810 docs(input.md): add mistral to input param docs 2024-05-13 13:50:49 -07:00
Ishaan Jaff
3a838934c9 docs - cooldown deployment 2024-05-13 12:50:59 -07:00
Ishaan Jaff
dac8c644fd docs - router show cooldown_time 2024-05-13 12:49:51 -07:00
Ishaan Jaff
514c5737f8
Merge pull request #3587 from BerriAI/litellm_proxy_use_batch_completions_model_csv
[Feat] Use csv values for proxy batch completions (OpenAI Python compatible)
2024-05-13 07:55:12 -07:00
Alex Epstein
3bf2ccc856 feat(langfuse.py): Allow for individual call message/response redaction 2024-05-12 22:38:29 -04:00
Krrish Dholakia
7276c6eb1e docs(token_auth.md): add end user cost tracking to jwt auth docs 2024-05-11 21:28:31 -07:00
Ishaan Jaff
2eb4508204 fix - mark Azure Content Safety as (BETA) 2024-05-11 17:51:21 -07:00
Ishaan Jaff
25febe41c4 docs - using batch completions with python 2024-05-11 14:37:32 -07:00