Commit graph

5814 commits

Author SHA1 Message Date
ishaan-jaff
340706565f (fix) add team_id to doc string 2024-01-18 15:23:05 -08:00
ishaan-jaff
cdede8836f (docs) virtual keys 2024-01-18 15:16:40 -08:00
ishaan-jaff
0e3e8050d7 (docs) /key/info 2024-01-18 14:54:11 -08:00
ishaan-jaff
d5e720e161 (docs) /key/update 2024-01-18 14:45:49 -08:00
ishaan-jaff
2b6972111e (feat) write team_id to User Table 2024-01-18 14:42:46 -08:00
ishaan-jaff
5beef6dbcd (test) setting team_id 2024-01-18 14:33:13 -08:00
ishaan-jaff
f405a827e3 (docs) virtual_keys 2024-01-18 13:55:32 -08:00
ishaan-jaff
90509a159a (fix) write team_id to key table 2024-01-18 13:54:08 -08:00
ishaan-jaff
42ad12b2bd (fix) support team_id for /key/generate 2024-01-18 13:48:52 -08:00
ishaan-jaff
ea32a8757b (feat) set team_id on virtual_keys 2024-01-18 13:34:51 -08:00
ishaan-jaff
1c987a436e (docs) virtual_keys 2024-01-18 13:34:33 -08:00
Krrish Dholakia
2e06e00413 bump: version 1.18.1 → 1.18.2 2024-01-18 12:42:33 -08:00
Krrish Dholakia
1ea3833ef7 fix(parallel_request_limiter.py): decrement count for failed llm calls
https://github.com/BerriAI/litellm/issues/1477
2024-01-18 12:42:14 -08:00
Krrish Dholakia
37e6c6a59f docs(sidebars.js): simplify docs nav 2024-01-18 11:38:02 -08:00
Krrish Dholakia
354a0b2497 docs(sidebar.js): add link to all endpoints in sidebar 2024-01-18 11:21:55 -08:00
Krrish Dholakia
c8dd36db9e fix(proxy_server.py): show all models user has access to in /models 2024-01-18 10:56:37 -08:00
Krish Dholakia
658fd4de38
Merge pull request #1495 from puffo/litellm_ollama_chat_fix
fix(ollama_chat.py): use tiktoken as backup for prompt token counting
2024-01-18 10:02:27 -08:00
Ishaan Jaff
143e225194
Merge pull request #1496 from BerriAI/litellm_unit_test_key_endpoints
[Test+Fix] /Key/Info, /Key/Update - Litellm unit test key endpoints
2024-01-18 09:55:30 -08:00
ishaan-jaff
08ee65f894 (test) /key/update, /key/info 2024-01-18 09:35:02 -08:00
ishaan-jaff
fc1eb36f24 (fix) /key/update overwriting metadata 2024-01-18 09:32:56 -08:00
Krrish Dholakia
96122a4f88 fix(proxy/utils.py): fix isoformat to string logic 2024-01-18 09:32:30 -08:00
Ishaan Jaff
7db04afaca
Merge pull request #1494 from duarteocarmo/patch-1
Update s3 cache to support folder
2024-01-18 09:16:19 -08:00
puffo
becff369dc fix(ollama_chat.py): use tiktoken as backup for prompt token counting 2024-01-18 10:47:24 -06:00
Duarte OC
5d0654e6f6 docs 2024-01-18 17:32:42 +01:00
Duarte OC
dbadd64395 revert comment 2024-01-18 17:26:38 +01:00
Krrish Dholakia
76af479dea bump: version 1.18.0 → 1.18.1 2024-01-18 07:49:20 -08:00
Krrish Dholakia
71034099c9 fix(proxy/utils.py): prisma client fix get data to handle list return 2024-01-18 07:49:13 -08:00
ishaan-jaff
e77782b4d3 (feat) Dockerfile bump langfuse 2024-01-18 07:20:42 -08:00
ishaan-jaff
6d99a58c85 (chore) update poetry.lock 2024-01-18 07:19:47 -08:00
Ishaan Jaff
99a05b5599
Merge pull request #1489 from ShaunMaher/litellm_alpine_build_pyyaml_cython_issue
Altered requirements.txt to require pyyaml 6.0.1 which will resolve #1488
2024-01-18 06:58:24 -08:00
Duarte OC
578256a6a2
Update s3 cache to support folder 2024-01-18 11:38:05 +01:00
Shaun Maher
f4ee8e430f Altered requirements.txt to require pyyaml 6.0.1 which will resolve the alpine docker build issue. 2024-01-18 18:52:43 +11:00
ishaan-jaff
85b5395692 (test) use os.environ/ for azure vision enhance 2024-01-17 21:26:47 -08:00
ishaan-jaff
79c412cab5 (feat) set Azure vision enhancement params using os.environ 2024-01-17 21:23:40 -08:00
Krrish Dholakia
a6cd068c49 docs(virtual_keys.md): add model access groups to docs 2024-01-17 20:38:41 -08:00
ishaan-jaff
0414e40d4a (docs) also test gpt-4 vision enhancements 2024-01-17 18:46:41 -08:00
ishaan-jaff
debef7544d (feat) return Azure enhancements used 2024-01-17 18:46:41 -08:00
Krrish Dholakia
f4c5c56638 bump: version 1.17.18 → 1.18.0 2024-01-17 18:35:07 -08:00
Krish Dholakia
e9ac001005
Merge pull request #1483 from BerriAI/litellm_model_access_groups_feature
feat(proxy_server.py): support model access groups
2024-01-17 18:16:53 -08:00
Ishaan Jaff
15ae9182db
Merge pull request #1484 from BerriAI/litellm_access_key_metadata_in_callbacks
[Feat] Proxy - Access Key metadata in callbacks
2024-01-17 18:08:08 -08:00
ishaan-jaff
f3a45ea044 (fix) cleanup 2024-01-17 17:54:18 -08:00
ishaan-jaff
8df3a86178 (feat) proxy - set endpoint called in callback 2024-01-17 17:44:28 -08:00
Krrish Dholakia
73daee7e07 fix(proxy_cli.py): ensure proxy always retries if db push fails to connect to db 2024-01-17 17:37:59 -08:00
ishaan-jaff
5c1ae3d412 (feat) langfuse send metadata as tags 2024-01-17 17:29:46 -08:00
Krrish Dholakia
cff9f7fee6 fix(proxy_server.py): handle empty insert_data response 2024-01-17 17:28:23 -08:00
Krrish Dholakia
08b409bae8 fix(utils.py): fix if check 2024-01-17 17:17:58 -08:00
ishaan-jaff
46f84bec69 (test) api_key metadata available in callback 2024-01-17 16:48:02 -08:00
ishaan-jaff
00dfb5918c (feat) proxy - log key metadata in callback 2024-01-17 16:42:49 -08:00
Krrish Dholakia
6dc39e4ed1 bump: version 1.17.17 → 1.17.18 2024-01-17 15:57:04 -08:00
Krrish Dholakia
7ed4d9b4d1 fix(utils.py): allow dynamically setting boto3 init and switching between bedrock and openai 2024-01-17 15:56:30 -08:00