Commit graph

50 commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Peter Muller | da659eb9bc | Revert imports changes, update tests to match | 2024-07-02 19:09:22 -07:00 |
| Peter Muller | a1853cbc50 | Add tests for SageMaker region selection | 2024-07-02 15:30:39 -07:00 |
| Peter Muller | eb013f4261 | Allow calling SageMaker endpoints from different regions | 2024-07-01 16:00:42 -07:00 |
| Krrish Dholakia | 5f93cae3ff | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| Krrish Dholakia | b10f03706d | fix(utils.py): fix streaming to not return usage dict (Fixes https://github.com/BerriAI/litellm/issues/3237) | 2024-04-24 08:06:07 -07:00 |
| yishiyiyuan | 8619d29741 | 🐞 fix: djl vllm support (support vllm response format on sagemaker, which only return one choice) | 2024-04-03 11:00:51 +08:00 |
| Krrish Dholakia | 271fe184eb | fix(sagemaker.py): support model_id consistently. support dynamic args for async calls | 2024-03-29 09:05:00 -07:00 |
| Krrish Dholakia | 62ac3e1de4 | fix(sagemaker.py): support 'model_id' param for sagemaker (allow passing inference component param to sagemaker in the same format as we handle this for bedrock) | 2024-03-29 08:43:17 -07:00 |
| Krrish Dholakia | abb88e50eb | fix(sagemaker.py): fix async sagemaker calls (https://github.com/BerriAI/litellm/issues/2086) | 2024-02-20 17:20:01 -08:00 |
| Krrish Dholakia | f52b3c5f84 | feat(llama_guard.py): allow user to define custom unsafe content categories | 2024-02-17 17:42:47 -08:00 |
| Krish Dholakia | eda9fa300e | Merge branch 'main' into litellm_aioboto3_sagemaker | 2024-02-14 21:46:58 -08:00 |
| Krrish Dholakia | 0f1fdef7d5 | fix(sagemaker.py): fix token iterator default flag | 2024-02-13 21:41:09 -08:00 |
| Krrish Dholakia | 36b372ad3d | docs(pii_masking.md): fix presidio tutorial | 2024-02-13 07:42:27 -08:00 |
| Krrish Dholakia | ed21bc28dc | fix(sagemaker.py): use `__anext__` | 2024-02-12 22:13:35 -08:00 |
| Krrish Dholakia | 5de569fcb1 | feat(sagemaker.py): aioboto3 streaming support | 2024-02-12 21:18:34 -08:00 |
| Krrish Dholakia | 23c410a548 | feat(sagemaker.py): initial commit of working sagemaker with aioboto3 | 2024-02-12 17:25:57 -08:00 |
| Krrish Dholakia | 402235dc5d | fix(utils.py): fix sagemaker async logging for sync streaming (https://github.com/BerriAI/litellm/issues/1592) | 2024-01-25 12:49:45 -08:00 |
| ishaan-jaff | c0a21575ef | v0 add TokenIterator, stream support | 2024-01-22 21:49:44 -08:00 |
| ishaan-jaff | 66c8eb582c | (feat) sagemaker - map status code and message | 2024-01-15 21:43:16 -08:00 |
| Krrish Dholakia | 35fd28073e | fix(sagemaker.py): fix the post-call logging logic | 2024-01-06 21:52:58 +05:30 |
| Krrish Dholakia | 8188475c16 | feat(admin_ui.py): support creating keys on admin ui | 2023-12-28 16:59:11 +05:30 |
| Krrish Dholakia | 79978c44ba | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| Krrish Dholakia | 3c399c425c | fix(sagemaker.py): filter out templated prompt if in model response | 2023-12-13 07:43:33 -08:00 |
| Krrish Dholakia | 6d1a5089e3 | refactor: fix linting errors | 2023-12-06 11:46:15 -08:00 |
| Krrish Dholakia | 45e9c3eb31 | feat(sagemaker.py): support huggingface embedding models | 2023-12-06 11:41:38 -08:00 |
| Krrish Dholakia | 2e5dc00968 | fix(sagemaker.py): prompt templating fixes | 2023-12-05 17:47:44 -08:00 |
| Krrish Dholakia | c01b15af17 | docs(input.md): add hf_model_name to docs | 2023-12-05 16:56:18 -08:00 |
| Krrish Dholakia | fc07598b21 | fix(sagemaker.py): bring back llama2 templating for sagemaker | 2023-12-05 16:42:19 -08:00 |
| Krrish Dholakia | f9b74e54a3 | fix(sagemaker.py): enable passing hf model name for prompt template | 2023-12-05 16:31:59 -08:00 |
| Krrish Dholakia | 20dab6f636 | fix(sagemaker.py): fix meta llama model name for sagemaker custom deployment | 2023-12-05 16:23:03 -08:00 |
| Krrish Dholakia | 976e1f9f66 | fix(sagemaker.py): accept all amazon neuron llama2 models | 2023-12-05 16:19:28 -08:00 |
| Krrish Dholakia | c4a3e1e564 | fix(sagemaker.py): add support for amazon neuron llama models | 2023-12-05 16:18:20 -08:00 |
| Krrish Dholakia | 7e42c64cc5 | fix(utils.py): support sagemaker llama2 custom endpoints | 2023-12-05 16:05:15 -08:00 |
| ishaan-jaff | f61a74f886 | (linting) fix | 2023-11-27 10:27:51 -08:00 |
| ishaan-jaff | 4bd13bc006 | (feat) completion:sagemaker - support chat models | 2023-11-27 10:11:10 -08:00 |
| ishaan-jaff | bb4116f2ab | (feat) completion:sagemaker - better debugging | 2023-11-27 09:08:20 -08:00 |
| ishaan-jaff | 7bc28f3b1c | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| Krrish Dholakia | 4b74ddcb17 | refactor: fixing linting issues | 2023-11-11 18:52:28 -08:00 |
| Krrish Dholakia | 5efa3860da | refactor(huggingface,-anthropic,-replicate,-sagemaker): making huggingface, anthropic, replicate, sagemaker compatible openai v1 sdk | 2023-11-11 17:38:15 -08:00 |
| ishaan-jaff | 194b6263c7 | (feat) add model_response.usage.completion_tokens for bedrock, palm, petals, sagemaker | 2023-10-27 09:51:50 -07:00 |
| Krrish Dholakia | cc0e4f4f9f | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00 |
| ishaan-jaff | a294438802 | (feat) sagemaker auth in completion | 2023-10-07 15:27:58 -07:00 |
| Krrish Dholakia | 69cdf5347a | style(test_completion.py): fix merge conflict | 2023-10-05 22:09:38 -07:00 |
| Krrish Dholakia | 8c48af11c2 | fixes to get optional params | 2023-10-02 14:44:11 -07:00 |
| ishaan-jaff | fb1c0545be | bump version with bedrock | 2023-09-14 14:54:36 -07:00 |
| ishaan-jaff | 41e064880f | fix sagemaker test | 2023-09-14 14:49:46 -07:00 |
| ishaan-jaff | f156733ed3 | allow users to set AWS_REGION_NAME | 2023-09-04 11:57:22 -07:00 |
| ishaan-jaff | 44f44ad5a3 | add optional params for llama-2 | 2023-09-04 11:41:20 -07:00 |
| ishaan-jaff | 746001e32a | working sagemaker support | 2023-09-04 11:30:34 -07:00 |
| ishaan-jaff | 022c632ce4 | v0 add sagemaker | 2023-09-04 11:02:20 -07:00 |