Commit graph

98 commits

Author SHA1 Message Date
Krish Dholakia
d57be47b0f
Litellm ruff linting enforcement (#5992)
* ci(config.yml): add a 'check_code_quality' step

Addresses https://github.com/BerriAI/litellm/issues/5991

* ci(config.yml): check why circle ci doesn't pick up this test

* ci(config.yml): fix to run 'check_code_quality' tests

* fix(__init__.py): fix unprotected import

* fix(__init__.py): don't remove unused imports

* build(ruff.toml): update ruff.toml to ignore unused imports

* fix: ruff + pyright - fix linting + type-checking errors

* fix: fix linting errors

* fix(lago.py): fix module init error

* fix: fix linting errors

* ci(config.yml): cd into correct dir for checks

* fix(proxy_server.py): fix linting error

* fix(utils.py): fix bare except

causes ruff linting errors

* fix: ruff - fix remaining linting errors

* fix(clickhouse.py): use standard logging object

* fix(__init__.py): fix unprotected import

* fix: ruff - fix linting errors

* fix: fix linting errors

* ci(config.yml): cleanup code qa step (formatting handled in local_testing)

* fix(_health_endpoints.py): fix ruff linting errors

* ci(config.yml): just use ruff in check_code_quality pipeline for now

* build(custom_guardrail.py): include missing file

* style(embedding_handler.py): fix ruff check
2024-10-01 19:44:20 -04:00
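
Two of the recurring fixes in the commit messages above, the bare except and the unprotected import, follow a common pattern. The snippet below is an illustrative sketch of that pattern only; the names and values are placeholders, not the actual litellm code.

```python
# Illustrative sketch of two lint-fix patterns named in the commits above;
# placeholders only, not the actual litellm changes.

# "fix bare except" (ruff E722): catch concrete exceptions instead of everything.
try:
    value = int("42")
except (TypeError, ValueError):
    value = 0

# "fix unprotected import": guard an optional dependency so a missing package
# fails gracefully instead of breaking module import.
try:
    import orjson  # optional dependency; this specific package is hypothetical here
except ImportError:
    orjson = None
```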
Ishaan Jaff
85acdb9193
[Feat] Add max_completion_tokens param (#5691)
* add max_completion_tokens

* add max_completion_tokens

* add max_completion_tokens support for OpenAI models

* add max_completion_tokens param

* add max_completion_tokens for bedrock converse models

* add test for converse maxTokens

* fix openai o1 param mapping test

* move test optional params

* add max_completion_tokens for anthropic api

* fix conftest

* add max_completion tokens for vertex ai partner models

* add max_completion_tokens for fireworks ai

* add max_completion_tokens for hf rest api

* add test for param mapping

* add param mapping for vertex, gemini + testing

* predibase is the most unstable and unusable llm api in prod, can't handle our ci/cd

* add max_completion_tokens to openai supported params

* fix fireworks ai param mapping
2024-09-14 14:57:01 -07:00
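
As a rough illustration of the parameter this PR wires up (not code taken from the PR itself), a call passing max_completion_tokens through litellm might look like the sketch below; the model name and token budget are placeholders.

```python
# Hedged sketch of max_completion_tokens passed through litellm; per the commits
# above it is mapped to the provider-specific field (e.g. maxTokens for Bedrock
# Converse). Model name and budget are illustrative.
import litellm

response = litellm.completion(
    model="gpt-4o-mini",  # any provider the PR covers: OpenAI, Anthropic, Bedrock Converse, ...
    messages=[{"role": "user", "content": "Give me one sentence about LiteLLM."}],
    max_completion_tokens=128,
)
print(response.choices[0].message.content)
```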
Krrish Dholakia
8c5ff150f6 fix(huggingface_restapi.py): fix tests 2024-08-23 21:40:27 -07:00
Krish Dholakia
f458f565af
Merge pull request #5292 from OgnjenFrancuski/main
Update SSL verification
2024-08-23 20:42:35 -07:00
Krrish Dholakia
874d58fe8a fix(factory.py): support 'add_generation_prompt' field for hf chat templates
Fixes https://github.com/BerriAI/litellm/pull/5178#issuecomment-2306362008
2024-08-23 08:06:21 -07:00
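
For context on what the 'add_generation_prompt' field controls in a Hugging Face chat template, here is a sketch using the transformers tokenizer directly rather than litellm's factory.py; the model id is illustrative.

```python
# What 'add_generation_prompt' does in a Hugging Face chat template, shown via
# the transformers tokenizer rather than litellm's factory.py.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")  # illustrative model
messages = [{"role": "user", "content": "What does LiteLLM do?"}]

# With add_generation_prompt=True the rendered prompt ends with the assistant-turn
# opener, so the model starts a reply instead of continuing the user's message.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```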
Ognjen Francuski
31aac9a1e4 Update Huggingface provider to utilize the SSL verification through 'SSL_VERIFY' env var or 'litellm.ssl_verify'. 2024-08-20 14:55:12 +02:00
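
A minimal sketch of the two configuration paths this commit describes; the model id is illustrative and the internal plumbing may differ.

```python
# Minimal sketch of the two knobs the commit above names: the SSL_VERIFY env var
# and the litellm.ssl_verify flag. Model id is illustrative.
import os
import litellm

litellm.ssl_verify = False           # module-level flag
os.environ["SSL_VERIFY"] = "False"   # or the env var named in the commit message

response = litellm.completion(
    model="huggingface/HuggingFaceH4/zephyr-7b-beta",
    messages=[{"role": "user", "content": "hello"}],
)
```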
Krrish Dholakia
0cf81eba62 fix(huggingface_restapi.py): support passing 'wait_for_model' param on completion calls 2024-08-09 09:25:19 -07:00
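
For reference, this is roughly what wait_for_model looks like at the HTTP level of the Hugging Face Inference API; the payload shape follows the public HF REST docs, while the litellm-side plumbing is what the commit above adds.

```python
# Rough sketch of the HF Inference API request carrying wait_for_model; token elided.
import httpx

payload = {
    "inputs": "Hello, my name is",
    "parameters": {"max_new_tokens": 32},
    "options": {"wait_for_model": True},  # block until the model is loaded instead of returning a 503
}
resp = httpx.post(
    "https://api-inference.huggingface.co/models/gpt2",  # illustrative model endpoint
    headers={"Authorization": "Bearer hf_..."},
    json=payload,
    timeout=60,
)
print(resp.json())
```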
Krrish Dholakia
466dc9f32a fix(huggingface_restapi.py): fix hf embeddings optional param processing 2024-08-09 09:10:56 -07:00
Krrish Dholakia
51ccfa9e77 fix(huggingface_restapi.py): fixes issue where 'wait_for_model' was not being passed as expected 2024-08-09 08:36:35 -07:00
Krrish Dholakia
43958907bf fix(huggingface_restapi.py): fix linting errors 2024-07-30 14:33:08 -07:00
Krrish Dholakia
69afbc6091 feat(huggingface_restapi.py): Support multiple hf embedding types + async hf embeddings
Closes https://github.com/BerriAI/litellm/issues/3261
2024-07-30 13:32:03 -07:00
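
A minimal sketch of the async HF embeddings path this commit adds; the model id and inputs are illustrative, litellm.aembedding is the public async entry point.

```python
# Hedged sketch of async Hugging Face embeddings via litellm.aembedding.
import asyncio
import litellm

async def main():
    resp = await litellm.aembedding(
        model="huggingface/sentence-transformers/all-MiniLM-L6-v2",  # illustrative model
        input=["hello world", "litellm embeddings"],
    )
    print(len(resp.data), "embedding vectors returned")

asyncio.run(main())
```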
Krrish Dholakia
6e9f048618 fix: move to using pydantic obj for setting values 2024-07-11 13:18:36 -07:00
Krrish Dholakia
8117af664c fix(huggingface_restapi.py): fix task extraction from model name 2024-05-15 07:28:19 -07:00
Krrish Dholakia
d4d175030f docs(huggingface.md): add text-classification to huggingface docs 2024-05-10 14:39:14 -07:00
Krrish Dholakia
c17f221b89 test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi 2024-05-10 14:07:01 -07:00
Krrish Dholakia
9083d8e490 fix: fix linting errors 2024-05-09 17:55:27 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
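
A hedged sketch of calling the Predibase provider this commit introduces; the "predibase/<model>" naming, the env var, and any extra settings (such as a tenant id) are assumptions, not taken from the commit.

```python
# Assumed usage of the Predibase provider; model id, env var, and extra settings
# are illustrative and may differ from what the commit actually requires.
import os
import litellm

os.environ["PREDIBASE_API_KEY"] = "pb_..."  # placeholder credential
response = litellm.completion(
    model="predibase/llama-3-8b-instruct",  # illustrative model id
    messages=[{"role": "user", "content": "hi"}],
)
```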
Krrish Dholakia
8d66d813c1 fix(huggingface_restapi.py): fix hf streaming issue 2024-03-04 21:16:41 -08:00
Krrish Dholakia
873ddde924 fix(huggingface_restapi.py): fix huggingface streaming error raising 2024-03-04 09:32:41 -08:00
Krrish Dholakia
194c823783 fix(huggingface_restapi.py): return initial hf error 2024-02-24 10:46:59 -08:00
Krrish Dholakia
5f9e141d1e fix(huggingface_restapi.py): return streamed response correctly 2024-02-16 13:25:13 -08:00
Krrish Dholakia
1b844aafdc fix(huggingface_restapi.py): fix hf streaming to raise exceptions 2024-02-15 21:25:12 -08:00
Krrish Dholakia
b1fd0a164b fix(huggingface_restapi.py): support timeouts for huggingface + openai text completions
https://github.com/BerriAI/litellm/issues/1334
2024-01-08 11:40:56 +05:30
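
A sketch of the timeout behaviour referenced above (issue #1334): a per-request timeout passed through litellm; the model and value are illustrative.

```python
# Hedged sketch of a per-request timeout on a Hugging Face completion call.
import litellm

response = litellm.completion(
    model="huggingface/bigcode/starcoder",  # illustrative model
    messages=[{"role": "user", "content": "def fib(n):"}],
    timeout=30,  # seconds before the HF / OpenAI text-completion request is aborted
)
```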
Krrish Dholakia
4905929de3 refactor: add black formatting 2023-12-25 14:11:20 +05:30
ishaan-jaff
20b5505476 (feat) show POST request for HF embeddings 2023-12-16 13:09:49 +05:30
Krrish Dholakia
add153d110 fix(huggingface_restapi.py): add support for additional hf embedding formats 2023-12-15 21:02:41 -08:00
Krrish Dholakia
71e64c34cb fix(huggingface_restapi.py): raise better exceptions for unprocessable hf responses 2023-12-05 07:28:21 -08:00
Ori Kotek
e74ac03169
Do not timeout when calling HF through acomplete 2023-11-23 15:56:59 +02:00
Krrish Dholakia
1218121e47 fix(huggingface_restapi.py): fix linting errors 2023-11-21 10:05:35 -08:00
Krrish Dholakia
846a32ca87 fix(huggingface_restapi.py): fixing formatting 2023-11-21 09:57:26 -08:00
Krrish Dholakia
6892fd8b51 fix(huggingface_restapi.py): fix huggingface response format 2023-11-21 09:57:26 -08:00
Krrish Dholakia
a89b8f55e3 fix(huggingface_restapi.py): handle generate text output 2023-11-21 09:57:26 -08:00
ishaan-jaff
50f883a2fb (fix) pydantic errors with response.time 2023-11-20 18:28:19 -08:00
Krrish Dholakia
03efc9185e fix(huggingface_restapi.py): async implementation 2023-11-15 16:54:15 -08:00
Krrish Dholakia
bcea28e2e4 fix(utils): fixing exception mapping 2023-11-15 15:51:17 -08:00
Krrish Dholakia
a59494571f fix(huggingface_restapi.py): fix linting errors 2023-11-15 15:34:21 -08:00
Krrish Dholakia
1a705bfbcb refactor(huggingface_restapi.py): moving async completion + streaming to real async calls 2023-11-15 15:14:21 -08:00
ishaan-jaff
f650be4fee (feat) completion debug view HF POST request 2023-11-14 17:57:41 -08:00
Krrish Dholakia
45b6f8b853 refactor: fixing linting issues 2023-11-11 18:52:28 -08:00
Krrish Dholakia
4f42beb9d9 refactor(huggingface,-anthropic,-replicate,-sagemaker): making huggingface, anthropic, replicate, sagemaker compatible with openai v1 sdk 2023-11-11 17:38:15 -08:00
Krrish Dholakia
547598a134 refactor(bedrock.py-+-cohere.py): making bedrock and cohere compatible with openai v1 sdk 2023-11-11 17:33:19 -08:00
ishaan-jaff
a404b0fc3b (fix) remove errant print from hf 2023-11-08 11:49:15 -08:00
ishaan-jaff
3c67de7f04 (fix) hf don't fail when logprob is None 2023-11-06 14:22:09 -08:00
Krrish Dholakia
65c01eae23 fix(huggingface_restapi.py): output parsing chat template models 2023-11-06 11:43:12 -08:00
Krrish Dholakia
7c46e85ed6 refactor(bedrock.py): better exception mapping for bedrock + huggingface 2023-11-04 16:12:12 -07:00
Krrish Dholakia
ab54262d37 fix(timeout.py): import errors 2023-11-04 16:05:14 -07:00
Krrish Dholakia
5b3978eff4 fix(main.py): fixing print_verbose 2023-11-04 14:41:34 -07:00
ishaan-jaff
df57e9247a (fix) hf calculating usage non blocking 2023-11-03 18:03:19 -07:00
Krrish Dholakia
4e1885734a refactor(proxy_server.py): print statement showing how to add debug for logs 2023-11-03 17:41:14 -07:00
ishaan-jaff
6fc0c74878 (fix) remove errant print statements 2023-11-03 13:02:52 -07:00