Krish Dholakia
9b7ebb6a7d
build(pyproject.toml): add new dev dependencies - for type checking ( #9631 )
...
* build(pyproject.toml): add new dev dependencies - for type checking
* build: reformat files to fit black
* ci: reformat to fit black
* ci(test-litellm.yml): make test runs clearer
* build(pyproject.toml): add ruff
* fix: fix ruff checks
* build(mypy/): fix mypy linting errors
* fix(hashicorp_secret_manager.py): fix passing cert for tls auth
* build(mypy/): resolve all mypy errors
* test: update test
* fix: fix black formatting
* build(pre-commit-config.yaml): use poetry run black
* fix(proxy_server.py): fix linting error
* fix: fix ruff safe representation error
2025-03-29 11:02:13 -07:00
Krish Dholakia
6fd18651d1
Support litellm.api_base for vertex_ai + gemini/ across completion, embedding, image_generation ( #9516 )
...
* test(tests): add unit testing for litellm_proxy integration
* fix(cost_calculator.py): fix tracking cost in sdk when calling proxy
* fix(main.py): respect litellm.api_base on `vertex_ai/` and `gemini/` routes
* fix(main.py): consistently support custom api base across gemini + vertexai on embedding + completion
* feat(vertex_ai/): test
* fix: fix linting error
* test: set api base as None before starting loadtest
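A minimal sketch of what the api_base bullets above describe, assuming a custom endpoint at http://localhost:4000; the URL and model names are illustrative, not taken from this PR:

    import litellm

    # Per the bullets above, gemini/ and vertex_ai/ routes are described as
    # honoring litellm.api_base for both completion and embedding calls.
    litellm.api_base = "http://localhost:4000"  # assumed custom/proxy endpoint

    response = litellm.completion(
        model="gemini/gemini-pro",
        messages=[{"role": "user", "content": "Hello"}],
    )

    embedding = litellm.embedding(
        model="vertex_ai/text-embedding-004",
        input=["Hello"],
    )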
2025-03-25 23:46:20 -07:00
James Guthrie
437dbe7246
fix: VertexAI outputDimensionality configuration
...
VertexAI's API documentation [1] is an absolute mess. In it, they
describe the parameter to configure output dimensionality as
`output_dimensionality`. In the API example, they switch to camel case `outputDimensionality`, which is the correct variant.
[1]: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/text-embeddings-api#generative-ai-get-text-embedding-drest
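As a rough illustration of the naming fix described above, a sketch of an embedding call that would exercise the corrected field; the use of the OpenAI-style `dimensions` argument and its mapping onto `outputDimensionality` is an assumption, not a confirmed detail of this commit:

    import litellm

    # The `dimensions` argument is assumed here to be translated into the
    # camelCase `outputDimensionality` field that the Vertex text-embeddings
    # API expects in its request body, roughly:
    #   {"instances": [...], "parameters": {"outputDimensionality": 256}}
    response = litellm.embedding(
        model="vertex_ai/text-embedding-004",
        input=["a sentence to embed"],
        dimensions=256,
    )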
2025-03-19 11:07:36 +01:00
Krish Dholakia
88eedb22b9
vertex ai anthropic thinking param support ( #8853 )
...
* fix(vertex_llm_base.py): handle credentials passed in as dictionary
* fix(router.py): support vertex credentials as json dict
* test(test_vertex.py): allow easier testing by mocking the anthropic thinking response for vertex ai
* test(vertex_ai_partner_models/): don't remove "@" from the model name, since stripping it breaks anthropic cost calculation
* test: move testing
* fix: fix linting error
* fix: fix linting error
* fix(vertex_ai_partner_models/main.py): split @ for codestral model
* test: fix test
* fix: fix stripping "@" on mistral models
* fix: fix test
* test: fix test
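A hedged sketch of the two behaviors listed above, credentials passed as a dict and the anthropic thinking parameter on a vertex_ai partner model; the model id and credential fields are placeholders, and the exact kwarg names are assumptions based on litellm's usual conventions:

    import litellm

    # Credentials passed directly as a dict (the fix above describes handling
    # this form); every field value here is a placeholder.
    vertex_credentials = {
        "type": "service_account",
        "project_id": "my-project",
        "private_key": "-----BEGIN PRIVATE KEY-----\n...",
        "client_email": "sa@my-project.iam.gserviceaccount.com",
    }

    response = litellm.completion(
        # "@" kept in the model name; per the notes above, stripping it breaks
        # anthropic cost calculation.
        model="vertex_ai/claude-3-7-sonnet@20250219",
        messages=[{"role": "user", "content": "Work through 17 * 23 step by step."}],
        vertex_credentials=vertex_credentials,
        thinking={"type": "enabled", "budget_tokens": 1024},
    )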
2025-02-26 21:37:18 -08:00
Krish Dholakia
dfbbf0bde8
fix: dictionary changed size during iteration error ( #8327 ) ( #8341 )
...
Co-authored-by: Joey Feldberg <joeyfeldberg@users.noreply.github.com>
Co-authored-by: Joey Feldberg <12495578+joeyfeldberg@users.noreply.github.com>
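The error in the title arises from mutating a dict while iterating over it; the snippet below illustrates the generic fix pattern (iterate over a snapshot), not the exact code this PR touched:

    callbacks = {"slack": True, "langfuse": False, "datadog": False}

    # Deleting keys inside `for name in callbacks:` raises
    # "RuntimeError: dictionary changed size during iteration".
    # Iterating over a snapshot of the keys avoids it:
    for name in list(callbacks.keys()):
        if not callbacks[name]:
            del callbacks[name]

    print(callbacks)  # {'slack': True}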
2025-02-07 16:20:28 -08:00
Ishaan Jaff
c7f14e936a
(code quality) run ruff rule to ban unused imports ( #7313 )
...
* remove unused imports
* fix AmazonConverseConfig
* fix test
* fix import
* ruff check fixes
* test fixes
* fix testing
* fix imports
2024-12-19 12:33:42 -08:00
Krish Dholakia
516c2a6a70
Litellm remove circular imports ( #7232 )
...
* fix(utils.py): initial commit to remove circular imports - moves llmproviders to utils.py
* fix(router.py): fix 'litellm.EmbeddingResponse' import from router.py
* refactor: fix litellm.ModelResponse import on pass through endpoints
* refactor(litellm_logging.py): fix circular import for custom callbacks literal
* fix(factory.py): fix circular imports inside prompt factory
* fix(cost_calculator.py): fix circular import for 'litellm.Usage'
* fix(proxy_server.py): fix potential circular import with `litellm.Router`
* fix(proxy/utils.py): fix potential circular import in `litellm.Router`
* fix: remove circular imports in 'auth_checks' and 'guardrails/'
* fix(prompt_injection_detection.py): fix router import
* fix(vertex_passthrough_logging_handler.py): fix potential circular imports in vertex pass through
* fix(anthropic_pass_through_logging_handler.py): fix potential circular imports
* fix(slack_alerting.py-+-ollama_chat.py): fix modelresponse import
* fix(base.py): fix potential circular import
* fix(handler.py): fix potential circular ref in codestral + cohere handlers
* fix(azure.py): fix potential circular imports
* fix(gpt_transformation.py): fix modelresponse import
* fix(litellm_logging.py): add logging base class - simplify typing
makes it easy for other files to type check the logging obj without introducing circular imports
* fix(azure_ai/embed): fix potential circular import on handler.py
* fix(databricks/): fix potential circular imports in databricks/
* fix(vertex_ai/): fix potential circular imports on vertex ai embeddings
* fix(vertex_ai/image_gen): fix import
* fix(watsonx-+-bedrock): cleanup imports
* refactor(anthropic-pass-through-+-petals): cleanup imports
* refactor(huggingface/): cleanup imports
* fix(ollama-+-clarifai): cleanup circular imports
* fix(openai_like/): fix import
* fix(openai_like/): fix embedding handler
cleanup imports
* refactor(openai.py): cleanup imports
* fix(sagemaker/transformation.py): fix import
* ci(config.yml): add circular import test to ci/cd
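One recurring pattern in the list above is letting modules type-check against the logging object without importing the heavyweight logging module at runtime (the "logging base class" bullet). A minimal sketch of that idea; the class, function, and module names are illustrative, not litellm's actual ones:

    # logging_base.py (illustrative): a lightweight base class with no heavy
    # imports, so any module can import it for type hints without a cycle.
    class LoggingBase:
        def log_event(self, event: dict) -> None:
            raise NotImplementedError


    # provider_handler.py (illustrative): annotate against the base class only;
    # the concrete logging object, defined in the heavyweight module, subclasses
    # LoggingBase, so this type-checks without importing that module here.
    def log_success(logging_obj: LoggingBase, response_text: str) -> None:
        logging_obj.log_event({"response": response_text})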
2024-12-14 16:28:34 -08:00
Ishaan Jaff
21003c4337
Code Quality Improvement - use vertex_ai/ as folder name for VertexAI ( #7166 )
...
* fix rename vertex ai
* run ci/cd again
2024-12-11 00:32:41 -08:00