* test: fix import for test
* fix: fix bad error string
* docs: cleanup files docs
* fix(files/main.py): cleanup error string
* style: initial commit with a provider/config pattern for files api
Google AI Studio Files API onboarding
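As context for the provider/config pattern referenced above, here is a minimal illustrative sketch; the class and method names are hypothetical and not litellm's actual interface. The idea is that each provider implements a files config that maps an OpenAI-style create-file request onto the provider's wire format and maps the response back.

```python
# Hypothetical sketch of a provider/config pattern for a files API.
# Names (BaseFilesConfig, GoogleAIStudioFilesConfig) are illustrative only.
from abc import ABC, abstractmethod
from typing import Any, Dict


class BaseFilesConfig(ABC):
    """Provider-agnostic interface for create-file calls."""

    @abstractmethod
    def transform_create_file_request(
        self, file: bytes, purpose: str
    ) -> Dict[str, Any]:
        """Map an OpenAI-style request onto the provider's wire format."""

    @abstractmethod
    def transform_create_file_response(
        self, raw_response: Dict[str, Any]
    ) -> Dict[str, Any]:
        """Map the provider's response back to an OpenAI-style file object."""


class GoogleAIStudioFilesConfig(BaseFilesConfig):
    def transform_create_file_request(self, file, purpose):
        # Google AI Studio's upload endpoint takes raw bytes plus metadata.
        return {"file": {"display_name": purpose}, "data": file}

    def transform_create_file_response(self, raw_response):
        # Surface the Gemini file URI as the OpenAI-style file id.
        return {"id": raw_response["file"]["uri"], "purpose": "user_data"}
```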
* fix: test
* feat(gemini/files/transformation.py): support gemini files api response transformation
* fix(gemini/files/transformation.py): return file id as gemini uri
allows the file id to be passed into a chat completion request, just like OpenAI
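A hedged usage sketch of the flow described above: upload a file against the Gemini provider, then reference the returned id in a chat completion via the OpenAI-style `file` content part. The file path, purpose, and model name are placeholders.

```python
import litellm

# Upload a PDF via the files API; litellm returns an OpenAI-style file object
# whose id carries the Gemini file URI.
uploaded = litellm.create_file(
    file=open("report.pdf", "rb"),      # placeholder path
    purpose="user_data",                # placeholder purpose
    custom_llm_provider="gemini",
)

# Pass the file id into a chat completion request, just like with OpenAI.
response = litellm.completion(
    model="gemini/gemini-1.5-flash",    # placeholder model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize this document."},
                {"type": "file", "file": {"file_id": uploaded.id}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```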
* feat(llm_http_handler.py): support async route for files api on llm_http_handler
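The async route presumably mirrors the sync one; a minimal sketch, assuming litellm exposes `acreate_file` as the async counterpart of `create_file`:

```python
import asyncio

import litellm


async def upload() -> None:
    # Assumed async counterpart of create_file, routed through the
    # async HTTP handler.
    uploaded = await litellm.acreate_file(
        file=open("report.pdf", "rb"),   # placeholder path
        purpose="user_data",
        custom_llm_provider="gemini",
    )
    print(uploaded.id)


asyncio.run(upload())
```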
* fix: fix linting errors
* fix: fix model info check
* fix: fix ruff errors
* fix: fix linting errors
* Revert "fix: fix linting errors"
This reverts commit 926a5a527f.
* fix: fix linting errors
* test: fix test
* test: fix tests
* report key alias to aim
* send litellm version to aim
* Update docs
* blacken
* add docs
* Add info section about virtual-key-specific guardrails
* sort guardrails alphabetically
* fix ruff
* build(README.md): initial commit adding a separate folder for additional proxy files. Meant to reduce the size of the core package
* build(litellm-proxy-extras/): new pip package for storing migration files
allows the litellm proxy to use migration files without adding them to the core repo
* build(litellm-proxy-extras/): cleanup pyproject.toml
* build: move prisma migration files inside new proxy extras package
* build(run_migration.py): update script to write to correct folder
* build(proxy_cli.py): load in migration files from litellm-proxy-extras
Closes https://github.com/BerriAI/litellm/issues/9558
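One way a consumer can locate migration files that ship inside a separate pip package is via `importlib.resources`. This is only an illustrative sketch of that idea, not litellm's actual loading code; the `litellm_proxy_extras` module path and the `migrations/` layout are assumptions.

```python
from importlib import resources

# Resolve the packaged migrations folder shipped inside litellm_proxy_extras.
try:
    extras_root = resources.files("litellm_proxy_extras")
except ModuleNotFoundError:
    extras_root = None

if extras_root is not None and (extras_root / "migrations").is_dir():
    migrations_dir = extras_root / "migrations"
    print(f"Using packaged migrations from {migrations_dir}")
else:
    print("litellm-proxy-extras not installed; falling back to local migrations")
```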
* build: add MIT license to litellm-proxy-extras
* test: update test
* fix: fix schema
* bump: version 0.1.0 → 0.1.1
* build(publish-proxy-extras.sh): add script for publishing new proxy-extras version
* build(liccheck.ini): add litellm-proxy-extras to authorized packages
* fix(litellm-proxy-extras/utils.py): move prisma migrate logic inside extra proxy pkg
easier since the migrations folder is already there
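Running the migrations from inside the extras package then reduces to invoking the prisma CLI against the bundled schema; a hedged sketch only, with the module name, directory layout, and schema filename all assumed:

```python
import subprocess
from pathlib import Path

import litellm_proxy_extras  # assumed module name of the extras package

# The schema and migrations ship next to the package, so the migrate logic
# can resolve them relative to its own location.
pkg_dir = Path(litellm_proxy_extras.__file__).parent
schema_path = pkg_dir / "schema.prisma"  # assumed filename

# `prisma migrate deploy` applies committed migrations without creating new ones.
subprocess.run(
    ["prisma", "migrate", "deploy", "--schema", str(schema_path)],
    check=True,
)
```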
* build(pre-commit-config.yaml): add litellm_proxy_extras to ci tests
* docs(config_settings.md): document new env var
* build(pyproject.toml): bump relevant files when litellm-proxy-extras version changed
* build(pre-commit-config.yaml): run poetry check on litellm-proxy-extras as well
* fix: initial commit for adding provider model discovery to gemini
* feat(gemini/): add model discovery for gemini/ route
* docs(set_keys.md): update docs to show you can check available gemini models as well
* feat(anthropic/): add model discovery for anthropic api key
* feat(xai/): add model discovery for XAI
enables checking which models an xAI key can call
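Per the set_keys.md update above, model discovery should make key-scoped model listing available through `get_valid_models`; a minimal sketch, assuming the `check_provider_endpoint` flag described in that docs change is what triggers the live provider call:

```python
from litellm.utils import get_valid_models

# With GEMINI_API_KEY / ANTHROPIC_API_KEY / XAI_API_KEY set in the environment,
# query each provider's endpoint for the models the key can actually call.
# The check_provider_endpoint flag is assumed from the set_keys.md update.
valid_models = get_valid_models(check_provider_endpoint=True)
print(valid_models)
```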
* ci: bump ci config yml
* fix(topaz/common_utils.py): fix linting error
* fix: fix linting error for python38