llama-stack-mirror/llama_stack/providers/remote/inference/gemini
Ihar Hrachyshka 1deb95f922 chore: enable pyupgrade fixes
Schema reflection code needed a minor adjustment to handle UnionTypes and collections.abc.AsyncIterator. (Both are preferred for latest Python releases.)

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-05-01 17:02:13 -04:00
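
The commit body refers to the fact that pyupgrade rewrites `typing.Union[X, Y]` annotations to `X | Y` (which is `types.UnionType` at runtime) and `typing.AsyncIterator` to `collections.abc.AsyncIterator`, so reflection code that inspects annotations has to recognize both spellings. A minimal sketch of such a check, assuming illustrative helper names rather than the repository's actual reflection code:

```python
import types
import typing
from collections.abc import AsyncIterator
from typing import Union, get_args, get_origin


def is_union(annotation) -> bool:
    """True for both Union[X, Y] and the X | Y (types.UnionType) spelling."""
    origin = get_origin(annotation)
    return origin is Union or origin is types.UnionType


def is_async_iterator(annotation) -> bool:
    """True for AsyncIterator[...] from either typing or collections.abc."""
    return get_origin(annotation) is AsyncIterator


# Both union spellings expose the same member types via get_args().
assert is_union(Union[int, str]) and is_union(int | str)
assert get_args(int | str) == (int, str)

# typing.AsyncIterator[str] and collections.abc.AsyncIterator[str] share an origin.
assert is_async_iterator(typing.AsyncIterator[str])
assert is_async_iterator(AsyncIterator[str])
```
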
__init__.py    chore: enable pyupgrade fixes                                          2025-05-01 17:02:13 -04:00
config.py      chore: enable pyupgrade fixes                                          2025-05-01 17:02:13 -04:00
gemini.py      feat(providers): Groq now uses LiteLLM openai-compat (#1303)          2025-02-27 13:16:50 -08:00
models.py      feat: add (openai, anthropic, gemini) providers via litellm (#1267)   2025-02-25 22:07:33 -08:00