llama-stack-mirror/llama_stack/providers/utils/inference
Latest commit: d12f195f56 by Charlie Doern
feat: drop python 3.10 support (#2469)
# What does this PR do?

Dropped Python 3.10 support, updated pyproject and dependencies, and removed
some blocks of code with special handling for `enum.StrEnum`.

Closes #2458

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-06-19 12:07:14 +05:30
__init__.py             | chore: enable pyupgrade fixes (#1806)                          | 2025-05-01 14:23:50 -07:00
embedding_mixin.py      | feat: New OpenAI compat embeddings API (#2314)                 | 2025-05-31 22:11:47 -07:00
inference_store.py      | feat: support pagination in inference/responses stores (#2397) | 2025-06-16 22:43:35 -07:00
litellm_openai_mixin.py | feat: Add suffix to openai_completions (#2449)                 | 2025-06-13 16:06:06 -07:00
model_registry.py       | chore: enable pyupgrade fixes (#1806)                          | 2025-05-01 14:23:50 -07:00
openai_compat.py        | feat: Add suffix to openai_completions (#2449)                 | 2025-06-13 16:06:06 -07:00
prompt_adapter.py       | chore: more mypy fixes (#2029)                                 | 2025-05-06 09:52:31 -07:00
stream_utils.py         | feat: drop python 3.10 support (#2469)                         | 2025-06-19 12:07:14 +05:30