llama-stack-mirror/src/llama_stack
Ashwin Bharambe 0944b6a734 fix(mypy): use correct OpenAIChatCompletionChunk import in vllm
Import OpenAIChatCompletionChunk from llama_stack.apis.inference
instead of aliasing it from the openai package, so the override matches
the parent class signature.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 23:14:49 -07:00
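The principle behind this fix can be sketched with a minimal, self-contained example (the class names below are hypothetical stand-ins, not the actual llama-stack or vllm classes): mypy checks that an overriding method's annotations are compatible with the parent's, so the subclass must annotate with the same type the parent's signature uses rather than a structurally similar type aliased from another package.

```python
# Hypothetical sketch of the mypy override rule the commit addresses.
# "Chunk" stands in for llama_stack.apis.inference.OpenAIChatCompletionChunk.

class Chunk:
    """Stand-in for the chunk type declared in the parent's signature."""


class InferenceProvider:
    # Parent class: its signature is annotated with Chunk.
    def stream(self) -> Chunk:
        return Chunk()


class VLLMAdapter(InferenceProvider):
    # Annotating the override with the same Chunk type the parent uses
    # keeps mypy happy; annotating with a different class aliased from
    # another package (even one with the same shape) would be flagged
    # as an incompatible override.
    def stream(self) -> Chunk:
        return Chunk()
```

Running `mypy` on a module like this passes, whereas swapping the return annotation in `VLLMAdapter.stream` for an unrelated aliased class would produce an "incompatible with supertype" error.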
| Name | Last commit | Date |
|------|-------------|------|
| apis | chore!: BREAKING CHANGE: vector_db_id -> vector_store_id (#3923) | 2025-10-27 14:26:06 -07:00 |
| cli | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| core | fix(mypy): resolve routes and type narrowing errors | 2025-10-27 23:08:24 -07:00 |
| distributions | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| models | fix(mypy): resolve union type and list annotation errors | 2025-10-27 23:04:59 -07:00 |
| providers | fix(mypy): use correct OpenAIChatCompletionChunk import in vllm | 2025-10-27 23:14:49 -07:00 |
| strong_typing | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| testing | fix(mypy): resolve provider utility and testing type issues | 2025-10-27 22:53:48 -07:00 |
| ui | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| __init__.py | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| env.py | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| log.py | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |
| schema_utils.py | chore(package): migrate to src/ layout (#3920) | 2025-10-27 12:02:21 -07:00 |