llama-stack-mirror/llama_stack/providers/utils/inference
ehhuang 549812f51e
feat: implement get chat completions APIs (#2200)
# What does this PR do?
* Provides a SQLite-backed implementation of the chat completion retrieval APIs introduced in
https://github.com/meta-llama/llama-stack/pull/2145.
* Introduces a SqlStore API (llama_stack/providers/utils/sqlstore/api.py)
along with its first SQLite implementation; a rough sketch of this kind of store interface follows below.
* Pagination support will be added in a future PR.
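
For orientation, here is a minimal, simplified (synchronous) sketch of what a row-oriented store interface with a SQLite backing could look like for persisting chat completion records. The names used here (`SqlStore`, `SqliteSqlStore`, `insert`, `fetch_one`) are illustrative assumptions and do not necessarily match the actual API defined in sqlstore/api.py.

```python
# Hypothetical sketch only: names and signatures are assumptions, not the
# actual SqlStore API from llama_stack/providers/utils/sqlstore/api.py.
import json
import sqlite3
from abc import ABC, abstractmethod
from typing import Any


class SqlStore(ABC):
    """Abstract store for persisting records such as chat completions."""

    @abstractmethod
    def insert(self, table: str, row: dict[str, Any]) -> None: ...

    @abstractmethod
    def fetch_one(self, table: str, key: str, value: str) -> dict[str, Any] | None: ...


class SqliteSqlStore(SqlStore):
    """SQLite-backed implementation storing each row as a JSON blob keyed by id."""

    def __init__(self, db_path: str = ":memory:") -> None:
        self.conn = sqlite3.connect(db_path)

    def _ensure_table(self, table: str) -> None:
        # Lazily create the table; real code would validate identifiers.
        self.conn.execute(
            f"CREATE TABLE IF NOT EXISTS {table} (id TEXT PRIMARY KEY, data TEXT)"
        )

    def insert(self, table: str, row: dict[str, Any]) -> None:
        self._ensure_table(table)
        self.conn.execute(
            f"INSERT OR REPLACE INTO {table} (id, data) VALUES (?, ?)",
            (row["id"], json.dumps(row)),
        )
        self.conn.commit()

    def fetch_one(self, table: str, key: str, value: str) -> dict[str, Any] | None:
        self._ensure_table(table)
        cur = self.conn.execute(f"SELECT data FROM {table} WHERE {key} = ?", (value,))
        found = cur.fetchone()
        return json.loads(found[0]) if found else None


if __name__ == "__main__":
    store = SqliteSqlStore()
    store.insert("chat_completions", {"id": "chatcmpl-123", "model": "llama3.2:3b"})
    print(store.fetch_one("chat_completions", "id", "chatcmpl-123"))
```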

## Test Plan
Unit tests on the SQL store:
(screenshot of passing unit tests: https://github.com/user-attachments/assets/9b8b7ec8-632b-4667-8127-5583426b2e29)


Integration test:
```
INFERENCE_MODEL="llama3.2:3b-instruct-fp16" llama stack build --template ollama --image-type conda --run
```
```
LLAMA_STACK_CONFIG=http://localhost:5001 INFERENCE_MODEL="llama3.2:3b-instruct-fp16" python -m pytest -v tests/integration/inference/test_openai_completion.py --text-model "llama3.2:3b-instruct-fp16" -k 'inference_store and openai'
```
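
As a rough illustration of what the retrieval path enables, the snippet below creates a chat completion against a running stack and then fetches it back by id through the OpenAI-compatible client. The `base_url` prefix, the placeholder API key, and the availability of `chat.completions.retrieve` on the installed `openai` client are assumptions of this sketch, not a statement of the exact surface the stack exposes.

```python
# Illustrative only: assumes a llama-stack server on localhost:5001 exposing an
# OpenAI-compatible surface under this base_url, and an openai client version
# that supports chat.completions.retrieve for stored completions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:5001/v1/openai/v1", api_key="none")

# Create a chat completion; with the inference store enabled it is persisted.
created = client.chat.completions.create(
    model="llama3.2:3b-instruct-fp16",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)

# Fetch the same completion back by id via the retrieval API.
fetched = client.chat.completions.retrieve(created.id)
print(fetched.id, fetched.choices[0].message.content)
```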
2025-05-21 22:21:52 -07:00
| File | Last commit | Date |
|---|---|---|
| __init__.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| embedding_mixin.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| inference_store.py | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
| litellm_openai_mixin.py | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| model_registry.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| openai_compat.py | fix: multiple tool calls in remote-vllm chat_completion (#2161) | 2025-05-15 11:23:29 -07:00 |
| prompt_adapter.py | chore: more mypy fixes (#2029) | 2025-05-06 09:52:31 -07:00 |
| stream_utils.py | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |