llama-stack/llama_stack/providers/remote/inference
Sarthak Deshpande 921f8b1125
chore: Together async client (#1510)
# What does this PR do?
Uses the Together async client instead of the sync client.
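
For context, a minimal sketch of what the swap looks like, assuming the `together` SDK's `AsyncTogether` client (the model name here is illustrative, not taken from this PR):

```python
import asyncio

# Previously: from together import Together (blocking client)
from together import AsyncTogether


async def main() -> None:
    # The async client mirrors the sync client's surface, but calls are
    # awaited rather than blocking the event loop. Reads TOGETHER_API_KEY
    # from the environment by default.
    client = AsyncTogether()
    response = await client.chat.completions.create(
        model="meta-llama/Llama-3.3-70B-Instruct-Turbo",  # illustrative model
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)


asyncio.run(main())
```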

## Test Plan
The command to run the tests is in the image below. (2 tests fail, and they
were also failing for the old stable version with the same errors.)
<img width="1689" alt="image"
src="https://github.com/user-attachments/assets/503db720-5379-425d-9844-0225010e41a1"
/>


---------

Co-authored-by: sarthakdeshpande <sarthak.deshpande@engati.com>
2025-03-10 15:25:01 -07:00
| Name | Latest commit | Date |
| --- | --- | --- |
| anthropic | feat(providers): Groq now uses LiteLLM openai-compat (#1303) | 2025-02-27 13:16:50 -08:00 |
| bedrock | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| cerebras | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| databricks | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| fireworks | fix: Use re-entrancy and concurrency safe context managers for provider data (#1498) | 2025-03-08 22:56:30 -08:00 |
| gemini | feat(providers): Groq now uses LiteLLM openai-compat (#1303) | 2025-02-27 13:16:50 -08:00 |
| groq | fix: register provider model name and HF alias in run.yaml (#1304) | 2025-02-27 16:39:23 -08:00 |
| nvidia | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| ollama | feat(logging): implement category-based logging (#1362) | 2025-03-07 11:34:30 -08:00 |
| openai | feat(providers): Groq now uses LiteLLM openai-compat (#1303) | 2025-02-27 13:16:50 -08:00 |
| passthrough | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| runpod | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| sambanova | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| sample | build: format codebase imports using ruff linter (#1028) | 2025-02-13 10:06:21 -08:00 |
| tgi | fix: solve ruff B008 warnings (#1444) | 2025-03-06 16:48:35 -08:00 |
| together | chore: Together async client (#1510) | 2025-03-10 15:25:01 -07:00 |
| vllm | fix: Swap to AsyncOpenAI client in remote vllm provider (#1459) | 2025-03-07 14:48:00 -05:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |