docs: Add OpenAI, Anthropic, Gemini to API providers table (#1617)
# What does this PR do?

These are supported via https://github.com/meta-llama/llama-stack/pull/1267.

cc @ashwinb

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
This commit is contained in:

parent 28aade9a27
commit a1bb7c8d82

1 changed file with 4 additions and 0 deletions
@@ -51,6 +51,10 @@ Here is a list of the various API providers and available distributions that can
 | PG Vector | Single Node | | | ✅ | | |
 | PyTorch ExecuTorch | On-device iOS | ✅ | ✅ | | | |
 | vLLM | Hosted and Single Node | | ✅ | | | |
+| OpenAI | Hosted | | ✅ | | | |
+| Anthropic | Hosted | | ✅ | | | |
+| Gemini | Hosted | | ✅ | | | |

### Distributions