From b906bad23803bf947573e6dcc6557b9cb95625ce Mon Sep 17 00:00:00 2001
From: Yuan Tang
Date: Thu, 13 Mar 2025 18:28:52 -0400
Subject: [PATCH] docs: Add OpenAI, Anthropic, Gemini to inference API
 providers table (#1622)

# What does this PR do?

Forgot to update this page as well as part of
https://github.com/meta-llama/llama-stack/pull/1617.

Signed-off-by: Yuan Tang
---
 docs/source/index.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/source/index.md b/docs/source/index.md
index 0d0508466..af381f9c9 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -61,6 +61,10 @@ A number of "adapters" are available for some popular Inference and Vector Store
 | Groq | Hosted |
 | SambaNova | Hosted |
 | PyTorch ExecuTorch | On-device iOS, Android |
+| OpenAI | Hosted |
+| Anthropic | Hosted |
+| Gemini | Hosted |
+
 **Vector IO API**

 | **Provider** | **Environments** |