LiteLLM mirror
Topics: ai-gateway, anthropic, azure-openai, bedrock, gateway, langchain, llm, llm-gateway, llmops, openai, openai-proxy, vertex-ai
Updated 2025-04-24 10:14:52 +00:00
This project demonstrates how to build an LLM pipeline that calls IBM watsonx.ai LLMs hosted on-prem, and to expose it through a FastAPI server that can be deployed as a backend API (on IBM Code Engine, any other container execution engine, or a VM). An OpenAPI spec can then be generated from the server and used to create a skill/skill flow in watsonx Orchestrate that performs specific tasks: in this case, processing claims by extracting named entities, creating a summary, and recommending next actions.
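The claims-processing pipeline described above might be sketched roughly as follows. This is an illustrative assumption, not code from the project: the function names and prompts are hypothetical, and the watsonx.ai call is stubbed out. In the real setup, each task would sit behind a FastAPI endpoint whose generated OpenAPI spec is imported into watsonx Orchestrate as a skill.

```python
# Hypothetical sketch of the claims-processing pipeline (names and
# prompts are illustrative; the watsonx.ai call is a stub).

def call_llm(prompt: str) -> str:
    """Stand-in for an on-prem IBM watsonx.ai inference call."""
    # A real implementation would invoke the watsonx.ai text-generation API here.
    return f"LLM output for: {prompt.splitlines()[0]}"

def process_claim(claim_text: str) -> dict:
    """Run the three LLM tasks the description mentions on one claim."""
    return {
        "entities": call_llm("Extract named entities:\n" + claim_text),
        "summary": call_llm("Summarize this claim:\n" + claim_text),
        "next_actions": call_llm("Recommend next actions:\n" + claim_text),
    }

result = process_claim("Policyholder reports water damage on 2024-03-02.")
print(sorted(result.keys()))
```

In the deployed service, `process_claim` would be wrapped in a FastAPI route (e.g. a POST endpoint taking the claim text), and FastAPI's automatically generated `/openapi.json` would supply the spec that Orchestrate consumes.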
Updated 2025-01-31 15:01:39 +00:00