llama-stack-mirror/llama_stack/providers
Sajikumar JS 6cf6791de1
fix: updated watsonx inference chat apis with new repo changes (#2033)
# What does this PR do?
Recent changes in the repo require some additional functions in the
inference provider; this PR adds them. It also adds an extra parameter
for passing additional arguments through to watsonx.ai.




---------

Co-authored-by: Sajikumar JS <sajikumar.js@ibm.com>
2025-04-26 10:17:52 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| inline | fix: meta ref inference (#2022) | 2025-04-24 13:03:35 -07:00 |
| registry | feat: Add watsonx inference adapter (#1895) | 2025-04-25 11:29:21 -07:00 |
| remote | fix: updated watsonx inference chat apis with new repo changes (#2033) | 2025-04-26 10:17:52 -07:00 |
| tests | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00 |
| utils | feat: new system prompt for llama4 (#2031) | 2025-04-25 11:29:08 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | feat: add health to all providers through providers endpoint (#1418) | 2025-04-14 11:59:36 +02:00 |