Merge branch 'main' into add-watsonx-inference-adapter

Sajikumar JS 2025-04-07 16:07:43 +05:30
commit 2bf0ca67cb
2 changed files with 2 additions and 2 deletions


@@ -11,7 +11,7 @@
 ### ✨🎉 Llama 4 Support 🎉✨
-We release [Version 0.2.0](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.0) with support for the Llama 4 herd of models released by Meta.
+We released [Version 0.2.0](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.0) with support for the Llama 4 herd of models released by Meta.
 You can now run Llama 4 models on Llama Stack.


@@ -17,7 +17,7 @@ client = LlamaStackAsLibraryClient(
     # provider_data is optional, but if you need to pass in any provider specific data, you can do so here.
     provider_data={"tavily_search_api_key": os.environ["TAVILY_SEARCH_API_KEY"]},
 )
-await client.initialize()
+client.initialize()
 ```
 This will parse your config and set up any inline implementations and remote clients needed for your implementation.
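For context, the documented usage after this change boils down to the following pattern: construct the library client, then call `initialize()` synchronously (the diff above drops the `await`). This is only a minimal sketch; the `llama_stack.distribution.library_client` import path and the `"ollama"` template name are assumptions for illustration, while the `provider_data` argument and the synchronous `initialize()` call come from the snippet above.

```python
import os

# Assumed import path for the library client; adjust to your installed llama-stack version.
from llama_stack.distribution.library_client import LlamaStackAsLibraryClient

client = LlamaStackAsLibraryClient(
    "ollama",  # distribution template name; an assumption used here for illustration
    # provider_data is optional, but if you need to pass in any provider specific data, you can do so here.
    provider_data={"tavily_search_api_key": os.environ["TAVILY_SEARCH_API_KEY"]},
)

# Per this change, initialize() is called without await: it parses the config and
# sets up any inline implementations and remote clients the distribution needs.
client.initialize()
```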