forked from phoenix-oss/llama-stack-mirror
Make TGI adapter compatible with HF Inference API (#97)
This commit is contained in:
parent 851c30597a
commit 615ed4bfbc
7 changed files with 122 additions and 96 deletions
@@ -1,6 +1,6 @@
 name: local-tgi
 distribution_spec:
-  description: Like local, but use a TGI server for running LLM inference.
+  description: Use TGI (local or with Hugging Face Inference Endpoints for running LLM inference. When using HF Inference Endpoints, you must provide the name of the endpoint).
   providers:
     inference: remote::tgi
     memory: meta-reference
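The updated description implies the remote::tgi provider can target either a locally running TGI server or a Hugging Face Inference Endpoint. A minimal sketch of what a corresponding provider configuration might look like (the field names and URL below are illustrative assumptions, not taken from this diff):

```yaml
# Hypothetical run-config fragment (field names and values assumed, not from this commit):
inference:
  provider_type: remote::tgi
  config:
    # For a local TGI server, point at its base URL.
    # For HF Inference Endpoints, this would instead reference the named endpoint.
    url: http://127.0.0.1:8080
```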