[inference] Add a TGI adapter (#52)

* Add a TGI adapter, with some light refactoring of the other inference adapters

* Use the lower-level `generate_stream()` method so that tool calling is handled correctly (see the sketch below)
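
A minimal, hypothetical sketch (not part of this commit) of what driving TGI through the lower-level streaming call looks like with the `text-generation` client; the endpoint URL, prompt, and generation parameters are illustrative assumptions, not values taken from the adapter.

```python
# Sketch only: use the huggingface `text_generation` client's lower-level
# generate_stream() so special/tool-call tokens remain visible, instead of a
# higher-level chat helper that may strip them.
from text_generation import Client

# Assumed local TGI endpoint; adjust host/port for your deployment.
client = Client("http://localhost:8080")

prompt = "..."  # fully formatted prompt, including any tool-definition headers

for chunk in client.generate_stream(prompt, max_new_tokens=512):
    token = chunk.token
    # Special tokens (e.g. end-of-turn or tool-call markers) are reported
    # explicitly here, which is what makes reliable tool-call detection possible.
    if token.special:
        print(f"[special token: {token.text!r}]")
    else:
        print(token.text, end="", flush=True)
```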

---------

Co-authored-by: Ashwin Bharambe <ashwin@meta.com>
commit 21bedc1596 (parent 6ad7365676)
Author: Ashwin Bharambe <ashwin@meta.com>, 2024-09-04 22:49:33 -07:00, committed by GitHub
3 changed files with 256 additions and 0 deletions

@@ -35,6 +35,14 @@ def available_inference_providers() -> List[ProviderSpec]:
                 module="llama_toolchain.inference.adapters.ollama",
             ),
         ),
+        remote_provider_spec(
+            api=Api.inference,
+            adapter=AdapterSpec(
+                adapter_id="tgi",
+                pip_packages=["text-generation"],
+                module="llama_toolchain.inference.adapters.tgi",
+            ),
+        ),
         remote_provider_spec(
             api=Api.inference,
             adapter=AdapterSpec(
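
As a rough usage sketch, the new spec should now appear in the provider list. The import path and the `spec.adapter.adapter_id` attribute shape below are assumptions about the surrounding code, not something this diff shows.

```python
# Hypothetical check that the TGI adapter is registered.
# Assumes available_inference_providers() lives in llama_toolchain.inference.providers
# and that remote provider specs expose their AdapterSpec as `spec.adapter`.
from llama_toolchain.inference.providers import available_inference_providers

tgi_specs = [
    spec
    for spec in available_inference_providers()
    if getattr(getattr(spec, "adapter", None), "adapter_id", None) == "tgi"
]
assert tgi_specs, "TGI adapter not registered"
print(tgi_specs[0])  # should reference llama_toolchain.inference.adapters.tgi
```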