[inference] Add a TGI adapter (#52)

* Add a TGI adapter and refactor the other inference adapters

* Use the lower-level `generate_stream()` method so tool calling works correctly (see the sketch below)
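
As a rough illustration of why token-level streaming matters here, the sketch below is not the adapter code from this PR; it assumes the official `text_generation` Python client, whose `Client.generate_stream()` yields one response per generated token, and uses a placeholder endpoint URL, prompt, and parameters.

# Hedged sketch (not the adapter code itself): stream raw tokens from a TGI
# server and accumulate them, so any tool-call payload the model emits can be
# parsed from the full generated text rather than a reformatted chat response.
# The URL, prompt, and max_new_tokens below are placeholders.
from text_generation import Client

client = Client("http://localhost:8080")

pieces = []
for resp in client.generate_stream("What is the weather in SF?", max_new_tokens=256):
    if not resp.token.special:  # skip special tokens such as EOS
        pieces.append(resp.token.text)

completion = "".join(pieces)
# `completion` now holds the exact generated text, ready for tool-call parsing.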

---------

Co-authored-by: Ashwin Bharambe <ashwin@meta.com>
Ashwin Bharambe authored 2024-09-04 22:49:33 -07:00 (committed by GitHub)
parent 6ad7365676
commit 21bedc1596
3 changed files with 256 additions and 0 deletions


@@ -0,0 +1,15 @@
# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.

from llama_toolchain.core.datatypes import RemoteProviderConfig


async def get_adapter_impl(config: RemoteProviderConfig, _deps):
    from .tgi import TGIInferenceAdapter

    impl = TGIInferenceAdapter(config.url)
    await impl.initialize()
    return impl
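
For context, a minimal usage sketch of the new entry point follows. It is not part of the diff: it presumes RemoteProviderConfig can be constructed from just a `url` (the only field the code above reads) and that a TGI server is listening at the placeholder address.

# Hedged usage sketch (not part of this commit). Assumes RemoteProviderConfig
# accepts a `url` keyword and that a TGI server is running at the placeholder
# address; `get_adapter_impl` is the function added in this file.
import asyncio

from llama_toolchain.core.datatypes import RemoteProviderConfig


async def main():
    config = RemoteProviderConfig(url="http://localhost:8080")  # placeholder endpoint
    impl = await get_adapter_impl(config, {})  # no extra deps needed here
    # `impl` is an initialized TGIInferenceAdapter, ready to serve inference calls.


asyncio.run(main())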