Added Ollama as an inference impl (#20)

* fix non-streaming api in inference server

* unit test for inline inference

* Added non-streaming ollama inference impl

* add streaming support for ollama inference with tests

* addressing comments

---------

Co-authored-by: Hardik Shah <hjshah@fb.com>
Committed by Hardik Shah on 2024-07-31 22:08:37 -07:00 via GitHub
parent c253c1c9ad
commit 156bfa0e15
9 changed files with 921 additions and 33 deletions

@@ -12,6 +12,10 @@ async def get_inference_api_instance(config: InferenceConfig):
         from .inference import InferenceImpl
         return InferenceImpl(config.impl_config)
+    elif config.impl_config.impl_type == ImplType.ollama.value:
+        from .ollama import OllamaInference
+        return OllamaInference(config.impl_config)
     from .client import InferenceClient
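
The hunk above extends a factory that dispatches on `impl_type` to pick an inference backend. A minimal, self-contained sketch of that dispatch pattern (the class bodies and config shape here are hypothetical stand-ins, not the actual llama-stack implementations):

```python
from enum import Enum


class ImplType(Enum):
    # Hypothetical enum mirroring the impl types the factory switches on.
    inline = "inline"
    ollama = "ollama"


class InlineInference:
    # Stand-in for the in-process implementation.
    def __init__(self, config):
        self.config = config


class OllamaInference:
    # Stand-in for the Ollama-backed implementation added in this commit.
    def __init__(self, config):
        self.config = config


def get_inference_impl(impl_type: str, config: dict):
    # Dispatch on the impl_type value, as the diff's factory does;
    # each branch constructs the matching backend with its config.
    if impl_type == ImplType.inline.value:
        return InlineInference(config)
    elif impl_type == ImplType.ollama.value:
        return OllamaInference(config)
    raise ValueError(f"unknown impl_type: {impl_type}")
```

Keeping the backend imports inside the branches, as the real factory does, means a backend's dependencies are only loaded when that backend is selected.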