mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-07-23 04:53:14 +00:00
Added Ollama as an inference impl (#20)
* fix non-streaming api in inference server
* unit test for inline inference
* Added non-streaming ollama inference impl
* add streaming support for ollama inference with tests
* addressing comments

Co-authored-by: Hardik Shah <hjshah@fb.com>
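The streaming support described above boils down to consuming incremental response chunks and joining their text deltas. Below is a minimal, hypothetical sketch of that consumption pattern — it assumes the Ollama Python client's chunk shape (dicts with a `"message"` → `"content"` field); the `accumulate_stream` helper is illustrative and not taken from this PR.

```python
from typing import Any, Dict, Iterable


def accumulate_stream(chunks: Iterable[Dict[str, Any]]) -> str:
    """Join the incremental text deltas from a streaming chat response.

    Hypothetical helper: assumes each chunk looks like
    {"message": {"content": "..."}}, the shape returned by the
    Ollama Python client when stream=True.
    """
    parts = []
    for chunk in chunks:
        parts.append(chunk.get("message", {}).get("content", ""))
    return "".join(parts)


# Against a live Ollama server this would be driven by something like:
#   import ollama
#   stream = ollama.chat(model="llama2", messages=[...], stream=True)
#   text = accumulate_stream(stream)
# Here we exercise it with a fabricated stream instead:
fake_stream = [
    {"message": {"content": "Hello"}},
    {"message": {"content": ", "}},
    {"message": {"content": "world"}},
]
print(accumulate_stream(fake_stream))  # -> Hello, world
```

The same pattern generalizes to the non-streaming path, where the full response arrives as a single chunk.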
parent: c253c1c9ad
commit: 156bfa0e15
9 changed files with 921 additions and 33 deletions
```diff
@@ -13,6 +13,7 @@ hydra-zen
 json-strong-typing
 llama-models
 matplotlib
+ollama
 omegaconf
 pandas
 Pillow
```