Lunik
a1be265052
🐛 fix: Ollama vision models call arguments (like: llava)
Signed-off-by: Lunik <lunik@tiwabbit.fr>
2024-02-26 17:52:55 +01:00
Krrish Dholakia
220a90527f
fix(ollama.py): support format for ollama
2024-02-06 10:11:52 -08:00
Ishaan Jaff
175c4000da
Merge pull request #1750 from vanpelt/patch-2
Re-raise exception in async ollama streaming
2024-02-05 08:12:17 -08:00
Krrish Dholakia
a2bb95be59
refactor(ollama.py): trigger rebuild
2024-02-03 20:23:43 -08:00
Krrish Dholakia
56110188fd
fix(ollama.py): fix api connection error
https://github.com/BerriAI/litellm/issues/1735
2024-02-03 20:22:33 -08:00
Chris Van Pelt
547b9beefc
Re-raise exception in async ollama streaming
2024-02-01 16:14:07 -08:00
Krrish Dholakia
635a34b543
fix(utils.py): fix streaming chunks to not return role, unless set
2024-02-01 09:55:56 -08:00
ishaan-jaff
3081dc525a
(feat) litellm.completion - support ollama timeout
2024-01-09 10:34:41 +05:30
Krrish Dholakia
d89a58ec54
fix(ollama.py): use tiktoken as backup for prompt token counting
2024-01-09 09:47:18 +05:30
Krrish Dholakia
79978c44ba
refactor: add black formatting
2023-12-25 14:11:20 +05:30
Krrish Dholakia
b7a7c3a4e5
feat(ollama.py): add support for async ollama embeddings
2023-12-23 18:01:25 +05:30
Krrish Dholakia
a65dfdde94
test(test_completion.py-+-test_streaming.py): add ollama endpoint to ci/cd pipeline
2023-12-22 12:21:33 +05:30
Krrish Dholakia
ae288c97fb
fix(ollama.py): use litellm.request timeout for async call timeout
2023-12-22 11:22:24 +05:30
Krrish Dholakia
636ac9b605
feat(ollama.py): add support for ollama function calling
2023-12-20 14:59:55 +05:30
ishaan-jaff
3c37e0d58b
(fix) proxy + ollama - raise exception correctly
2023-12-19 18:48:34 +05:30
Joel Eriksson
afcc83bb15
Fix bug when iterating over lines in ollama response
async for line in resp.content.iter_any() will return
incomplete lines when the lines are long, and that
results in an exception being thrown by json.loads()
when it tries to parse the incomplete JSON.
The default behavior of the stream reader for aiohttp
response objects is to iterate over lines, so just
removing .iter_any() fixes the bug.
2023-12-17 20:23:26 +02:00
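The failure mode in the commit above can be illustrated with a short, self-contained sketch. The `body` bytes and the `parse_lines` helper are hypothetical stand-ins, not litellm code: Ollama streams newline-delimited JSON, so a chunk cut mid-record (what `iter_any()` can yield) breaks `json.loads()`, while complete lines parse cleanly.

```python
import json

# Two streamed JSON records, one per line, as an Ollama response emits them.
# (Illustrative data, not captured output.)
body = b'{"response": "Hello"}\n{"response": " world", "done": true}\n'

def parse_lines(raw: bytes):
    """Parse complete newline-delimited JSON records."""
    return [json.loads(line) for line in raw.splitlines()]

# iter_any()-style chunking: the transport may split the body at any byte,
# so a chunk can end mid-record and json.loads() raises JSONDecodeError.
partial = body[:30]  # cuts into the middle of the second record
try:
    json.loads(partial)
    broken = False
except json.JSONDecodeError:
    broken = True

# Line-based iteration (aiohttp's default `async for line in resp.content`)
# only ever yields complete, newline-terminated records:
records = parse_lines(body)
```

This is why simply dropping `.iter_any()` fixes the bug: iterating `resp.content` directly uses aiohttp's line-oriented reader, which buffers until a full line arrives.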
Krrish Dholakia
5f4310f592
fix(ollama.py): fix sync ollama streaming
2023-12-16 21:23:21 -08:00
Krrish Dholakia
87df233a19
fix(health.md): add background health check details to docs
2023-12-16 10:31:59 -08:00
Krrish Dholakia
1da7d35218
feat(proxy_server.py): enable infinite retries on rate limited requests
2023-12-15 20:03:41 -08:00
Krrish Dholakia
3d6ade8f26
fix(ollama.py): fix ollama async streaming for /completions calls
2023-12-15 09:28:32 -08:00
Krish Dholakia
c230fa4cd7
Merge pull request #1122 from emsi/main
Fix #1119, no content when streaming.
2023-12-14 10:01:00 -08:00
Krrish Dholakia
2231601d5a
fix(ollama.py): fix async completion calls for ollama
2023-12-13 13:10:25 -08:00
Mariusz Woloszyn
3b643676d9
Fix #1119, no content when streaming.
2023-12-13 21:42:35 +01:00
Krrish Dholakia
e452aec9ad
fix(ollama.py): add support for async streaming
2023-12-12 16:44:20 -08:00
ishaan-jaff
eec316f3bb
(fix) tkinter import
2023-12-12 12:18:25 -08:00
Krrish Dholakia
b80a81b419
fix(ollama.py): enable parallel ollama completion calls
2023-12-11 23:18:37 -08:00
ishaan-jaff
d25d4d26bd
(feat) debug ollama POST request
2023-11-14 17:53:48 -08:00
Krrish Dholakia
753c722c9f
refactor(ai21,-aleph-alpha,-ollama): making ai21, aleph-alpha, ollama compatible with openai v1 sdk
2023-11-11 17:49:13 -08:00
ishaan-jaff
6e3654d309
(feat) completion ollama raise exception when ollama resp != 200
2023-11-10 08:54:05 -08:00
Krrish Dholakia
d0b23a2722
refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions
2023-11-04 12:50:15 -07:00
ishaan-jaff
960481a540
(feat) ollama raise Exceptions + use LiteLLM stream wrapper
2023-10-11 17:00:39 -07:00
Krrish Dholakia
37d7837b63
feat(ollama.py): exposing ollama config
2023-10-06 15:52:58 -07:00
Krrish Dholakia
694265798d
push cli tool
2023-09-26 13:30:47 -07:00
ishaan-jaff
ebce57dc2e
fix async import error
2023-09-21 11:16:50 -07:00
ishaan-jaff
6bfde2496c
conditional import async_generator
2023-09-21 11:09:57 -07:00
ishaan-jaff
9219fd90c7
support acompletion + stream for ollama
2023-09-21 10:39:48 -07:00
ishaan-jaff
b8538fea34
ollama upgrades, fix streaming, add non-streaming resp
2023-09-09 14:07:13 -07:00