Krrish Dholakia
e990c70beb
fix(ollama.py): fix returned error message for streaming error
2024-01-08 23:52:57 +05:30
Krrish Dholakia
4905929de3
refactor: add black formatting
2023-12-25 14:11:20 +05:30
Krrish Dholakia
eaaad79823
feat(ollama.py): add support for async ollama embeddings
2023-12-23 18:01:25 +05:30
Krrish Dholakia
eb2d13e2fb
test(test_completion.py-+-test_streaming.py): add ollama endpoint to ci/cd pipeline
2023-12-22 12:21:33 +05:30
Krrish Dholakia
57607f111a
fix(ollama.py): use litellm.request timeout for async call timeout
2023-12-22 11:22:24 +05:30
Krrish Dholakia
f0df28362a
feat(ollama.py): add support for ollama function calling
2023-12-20 14:59:55 +05:30
ishaan-jaff
9995229b97
(fix) proxy + ollama - raise exception correctly
2023-12-19 18:48:34 +05:30
Joel Eriksson
e214e6ab47
Fix bug when iterating over lines in ollama response
async for line in resp.content.iter_any() will return
incomplete lines when the lines are long, and that
results in an exception being thrown by json.loads()
when it tries to parse the incomplete JSON.
The default behavior of the stream reader for aiohttp
response objects is to iterate over lines, so just
removing .iter_any() fixes the bug.
2023-12-17 20:23:26 +02:00
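The bug described in that commit body can be illustrated without aiohttp at all. The sketch below uses a hypothetical Ollama-style newline-delimited JSON payload (the field names are illustrative, not taken from the actual response): an `iter_any()`-style arbitrary chunk split leaves `json.loads()` with incomplete JSON, while line-based iteration only ever yields complete lines.

```python
import json

# Hypothetical streaming body: two complete JSON objects, one per line.
body = b'{"response": "Hello", "done": false}\n{"response": "!", "done": true}\n'

# iter_any()-style chunking can split a line mid-object; the split point
# below is arbitrary, just as network delivery boundaries are.
partial = body[:20]  # b'{"response": "Hello"' -- incomplete JSON
try:
    json.loads(partial)
    crashed = False
except json.JSONDecodeError:
    crashed = True  # the exception the commit describes

# Line-based iteration (aiohttp's default when iterating the content
# stream) only yields complete lines, so every parse succeeds.
parsed = [json.loads(line) for line in body.splitlines()]
```

Here `crashed` ends up `True` and `parsed` holds both objects, which is why dropping `.iter_any()` in favor of default line iteration fixes the crash.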
Krrish Dholakia
a3c7a340a5
fix(ollama.py): fix sync ollama streaming
2023-12-16 21:23:21 -08:00
Krrish Dholakia
4e828ff541
fix(health.md): add background health check details to docs
2023-12-16 10:31:59 -08:00
Krrish Dholakia
4791dda66f
feat(proxy_server.py): enable infinite retries on rate limited requests
2023-12-15 20:03:41 -08:00
Krrish Dholakia
cab870f73a
fix(ollama.py): fix ollama async streaming for /completions calls
2023-12-15 09:28:32 -08:00
Krish Dholakia
a6e78497b5
Merge pull request #1122 from emsi/main
Fix #1119, no content when streaming.
2023-12-14 10:01:00 -08:00
Krrish Dholakia
7b8851cce5
fix(ollama.py): fix async completion calls for ollama
2023-12-13 13:10:25 -08:00
Mariusz Woloszyn
1feb6317f6
Fix #1119, no content when streaming.
2023-12-13 21:42:35 +01:00
Krrish Dholakia
8e7116635f
fix(ollama.py): add support for async streaming
2023-12-12 16:44:20 -08:00
ishaan-jaff
99b48eff17
(fix) tkinter import
2023-12-12 12:18:25 -08:00
Krrish Dholakia
2c1c75fdf0
fix(ollama.py): enable parallel ollama completion calls
2023-12-11 23:18:37 -08:00
ishaan-jaff
e82b8ed7e2
(feat) debug ollama POST request
2023-11-14 17:53:48 -08:00
Krrish Dholakia
ae35c13015
refactor(ai21,-aleph-alpha,-ollama): making ai21, aleph-alpha, ollama compatible with openai v1 sdk
2023-11-11 17:49:13 -08:00
ishaan-jaff
2f07460333
(feat) completion ollama raise exception when ollama resp != 200
2023-11-10 08:54:05 -08:00
Krrish Dholakia
6b40546e59
refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions
2023-11-04 12:50:15 -07:00
ishaan-jaff
7b3ee8d129
(feat) ollama raise Exceptions + use LiteLLM stream wrapper
2023-10-11 17:00:39 -07:00
Krrish Dholakia
306a38880d
feat(ollama.py): exposing ollama config
2023-10-06 15:52:58 -07:00
Krrish Dholakia
a72880925c
push cli tool
2023-09-26 13:30:47 -07:00
ishaan-jaff
2b9e3434ff
fix async import error
2023-09-21 11:16:50 -07:00
ishaan-jaff
ac90c5286f
conditional import async_generator
2023-09-21 11:09:57 -07:00
ishaan-jaff
35bb6f5a50
support acompletion + stream for ollama
2023-09-21 10:39:48 -07:00
ishaan-jaff
56bd8c1c52
ollama upgrades, fix streaming, add non-streaming resp
2023-09-09 14:07:13 -07:00