Commit graph

4823 commits

Author SHA1 Message Date
Ankur Garha
d6ed13fa4f doc: updated langfuse ver 1.14 in pip install cmd 2023-12-18 22:56:08 +01:00
ishaan-jaff
bd15c61a65 (feat) OR default transforms=[] 2023-12-18 10:59:09 +05:30
Ishaan Jaff
6f97855925
Merge pull request #1174 from ericmjl/ericmjl-patch-1
Update mistral.md
2023-12-18 10:42:46 +05:30
Eric Ma
8ec7e25c06
Update mistral.md
Eliminated a typo (an extra quotation mark).
2023-12-17 23:51:02 -05:00
ishaan-jaff
3a97a2817f (fix) default args batch completion 2023-12-18 10:05:44 +05:30
ishaan-jaff
6a0c853ae4 (feat) add open router transforms, models, route 2023-12-18 09:55:35 +05:30
ishaan-jaff
1e57c0c152 (feat) completion set function, function_call default None 2023-12-18 09:54:43 +05:30
ishaan-jaff
6b272076d7 (feat) openrouter set transforms=[] default 2023-12-18 09:16:33 +05:30
ishaan-jaff
b15682bc1f (feat) set default openrouter configs 2023-12-18 08:55:51 +05:30
Joel Eriksson
e214e6ab47 Fix bug when iterating over lines in ollama response
async for line in resp.content.iter_any() will return
incomplete lines when the lines are long, and that
results in an exception being thrown by json.loads()
when it tries to parse the incomplete JSON.

The default behavior of the stream reader for aiohttp
response objects is to iterate over lines, so just
removing .iter_any() fixes the bug.
2023-12-17 20:23:26 +02:00
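
For illustration, here is a minimal sketch of the pattern this commit describes, assuming aiohttp and a hypothetical stream_ollama() helper (the URL, payload and function name are illustrative, not the actual LiteLLM code):

import json
import aiohttp

# Sketch only: resp.content.iter_any() yields whatever bytes have arrived, so a
# long JSON line can be split across chunks and json.loads() fails on the
# fragment. Iterating resp.content directly yields complete lines instead.
async def stream_ollama(url, payload):
    async with aiohttp.ClientSession() as session:
        async with session.post(url, json=payload) as resp:
            # buggy variant, shown for contrast:
            # async for chunk in resp.content.iter_any():
            #     data = json.loads(chunk)  # may raise on a partial line
            async for line in resp.content:  # yields one complete line at a time
                if not line.strip():
                    continue
                yield json.loads(line)  # each line is a full JSON object
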
Joel Eriksson
a419d59542 Fix for an issue that occurred when proxying to ollama
In the text_completion() function, it previously threw an exception at:
raw_response = response._hidden_params.get("original_response", None)

This was due to response being a coroutine object from an ollama_acompletion
call, so I added an asyncio.iscoroutine() check for the response and handled
it by calling response = asyncio.run(response)

I also had to fix atext_completion(), where init_response was an instance
of TextCompletionResponse.

Since this case was not handled by the if-elif that checks whether init_response
is a coroutine, a dict or a ModelResponse instance, response was left unbound,
which threw an exception on the "return response" line.

Note that a regular pyright-based linter detects that response is possibly
unbound, and that the same code pattern is used in multiple other places
in main.py.

I would suggest that you change these cases:

init_response = await loop.run_in_executor(...
if isinstance(init_response, ...
    response = init_response
elif asyncio.iscoroutine(init_response):
    response = await init_response

To either just:

response = await loop.run_in_executor(...
if asyncio.iscoroutine(response):
    response = await response

Or at the very least, include an else statement and set response = init_response,
so that response is never unbound when the code proceeds.
2023-12-17 17:27:47 +02:00
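
As a rough illustration of the guard described above, here is a sketch with a hypothetical make_call standing in for the provider call; it is not the actual main.py code:

import asyncio

# Sketch only: `response` is assigned on every branch, so it can never be
# unbound when execution reaches the return statement.
async def resolve_response(make_call):
    loop = asyncio.get_running_loop()
    init_response = await loop.run_in_executor(None, make_call)
    if asyncio.iscoroutine(init_response):
        # calling an async provider (e.g. ollama acompletion) from the executor
        # hands back a coroutine that still has to be awaited
        response = await init_response
    else:
        # dicts, ModelResponse, TextCompletionResponse and other plain objects
        response = init_response
    return response
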
Krish Dholakia
c703fb2f2c
Merge pull request #1162 from nirga/patch-2
clarify the need to set an exporter
2023-12-16 21:31:56 -08:00
Krish Dholakia
f0093fae45
Merge pull request #1163 from ku-suke/patch-1
(fix) curl example on proxy/quick_start
2023-12-16 21:31:02 -08:00
Krrish Dholakia
a3c7a340a5 fix(ollama.py): fix sync ollama streaming 2023-12-16 21:23:21 -08:00
Krrish Dholakia
13d088b72e feat(main.py): add support for image generation endpoint 2023-12-16 21:07:29 -08:00
Krrish Dholakia
7847ae1e23 fix(traceloop.py): add additional openllmetry traces 2023-12-16 19:21:39 -08:00
Yusuke Kawabata
de8a3d5369
fix curl example
Remove unnecessary `,` from curl example
2023-12-17 12:00:06 +09:00
Krrish Dholakia
7c2fad2d57 fix(azure.py): fix azure streaming logging 2023-12-16 18:06:08 -08:00
Krrish Dholakia
3923c389fd build(Dockerfile): fixing build requirements 2023-12-16 17:52:30 -08:00
Krrish Dholakia
50b741f8fa fix(Dockerfile): support mac 2023-12-16 16:01:02 -08:00
Nir Gazit
4dc774ab8e
clarify the need to set an exporter 2023-12-16 22:19:07 +01:00
Krish Dholakia
9660f0e0b1
Merge pull request #1076 from Manouchehri/public-fix-1
Use current Git folder for building Dockerfile
2023-12-16 12:28:09 -08:00
Krish Dholakia
47ba8082df
Merge branch 'main' into public-fix-1 2023-12-16 12:27:58 -08:00
Krrish Dholakia
3291de9e11 fix(proxy_server.py): set up dependencies on server startup 2023-12-16 11:56:11 -08:00
Krrish Dholakia
4e828ff541 fix(health.md): add background health check details to docs 2023-12-16 10:31:59 -08:00
ishaan-jaff
abd7e48dee (ci/cd) run again 2023-12-16 22:34:10 +05:30
Krish Dholakia
50f61bcea4
Merge pull request #1157 from nanowell/patch-1
Update mistral.md
2023-12-16 08:56:39 -08:00
ishaan-jaff
c33cbbe068 (docs) - gemini-pro-vision 2023-12-16 22:18:30 +05:30
ishaan-jaff
5ee6b87f2e (fix) vertexai - gemini 2023-12-16 22:15:41 +05:30
nanowell
9f762bbd5b
Update mistral.md
Incorrect syntax
2023-12-16 17:51:41 +03:00
ishaan-jaff
8522bb60f3 bump: version 1.15.0 → 1.15.1 2023-12-16 19:23:03 +05:30
ishaan-jaff
cb34c3c3f3 (docs) gemini pro vision 2023-12-16 19:20:57 +05:30
ishaan-jaff
a5fce3b2de (test) gemini vision 2023-12-16 19:16:32 +05:30
ishaan-jaff
6f643a6107 (docs) add gemini-pro-vision 2023-12-16 19:07:36 +05:30
ishaan-jaff
4af13e44df (test) vertex ai: stop running 4 requests / test 2023-12-16 19:01:12 +05:30
ishaan-jaff
e527137bee (test) gemini-pro-vision 2023-12-16 18:58:31 +05:30
ishaan-jaff
764f31c970 (feat) add async, async+stream for gemini 2023-12-16 18:58:12 +05:30
ishaan-jaff
efe8b75200 (fix) use litellm.vertex_vision_models 2023-12-16 18:39:40 +05:30
ishaan-jaff
f3ebfb0517 (test) gemini vision test 2023-12-16 18:38:36 +05:30
ishaan-jaff
0bf29a14e8 init vertex_vision_models 2023-12-16 18:37:00 +05:30
ishaan-jaff
55e2beeaf1 (feat) add gemini pro vision 2023-12-16 18:35:28 +05:30
ishaan-jaff
db188507b9 (test) gemini pro vision 2023-12-16 18:31:55 +05:30
ishaan-jaff
774a725ccb (feat) add vertex ai gemini-pro-vision 2023-12-16 18:31:03 +05:30
ishaan-jaff
7b851a3870 (docs) cache params 2023-12-16 14:45:19 +05:30
ishaan-jaff
ed0b5d29b0 (test) proxy - cache config 2023-12-16 14:45:06 +05:30
ishaan-jaff
6b7d0eada4 (feat) proxy - set cache configs on proxy 2023-12-16 14:44:39 +05:30
ishaan-jaff
80bf99b1af (docs) proxy - advanced caching 2023-12-16 13:52:58 +05:30
ishaan-jaff
a04f43ef38 (docs) proxy: embeddings 2023-12-16 13:29:26 +05:30
ishaan-jaff
cc483ad69d (docs) embedding config yaml 2023-12-16 13:14:26 +05:30
ishaan-jaff
20b5505476 (feat) show POST request for HF embeddings 2023-12-16 13:09:49 +05:30