Author | Commit | Message | Date
Krrish Dholakia | 8c7d62e62d | fix(utils.py): fix non_default_param pop error for ollama | 2023-12-21 06:59:13 +05:30
Krrish Dholakia | 77b11daf28 | fix(utils.py): add support for anyscale function calling | 2023-12-20 17:48:33 +05:30
Krrish Dholakia | 23d0278739 | feat(azure.py): add support for azure image generations endpoint | 2023-12-20 16:37:21 +05:30
Krrish Dholakia | 636ac9b605 | feat(ollama.py): add support for ollama function calling | 2023-12-20 14:59:55 +05:30
Krrish Dholakia | b0300392b9 | fix(utils.py): vertex ai exception mapping | 2023-12-19 15:25:29 +00:00
Krrish Dholakia | 40a9d62de9 | fix(ollama.py): raise async errors | 2023-12-19 15:01:12 +00:00
ishaan-jaff | 97df44396b | (feat) add open router transforms, models, route | 2023-12-18 09:55:35 +05:30
ishaan-jaff | d3c1c4bf28 | (feat) set default openrouter configs | 2023-12-18 08:55:51 +05:30
Krrish Dholakia | 51cb16a015 | feat(main.py): add support for image generation endpoint | 2023-12-16 21:07:29 -08:00
Krrish Dholakia | e62327dd92 | fix(traceloop.py): add additional openllmetry traces | 2023-12-16 19:21:39 -08:00
Krrish Dholakia | 1da7d35218 | feat(proxy_server.py): enable infinite retries on rate limited requests | 2023-12-15 20:03:41 -08:00
Krrish Dholakia | 3d6ade8f26 | fix(ollama.py): fix ollama async streaming for /completions calls | 2023-12-15 09:28:32 -08:00
ishaan-jaff | 04825115df | (feat) proxy - use async langfuse logger | 2023-12-15 21:57:12 +05:30
ishaan-jaff | 7ce2db8189 | (fix) make dynamo logger async for proxy | 2023-12-15 18:52:09 +05:30
ishaan-jaff | 30e7b893a3 | (fix) async + stream + sync logging | 2023-12-15 18:31:36 +05:30
ishaan-jaff | 161e8c3bd9 | (fix) async + stream logger - building complete resp | 2023-12-15 18:13:29 +05:30
ishaan-jaff | 6ff5833839 | (feat) dynamo db - log call_type | 2023-12-15 17:27:48 +05:30
ishaan-jaff | 1ca53ef2da | (fix) dynamo.py spelling | 2023-12-15 16:07:42 +05:30
ishaan-jaff | e69ca7dbf4 | (utils) add dynamoDB logger | 2023-12-15 15:36:00 +05:30
ishaan-jaff | a52c984b3f | (fix) raise openai.NotFoundError | 2023-12-15 14:03:50 +05:30
ishaan-jaff | f828be7ed7 | (feat) add openai.NotFoundError exception mapping | 2023-12-15 13:33:03 +05:30
ishaan-jaff | 0530d16595 | (feat) add openai.NotFoundError | 2023-12-15 10:18:02 +05:30
ishaan-jaff | 218d975a7c | (feat) add BadRequestError for Azure | 2023-12-15 09:53:56 +05:30
ishaan-jaff | 972b5b29b7 | (fix) utils - delete remove_model_id | 2023-12-15 07:07:53 +05:30
Krrish Dholakia | afe38e4a9c | fix(utils.py): improved togetherai exception mapping | 2023-12-14 15:28:11 -08:00
Krrish Dholakia | e3ec848c10 | test(test_optional_params.py): unit tests for get_optional_params_embeddings() | 2023-12-14 14:32:36 -08:00
Krrish Dholakia | bb5b883316 | fix(main.py): support async streaming for text completions endpoint | 2023-12-14 13:56:32 -08:00
ishaan-jaff | 3e1335b177 | (feat) control caching for embedding, completion | 2023-12-14 22:31:04 +05:30
ishaan-jaff | abfc607fb0 | (test) add ollama to ci/cd | 2023-12-14 19:42:44 +05:30
ishaan-jaff | 95454e5176 | (feat) mistral - add exception mapping | 2023-12-14 18:57:39 +05:30
ishaan-jaff | 5d24e4241f | (feat) mistral - add random_seed, safe_mode params | 2023-12-14 18:42:00 +05:30
ishaan-jaff | 303d9aa286 | (feat) add mistral api | 2023-12-14 18:17:48 +05:30
ishaan-jaff | 25efd43551 | (feat) use async_cache for acompletion/aembedding | 2023-12-14 16:04:45 +05:30
Krrish Dholakia | 3bdafd8cfd | fix(utils.py): support cache logging for async router calls | 2023-12-13 19:11:43 -08:00
Krrish Dholakia | 853508e8c0 | fix(utils.py): support caching for embedding + log cache hits | 2023-12-13 18:37:30 -08:00
Krrish Dholakia | c673a23769 | fix(vertex_ai.py): add exception mapping for acompletion calls | 2023-12-13 16:35:50 -08:00
Krrish Dholakia | 2231601d5a | fix(ollama.py): fix async completion calls for ollama | 2023-12-13 13:10:25 -08:00
Krrish Dholakia | 13a7f2dd58 | fix(vertex_ai.py): add support for real async streaming + completion calls | 2023-12-13 11:53:55 -08:00
Krrish Dholakia | 43b160d70d | feat(vertex_ai.py): add support for gemini-pro on vertex ai | 2023-12-13 10:26:30 -08:00
Krrish Dholakia | 56729cc1be | fix(utils.py): fix stream chunk builder for sync/async success | 2023-12-13 07:52:51 -08:00
Krrish Dholakia | 1a2eab103a | fix(langfuse.py): serialize message for logging | 2023-12-12 21:41:05 -08:00
Krrish Dholakia | d0e01d7e7a | fix(utils.py): flush holding chunk for streaming, on stream end | 2023-12-12 16:13:31 -08:00
Krrish Dholakia | e396fcb55c | fix(main.py): pass user_id + encoding_format for logging + to openai/azure | 2023-12-12 15:46:44 -08:00
Krrish Dholakia | 6be1e0cf6f | fix(utils.py): fix logging | 2023-12-12 15:46:36 -08:00
Krrish Dholakia | 9bbf1258f8 | fix(utils.py): more logging | 2023-12-12 15:46:27 -08:00
Krrish Dholakia | 93c7393ae8 | fix(utils.py): add more logging | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | 0f17a65b01 | fix(utils.py): safe fail complete streaming response | 2023-12-12 15:46:12 -08:00
Krrish Dholakia | 0a3320ed6b | fix(utils.py): add more logging | 2023-12-12 15:46:00 -08:00
Krrish Dholakia | d059d1b101 | fix(sagemaker.py): debug streaming | 2023-12-12 15:45:07 -08:00
ishaan-jaff | 4dd5922d80 | (fix) assert streaming response = StreamChoices() | 2023-12-12 00:12:57 -08:00