3e508ea257 | ishaan-jaff | (test) text_completion responses | 2023-11-03 22:14:36 -07:00
cc9f17a1a8 | ishaan-jaff | (feat) add TextCompletionResponse | 2023-11-03 22:14:07 -07:00
d4430fc51e | ishaan-jaff | (feat) text completion response now OpenAI Object | 2023-11-03 22:13:52 -07:00
4c895d2e91 | Krrish Dholakia | test(test_completion.py): cleanup | 2023-11-03 18:31:41 -07:00
1c4dd0671b | Krrish Dholakia | fix(bedrock.py): add exception mapping coverage for authentication scenarios | 2023-11-03 18:25:34 -07:00
142750adff | Krrish Dholakia | fix(bedrock.py): fix bedrock exception mapping | 2023-11-03 18:14:12 -07:00
49650af444 | ishaan-jaff | (fix) bedrock: remove check for - if "error" in outputText | 2023-11-03 18:06:23 -07:00
34751d8562 | ishaan-jaff | (test) add text_completion with prompt list | 2023-11-03 18:03:19 -07:00
df57e9247a | ishaan-jaff | (fix) hf calculating usage non blocking | 2023-11-03 18:03:19 -07:00
4e1885734a | Krrish Dholakia | refactor(proxy_server.py): print statement showing how to add debug for logs | 2023-11-03 17:41:14 -07:00
6c4816e214 | ishaan-jaff | (fix) remove print statements | 2023-11-03 16:45:28 -07:00
f3dc06da04 | ishaan-jaff | (fix) temp_top_logprobs | 2023-11-03 16:45:10 -07:00
e29b2e8ce4 | ishaan-jaff | (fix) testing text_completion | 2023-11-03 16:39:06 -07:00
382e31d7b7 | ishaan-jaff | (test) text_completion requests | 2023-11-03 16:36:38 -07:00
b8d0a0fbd1 | ishaan-jaff | (test) cleanup remove text_completion testing from completion() | 2023-11-03 16:36:38 -07:00
0fa7c1ec3a | ishaan-jaff | (feat) text_com support batches for non openai llms | 2023-11-03 16:36:38 -07:00
b45d438e63 | ishaan-jaff | (fix) proxy server remove errant print | 2023-11-03 16:36:38 -07:00
7ed8f8dac8 | Krrish Dholakia | fix(proxy_server.py): fix linting issues | 2023-11-03 13:44:35 -07:00
af674701af | ishaan-jaff | (test) add palm streaming | 2023-11-03 13:09:39 -07:00
6fc0c74878 | ishaan-jaff | (fix) remove errant print statements | 2023-11-03 13:02:52 -07:00
89e32db321 | ishaan-jaff | (fix) remove errant tg ai print statements | 2023-11-03 12:59:23 -07:00
43e69a7aa8 | ishaan-jaff | (test) vertex ai | 2023-11-03 12:54:36 -07:00
6c82abf5bf | ishaan-jaff | (fix) vertex ai streaming | 2023-11-03 12:54:36 -07:00
6b3671b593 | Krrish Dholakia | fix(proxy_server.py): accept config.yaml | 2023-11-03 12:50:52 -07:00
539bdae364 | ishaan-jaff | (fix) remove errant print statements | 2023-11-03 08:20:14 -07:00
8b389e9e8a | ishaan-jaff | (fix) proxy correctly handle reading data using ast, fallback to json.loads if ast parse fails | 2023-11-02 21:14:08 -07:00
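The "reading data using ast, fallback to json.loads" fix in commit 8b389e9e8a names a parsing strategy that can be sketched as follows. This is a minimal illustration of the technique described in the commit message, not litellm's actual code; the function name `parse_request_body` is hypothetical.

```python
import ast
import json


def parse_request_body(raw: str) -> dict:
    # Hypothetical helper illustrating the ast-first parsing strategy.
    # ast.literal_eval accepts Python-literal bodies (single quotes,
    # True/False/None); JSON bodies containing true/false/null make it
    # raise, so we fall back to json.loads for standard JSON.
    try:
        return ast.literal_eval(raw)
    except (ValueError, SyntaxError):
        return json.loads(raw)
```

This ordering lets the proxy accept both Python-repr-style bodies and standard JSON without inspecting the payload first.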
4d82c81531 | ishaan-jaff | (fix) proxy cli tests | 2023-11-02 21:14:08 -07:00
104239dbe7 | ishaan-jaff | (fix) proxy server - verbose = False always | 2023-11-02 21:14:08 -07:00
e3a1c58dd9 | Krrish Dholakia | build(litellm_server/utils.py): add support for general settings + num retries as a module variable | 2023-11-02 20:56:41 -07:00
3f1b4c0759 | ishaan-jaff | (fix) linting fix | 2023-11-02 17:28:45 -07:00
7c87d613ed | ishaan-jaff | (testing) test_function_call_non_openai_model | 2023-11-02 17:26:30 -07:00
bb832e38b9 | ishaan-jaff | (fix) litellm utils | 2023-11-02 17:03:46 -07:00
9d090db2d4 | ishaan-jaff | (test) vertex llms | 2023-11-02 16:31:13 -07:00
81f608dd34 | ishaan-jaff | (fix) vertexai detect code_chat and code_text llms as vertex | 2023-11-02 16:31:13 -07:00
512a1637eb | Krrish Dholakia | feat(completion()): enable setting prompt templates via completion() | 2023-11-02 16:24:01 -07:00
1fc726d5dd | ishaan-jaff | (fix) cleanup | 2023-11-02 14:52:33 -07:00
5a471caf16 | ishaan-jaff | (test) add cohere embedding v3 | 2023-11-02 10:17:40 -07:00
03860984eb | ishaan-jaff | (feat) add setting input_type for cohere | 2023-11-02 10:16:35 -07:00
724e169f32 | ishaan-jaff | (fix) improve cohere error handling | 2023-11-02 10:07:11 -07:00
744e69f01f | ishaan-jaff | (feat) add embed-english-v3.0 | 2023-11-02 10:05:22 -07:00
76c007d542 | ishaan-jaff | (test) promptlayer logger | 2023-11-02 08:18:00 -07:00
2c18f7d374 | ishaan-jaff | (fix) logging callbacks - promptlayer | 2023-11-02 08:18:00 -07:00
1bef2c62a6 | Krrish Dholakia | fix(proxy_server.py): fix linting issues | 2023-11-02 08:07:20 -07:00
740460f390 | Krrish Dholakia | fix(main.py): expose custom llm provider for text completions | 2023-11-02 07:55:54 -07:00
cb20554c79 | Krrish Dholakia | fix(proxy_server.py): accept single quote json body | 2023-11-02 07:07:38 -07:00
a12d0508c4 | Krrish Dholakia | fix(proxy_server.py): fix v1/models get request | 2023-11-02 06:51:47 -07:00
0a41443c3e | ishaan-jaff | (test) hf text_completion | 2023-11-01 18:25:13 -07:00
8b5ee89d82 | ishaan-jaff | (feat) add transform_logprobs for text_completion | 2023-11-01 18:25:13 -07:00
8ca7af3a63 | ishaan-jaff | (feat) text completion set top_n_tokens for tgi | 2023-11-01 18:25:13 -07:00
19737f95c5 | ishaan-jaff | (feat) proxy add testing for openai.Completion.create | 2023-11-01 18:25:13 -07:00