| Author | Commit | Message | Date |
|--------|--------|---------|------|
| ishaan-jaff | 382e31d7b7 | (test) text_completion requests | 2023-11-03 16:36:38 -07:00 |
| ishaan-jaff | b8d0a0fbd1 | (test) cleanup remove text_completion testing from completion() | 2023-11-03 16:36:38 -07:00 |
| ishaan-jaff | 0fa7c1ec3a | (feat) text_com support batches for non openai llms | 2023-11-03 16:36:38 -07:00 |
| ishaan-jaff | b45d438e63 | (fix) proxy server remove errant print | 2023-11-03 16:36:38 -07:00 |
| Krrish Dholakia | 432520ffcf | docs(simple_proxy.md): improving tutorial | 2023-11-03 14:06:08 -07:00 |
| Krrish Dholakia | 22fd8953c1 | docs(simple_proxy.md): adding docs | 2023-11-03 14:03:48 -07:00 |
| Krrish Dholakia | 7ed8f8dac8 | fix(proxy_server.py): fix linting issues | 2023-11-03 13:44:35 -07:00 |
| ishaan-jaff | b5751bd040 | (cleanup) cookbook notebooks | 2023-11-03 13:29:28 -07:00 |
| ishaan-jaff | a8d65741f5 | (cleanup) cookbooks | 2023-11-03 13:25:10 -07:00 |
| Ishaan Jaff | fe0699cc6c | Update README.md | 2023-11-03 13:15:39 -07:00 |
| Ishaan Jaff | 9abfb45f0c | Update README.md | 2023-11-03 13:12:09 -07:00 |
| ishaan-jaff | af674701af | (test) add palm streaming | 2023-11-03 13:09:39 -07:00 |
| Krrish Dholakia | 9f180831fe | docs(simple_proxy.md): doc cleanup | 2023-11-03 13:07:07 -07:00 |
| ishaan-jaff | 6fc0c74878 | (fix) remove errant print statements | 2023-11-03 13:02:52 -07:00 |
| ishaan-jaff | 89e32db321 | (fix) remove errant tg ai print statements | 2023-11-03 12:59:23 -07:00 |
| Krrish Dholakia | f6486b9df9 | docs(simple_proxy.md): adding bedrock tutorial | 2023-11-03 12:58:48 -07:00 |
| ishaan-jaff | 43e69a7aa8 | (test) vertex ai | 2023-11-03 12:54:36 -07:00 |
| ishaan-jaff | 6c82abf5bf | (fix) vertex ai streaming | 2023-11-03 12:54:36 -07:00 |
| Krrish Dholakia | 6b3671b593 | fix(proxy_server.py): accept config.yaml | 2023-11-03 12:50:52 -07:00 |
| ishaan-jaff | e09b4cb01a | bump: version 0.12.11 → 0.12.12 | 2023-11-03 10:04:14 -07:00 |
| ishaan-jaff | 539bdae364 | (fix) remove errant print statements | 2023-11-03 08:20:14 -07:00 |
| ishaan-jaff | 8b389e9e8a | (fix) proxy correctly handle reading data using ast, fallback to json.loads if ast parse fails | 2023-11-02 21:14:08 -07:00 |
| ishaan-jaff | 4d82c81531 | (fix) proxy cli tests | 2023-11-02 21:14:08 -07:00 |
| ishaan-jaff | 104239dbe7 | (fix) proxy server - verbose = False always | 2023-11-02 21:14:08 -07:00 |
| Krrish Dholakia | e3a1c58dd9 | build(litellm_server/utils.py): add support for general settings + num retries as a module variable | 2023-11-02 20:56:41 -07:00 |
| ishaan-jaff | 3f1b4c0759 | (fix) linting fix | 2023-11-02 17:28:45 -07:00 |
| ishaan-jaff | 7c87d613ed | (testing) test_function_call_non_openai_model | 2023-11-02 17:26:30 -07:00 |
| ishaan-jaff | e040df8989 | bump: version 0.12.10 → 0.12.11 | 2023-11-02 17:15:38 -07:00 |
| ishaan-jaff | bb832e38b9 | (fix) litellm utils | 2023-11-02 17:03:46 -07:00 |
| ishaan-jaff | 9d090db2d4 | (test) vertex llms | 2023-11-02 16:31:13 -07:00 |
| ishaan-jaff | 81f608dd34 | (fix) vertexai detect code_chat and code_text llms as vertex | 2023-11-02 16:31:13 -07:00 |
| Krrish Dholakia | 512a1637eb | feat(completion()): enable setting prompt templates via completion() | 2023-11-02 16:24:01 -07:00 |
| ishaan-jaff | 1fc726d5dd | (fix) cleanup | 2023-11-02 14:52:33 -07:00 |
| ishaan-jaff | 9484d19920 | (docs) caching | 2023-11-02 14:51:07 -07:00 |
| ishaan-jaff | 764ba41224 | (docs) embedding | 2023-11-02 10:57:57 -07:00 |
| ishaan-jaff | f24e1cb133 | (docs) embedding cohere | 2023-11-02 10:56:24 -07:00 |
| ishaan-jaff | 4dfefc475d | (docs) cohere embedding | 2023-11-02 10:45:56 -07:00 |
| ishaan-jaff | fdc88cc0ee | (docs) cohere embedding models | 2023-11-02 10:35:46 -07:00 |
| ishaan-jaff | 8ea2270058 | (feat) cohere embedding | 2023-11-02 10:26:02 -07:00 |
| ishaan-jaff | 33350c7c41 | (docs) cohere | 2023-11-02 10:23:52 -07:00 |
| ishaan-jaff | 5a471caf16 | (test) add cohere embedding v3 | 2023-11-02 10:17:40 -07:00 |
| ishaan-jaff | 03860984eb | (feat) add setting input_type for cohere | 2023-11-02 10:16:35 -07:00 |
| ishaan-jaff | 724e169f32 | (fix) improve cohere error handling | 2023-11-02 10:07:11 -07:00 |
| ishaan-jaff | 744e69f01f | (feat) add embed-english-v3.0 | 2023-11-02 10:05:22 -07:00 |
| ishaan-jaff | d09486f93b | (feat) model context window.json add ollama/mistral, ollama/codellama, ollama/orca-mini, ollama/vicuna | 2023-11-02 09:18:26 -07:00 |
| ishaan-jaff | 6b7b66aa96 | bump: version 0.12.9 → 0.12.10 | 2023-11-02 08:50:43 -07:00 |
| ishaan-jaff | 76c007d542 | (test) promptlayer logger | 2023-11-02 08:18:00 -07:00 |
| ishaan-jaff | 2c18f7d374 | (fix) logging callbacks - promptlayer | 2023-11-02 08:18:00 -07:00 |
| Krrish Dholakia | 1bef2c62a6 | fix(proxy_server.py): fix linting issues | 2023-11-02 08:07:20 -07:00 |
| Krrish Dholakia | 740460f390 | fix(main.py): expose custom llm provider for text completions | 2023-11-02 07:55:54 -07:00 |