Author | Commit | Message | Date
Krrish Dholakia | 4e51f712f3 | fix(main.py): fix calling openai gpt-3.5-turbo-instruct via /completions (Fixes https://github.com/BerriAI/litellm/issues/749) | 2024-07-25 09:57:19 -07:00
Krrish Dholakia | fdb7101aaf | fix(utils.py): add extra body params for text completion calls | 2024-06-21 08:28:38 -07:00
Krrish Dholakia | 9cc104eb03 | fix(main.py): route openai calls to /completion when text_completion is True | 2024-06-19 12:37:05 -07:00
Krish Dholakia | 3a3b3667ee | Merge branch 'main' into litellm_aws_kms_fixes | 2024-06-19 09:30:54 -07:00
Ishaan Jaff | 5a28875e77 | fix text completion response from codestral | 2024-06-17 15:01:26 -07:00
Ishaan Jaff | 2ce032405d | test - codestral streaming | 2024-06-17 14:17:25 -07:00
Ishaan Jaff | 364492297d | feat - add fim codestral api | 2024-06-17 13:46:03 -07:00
Ishaan Jaff | ad47fee181 | feat add text completion config for mistral text | 2024-06-17 12:48:46 -07:00
Ishaan Jaff | 5f76f96e4d | working chat, text for codestral | 2024-06-17 11:30:22 -07:00
Krrish Dholakia | e6c96aa950 | fix(utils.py): fix tgai timeout exception mapping + skip flaky test | 2024-06-10 19:50:16 -07:00
Krrish Dholakia | 82149b8cf4 | test(test_text_completion.py): skip unstable test | 2024-06-10 19:45:24 -07:00
Krrish Dholakia | a854824c02 | fix(main.py): fix together ai text completion call | 2024-05-08 09:10:45 -07:00
Krrish Dholakia | 475144e5b7 | fix(openai.py): support passing prompt as list instead of concat string | 2024-04-03 15:23:20 -07:00
Krrish Dholakia | 15e0099948 | fix(proxy_server.py): return original model response via response headers - /v1/completions, to help devs with debugging | 2024-04-03 13:05:43 -07:00
Krrish Dholakia | f17dd68df3 | test(test_text_completion.py): unit testing for text completion pydantic object | 2024-04-03 12:26:51 -07:00
Krrish Dholakia | b07788d2a5 | fix(openai.py): return logprobs for text completion calls | 2024-04-02 14:05:56 -07:00
Krrish Dholakia | dc2c4af631 | fix(utils.py): fix text completion streaming | 2024-03-25 16:47:17 -07:00
ishaan-jaff | 1bd92b20dd | (test-fix) hf is unstable | 2024-03-04 08:44:39 -08:00
ishaan-jaff | 4e6a238820 | (test) hf currently loading error | 2024-01-15 17:07:49 -08:00
ishaan-jaff | b5f9f05491 | (test) fix - skip HF is currently loading exception | 2024-01-09 15:53:19 +05:30
Krrish Dholakia | e97eff4243 | test(test_router.py): fix router test | 2024-01-09 11:08:35 +05:30
ishaan-jaff | f46fa2b8a8 | (fix) test - deprecated textdavinci003 | 2024-01-09 10:55:35 +05:30
Krrish Dholakia | 88d498a54a | fix(ollama.py): use tiktoken as backup for prompt token counting | 2024-01-09 09:47:18 +05:30
Krrish Dholakia | 6333fbfe56 | fix(main.py): support cost calculation for text completion streaming object | 2024-01-08 12:41:43 +05:30
Krrish Dholakia | e06840b571 | refactor: move async text completion testing to test_text_completion.py | 2023-12-29 11:46:40 +05:30
ishaan-jaff | 2b8e2bd937 | (ci/cd) set num retries for HF test | 2023-12-29 10:52:45 +05:30
ishaan-jaff | 8528d9f809 | (test) gpt-3.5-turbo-instruct finish reason | 2023-12-27 15:45:40 +05:30
Krrish Dholakia | 4905929de3 | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
Krrish Dholakia | 34509d8dda | fix(main.py): return async completion calls | 2023-12-18 17:41:54 -08:00
Krrish Dholakia | 1608dd7e0b | fix(main.py): support async streaming for text completions endpoint | 2023-12-14 13:56:32 -08:00
ishaan-jaff | 426c741b40 | (test) hf text completion | 2023-12-11 10:27:43 -08:00
Krrish Dholakia | 3ea776bdc0 | fix(text_completion): allow either model or engine to be set | 2023-11-17 18:25:21 -08:00
ishaan-jaff | 2dc411fdb3 | (test) hf streaming | 2023-11-16 12:24:31 -08:00
ishaan-jaff | 3285113d2d | (test) regular hf tests | 2023-11-16 12:00:49 -08:00
ishaan-jaff | baf4e83738 | (test) text_completion | 2023-11-16 11:37:46 -08:00
Krrish Dholakia | 1665b872c3 | fix(caching.py): dump model response object as json | 2023-11-13 10:41:04 -08:00
ishaan-jaff | 4dd7e2519f | (test) add text completion streaming test | 2023-11-08 11:59:31 -08:00
ishaan-jaff | 55eb274f36 | (test) remove dup hf test | 2023-11-08 09:54:40 -08:00
ishaan-jaff | 39ca8b5043 | bump: version 0.13.7 → 0.13.8 | 2023-11-08 09:46:57 -08:00
ishaan-jaff | 7219fcb968 | (test) hf inference api - text_completion | 2023-11-06 17:56:41 -08:00
ishaan-jaff | 592fc12710 | (fix) linting errors | 2023-11-06 14:42:12 -08:00
ishaan-jaff | 1df6dd986d | (test) hf max tokens, temp | 2023-11-06 14:38:21 -08:00
ishaan-jaff | 07761ac93f | (test) text_completion | 2023-11-06 13:17:49 -08:00
ishaan-jaff | f591d79376 | (fix) linting fixes | 2023-11-06 13:02:11 -08:00
ishaan-jaff | 4344ae66b8 | (test) text completion | 2023-11-06 09:10:48 -08:00
ishaan-jaff | 1ad450218d | (test) text completion | 2023-11-06 08:37:01 -08:00
ishaan-jaff | e5e9c0f9df | (test) text_completion more testing | 2023-11-06 08:36:50 -08:00
Krrish Dholakia | 763ecf681a | test(test_text_completion.py): fixing print verbose | 2023-11-04 14:03:09 -07:00
ishaan-jaff | 3e508ea257 | (test) text_completion responses | 2023-11-03 22:14:36 -07:00
ishaan-jaff | 34751d8562 | (test) add text_completion with prompt list | 2023-11-03 18:03:19 -07:00
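Most of the history above exercises litellm's text completion path: routing instruct models to /completions, accepting a prompt list, returning logprobs, and streaming. As a rough illustration of the kind of call these commits cover, here is a minimal sketch using litellm's public `text_completion` helper; it assumes `OPENAI_API_KEY` is set in the environment and uses illustrative prompt strings, not anything taken from the repo's tests.

```python
from litellm import text_completion

# Minimal sketch, not the repo's test code.
# gpt-3.5-turbo-instruct is served via OpenAI's /completions endpoint,
# and the prompt may be a single string or a list of strings
# (per the "support passing prompt as list" fix above).
response = text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt=["Say this is a test.", "Count to three."],
    max_tokens=16,
)

# The response mirrors the OpenAI text completion shape: one choice per prompt.
for choice in response.choices:
    print(choice.text)
```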