| Name | Last commit message | Last commit date |
| --- | --- | --- |
| custom_httpx | fix(router.py): only do sync image gen fallbacks for now | 2023-12-20 19:10:59 +05:30 |
| huggingface_llms_metadata | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00 |
| prompt_templates | feat(ollama.py): add support for ollama function calling | 2023-12-20 14:59:55 +05:30 |
| tokenizers | adding support for cohere, anthropic, llama2 tokenizers | 2023-09-22 14:03:52 -07:00 |
| __init__.py | add linting | 2023-08-18 11:05:05 -07:00 |
| ai21.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| aleph_alpha.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| anthropic.py | feat(factory.py): add support for anthropic system prompts for claude 2.1 | 2023-11-21 09:57:26 -08:00 |
| azure.py | feat(main.py): add async image generation support | 2023-12-20 16:58:40 +05:30 |
| base.py | test: set request timeout at request level | 2023-11-15 17:42:31 -08:00 |
| baseten.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| bedrock.py | bump: version 1.14.4 → 1.14.5.dev1 | 2023-12-14 15:23:52 -08:00 |
| cohere.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| huggingface_restapi.py | (feat) show POST request for HF embeddings | 2023-12-16 13:09:49 +05:30 |
| maritalk.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| nlp_cloud.py | fix: fix nlp cloud streaming | 2023-11-25 13:45:23 -08:00 |
| ollama.py | feat(ollama.py): add support for ollama function calling | 2023-12-20 14:59:55 +05:30 |
| oobabooga.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| openai.py | feat(main.py): add async image generation support | 2023-12-20 16:58:40 +05:30 |
| openrouter.py | (feat) openrouter set transforms=[] default | 2023-12-18 09:16:33 +05:30 |
| palm.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| petals.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| replicate.py | Merge pull request #1080 from nbaldwin98/fixing-replicate-sys-prompt | 2023-12-11 07:11:52 -08:00 |
| sagemaker.py | fix(sagemaker.py): filter out templated prompt if in model response | 2023-12-13 07:43:33 -08:00 |
| together_ai.py | fix(together_ai.py): return empty tgai responses | 2023-12-15 10:46:35 -08:00 |
| vertex_ai.py | Add a default for safety settings in vertex AI | 2023-12-20 13:12:50 -05:00 |
| vllm.py | this commit fixes #883 | 2023-11-23 12:45:38 +01:00 |