| Name | Last commit message | Last commit date |
|---|---|---|
| huggingface_llms_metadata | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00 |
| prompt_templates | fix(factory.py): fix tgai rendering template | 2023-12-13 12:27:31 -08:00 |
| tokenizers | adding support for cohere, anthropic, llama2 tokenizers | 2023-09-22 14:03:52 -07:00 |
| __init__.py | add linting | 2023-08-18 11:05:05 -07:00 |
| ai21.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| aleph_alpha.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| anthropic.py | feat(factory.py): add support for anthropic system prompts for claude 2.1 | 2023-11-21 09:57:26 -08:00 |
| azure.py | test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock | 2023-12-11 15:32:46 -08:00 |
| base.py | test: set request timeout at request level | 2023-11-15 17:42:31 -08:00 |
| baseten.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| bedrock.py | bump: version 1.14.4 → 1.14.5.dev1 | 2023-12-14 15:23:52 -08:00 |
| cohere.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| huggingface_restapi.py | fix(huggingface_restapi.py): raise better exceptions for unprocessable hf responses | 2023-12-05 07:28:21 -08:00 |
| maritalk.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| nlp_cloud.py | fix: fix nlp cloud streaming | 2023-11-25 13:45:23 -08:00 |
| ollama.py | fix(ollama.py): fix ollama async streaming for /completions calls | 2023-12-15 09:28:32 -08:00 |
| oobabooga.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| openai.py | (feat) - acompletion, correct exception mapping | 2023-12-15 08:28:12 +05:30 |
| palm.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| petals.py | (fix) pydantic errors with response.time | 2023-11-20 18:28:19 -08:00 |
| replicate.py | Merge pull request #1080 from nbaldwin98/fixing-replicate-sys-prompt | 2023-12-11 07:11:52 -08:00 |
| sagemaker.py | fix(sagemaker.py): filter out templated prompt if in model response | 2023-12-13 07:43:33 -08:00 |
| together_ai.py | fix(together_ai.py): additional logging for together ai encoding prompt | 2023-12-15 10:39:23 -08:00 |
| vertex_ai.py | fix(vertex_ai.py): add exception mapping for acompletion calls | 2023-12-13 16:35:50 -08:00 |
| vllm.py | this commit fixes #883 | 2023-11-23 12:45:38 +01:00 |