| Name | Last commit message | Last commit date |
| --- | --- | --- |
| huggingface_llms_metadata | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00 |
| prompt_templates | fix(anthropic.py-+-bedrock.py): anthropic prompt format | 2023-10-20 10:56:15 -07:00 |
| tokenizers | adding support for cohere, anthropic, llama2 tokenizers | 2023-09-22 14:03:52 -07:00 |
| __init__.py | add linting | 2023-08-18 11:05:05 -07:00 |
| ai21.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| aleph_alpha.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| anthropic.py | fix(anthropic.py-+-bedrock.py): anthropic prompt format | 2023-10-20 10:56:15 -07:00 |
| base.py | fix(init.py): expose complete client session | 2023-10-10 15:16:10 -07:00 |
| baseten.py | docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial | 2023-10-17 09:22:34 -07:00 |
| bedrock.py | fix(anthropic.py-+-bedrock.py): anthropic prompt format | 2023-10-20 10:56:15 -07:00 |
| cohere.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| huggingface_restapi.py | docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial | 2023-10-17 09:22:34 -07:00 |
| nlp_cloud.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| ollama.py | (feat) ollama raise Exceptions + use LiteLLM stream wrapper | 2023-10-11 17:00:39 -07:00 |
| oobabooga.py | add oobabooga text web api support | 2023-09-19 18:56:53 -07:00 |
| openai.py | fix(openai.py): fix linting errors | 2023-10-13 22:24:58 -07:00 |
| palm.py | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00 |
| petals.py | style: fix linting errors | 2023-10-16 17:35:08 -07:00 |
| replicate.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| sagemaker.py | fix: fix value error if model returns empty completion | 2023-10-10 10:11:40 -07:00 |
| together_ai.py | fix: allow api base to be set for all providers | 2023-10-19 19:07:42 -07:00 |
| vertex_ai.py | fix(vertex_ai.py): fix output parsing | 2023-10-24 12:08:22 -07:00 |
| vllm.py | style: fix linting errors | 2023-10-16 17:35:08 -07:00 |
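
Several of the provider modules above share the commit "fix: allow api base to be set for all providers". The sketch below is a minimal, hedged illustration of what that enables from the caller's side, assuming LiteLLM's top-level `completion()` entry point; the model name and endpoint URL are placeholders, not values taken from this listing.

```python
import litellm

# Minimal sketch (assumption): send a chat completion through one of the
# provider modules listed above (huggingface_restapi.py handles the
# "huggingface/" prefix) while overriding the API base URL, as the
# "allow api base to be set for all providers" commits suggest.
response = litellm.completion(
    model="huggingface/meta-llama/Llama-2-7b-chat-hf",   # placeholder model
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="https://my-tgi-endpoint.example.com",      # hypothetical custom endpoint
)

# The response follows the OpenAI-style shape LiteLLM normalizes to.
print(response.choices[0].message.content)
```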