| Name | Last commit message | Last commit date |
| --- | --- | --- |
| huggingface_llms_metadata | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00 |
| prompt_templates | fix meta llama prompt template mapping bug | 2023-09-18 21:24:41 -07:00 |
| tokenizers | adding support for cohere, anthropic, llama2 tokenizers | 2023-09-22 14:03:52 -07:00 |
| __init__.py | add linting | 2023-08-18 11:05:05 -07:00 |
| ai21.py | map finish reason | 2023-09-13 19:22:38 -07:00 |
| aleph_alpha.py | adding finish reason mapping for aleph alpha and baseten | 2023-09-13 19:39:11 -07:00 |
| anthropic.py | add claude max_tokens_to_sample | 2023-09-22 20:57:52 -07:00 |
| base.py | all fixes to linting | 2023-08-18 11:56:44 -07:00 |
| baseten.py | adding finish reason mapping for aleph alpha and baseten | 2023-09-13 19:39:11 -07:00 |
| bedrock.py | streaming for amazon titan bedrock | 2023-09-16 09:57:16 -07:00 |
| cohere.py | move cohere to http endpoint | 2023-09-14 11:17:38 -07:00 |
| huggingface_restapi.py | auto-detect HF task | 2023-09-27 17:49:31 -07:00 |
| nlp_cloud.py | adding support for nlp cloud | 2023-09-14 09:19:34 -07:00 |
| ollama.py | push cli tool | 2023-09-26 13:30:47 -07:00 |
| oobabooga.py | add oobabooga text web api support | 2023-09-19 18:56:53 -07:00 |
| palm.py | add palm warning | 2023-09-26 10:28:13 -07:00 |
| petals.py | remove cuda from petals | 2023-09-20 09:23:39 -07:00 |
| replicate.py | remove print statement in replicate.py | 2023-09-27 10:43:06 -07:00 |
| sagemaker.py | bump version with bedrock | 2023-09-14 14:54:36 -07:00 |
| together_ai.py | remove tg ai print | 2023-09-15 09:29:39 -07:00 |
| vllm.py | raise vllm error | 2023-09-08 15:27:01 -07:00 |