litellm-mirror/litellm/llms (latest commit: 2023-09-23 15:04:59 -07:00)
Name                    Last commit message                                       Last commit date
prompt_templates        fix meta llama prompt template mapping bug                2023-09-18 21:24:41 -07:00
tokenizers              adding support for cohere, anthropic, llama2 tokenizers   2023-09-22 14:03:52 -07:00
__init__.py             add linting                                               2023-08-18 11:05:05 -07:00
ai21.py                 map finish reason                                         2023-09-13 19:22:38 -07:00
aleph_alpha.py          adding finish reason mapping for aleph alpha and baseten  2023-09-13 19:39:11 -07:00
anthropic.py            add claude max_tokens_to_sample                           2023-09-22 20:57:52 -07:00
base.py                 all fixes to linting                                      2023-08-18 11:56:44 -07:00
baseten.py              adding finish reason mapping for aleph alpha and baseten  2023-09-13 19:39:11 -07:00
bedrock.py              streaming for amazon titan bedrock                        2023-09-16 09:57:16 -07:00
cohere.py               move cohere to http endpoint                              2023-09-14 11:17:38 -07:00
huggingface_restapi.py  expose vertex ai and hf api base as env var               2023-09-22 15:14:33 -07:00
nlp_cloud.py            adding support for nlp cloud                              2023-09-14 09:19:34 -07:00
ollama.py               fix async import error                                    2023-09-21 11:16:50 -07:00
oobabooga.py            add oobabooga text web api support                        2023-09-19 18:56:53 -07:00
petals.py               remove cuda from petals                                   2023-09-20 09:23:39 -07:00
replicate.py            fix exception mapping for streaming                       2023-09-23 15:04:59 -07:00
sagemaker.py            bump version with bedrock                                 2023-09-14 14:54:36 -07:00
together_ai.py          remove tg ai print                                        2023-09-15 09:29:39 -07:00
vllm.py                 raise vllm error                                          2023-09-08 15:27:01 -07:00