litellm/litellm/llms (last updated 2024-05-11 21:24:42 -07:00)

| Name | Last commit message | Last commit date |
| --- | --- | --- |
| custom_httpx/ | feat(bedrock_httpx.py): working bedrock command-r sync+async streaming | 2024-05-11 19:39:51 -07:00 |
| huggingface_llms_metadata/ | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00 |
| prompt_templates/ | Merge pull request #3369 from mogith-pn/main | 2024-05-11 09:31:46 -07:00 |
| tokenizers/ | fix(openai.py): return logprobs for text completion calls | 2024-04-02 14:05:56 -07:00 |
| __init__.py | add linting | 2023-08-18 11:05:05 -07:00 |
| ai21.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| aleph_alpha.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| anthropic.py | fix(anthropic.py): fix tool calling + streaming issue | 2024-05-11 20:15:36 -07:00 |
| anthropic_text.py | fix(anthropic_text.py): fix linting error | 2024-05-11 20:01:50 -07:00 |
| azure.py | Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting | 2024-05-11 11:36:22 -07:00 |
| azure_text.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| base.py | fix(bedrock_httpx.py): working async bedrock command r calls | 2024-05-11 16:45:20 -07:00 |
| baseten.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| bedrock.py | Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting | 2024-05-11 11:36:22 -07:00 |
| bedrock_httpx.py | fix(bedrock_httpx.py): compatibility fix | 2024-05-11 19:55:38 -07:00 |
| clarifai.py | Clarifai - Added streaming and async completion support | 2024-05-03 14:03:38 +00:00 |
| cloudflare.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| cohere.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| cohere_chat.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| gemini.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00 |
| huggingface_restapi.py | docs(huggingface.md): add text-classification to huggingface docs | 2024-05-10 14:39:14 -07:00 |
| maritalk.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| nlp_cloud.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| ollama.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| ollama_chat.py | Make newline same in async function | 2024-05-05 18:51:53 -07:00 |
| oobabooga.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| openai.py | stream_options for text-completionopenai | 2024-05-09 08:37:40 -07:00 |
| openrouter.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30 |
| palm.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00 |
| petals.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| predibase.py | fix(bedrock_httpx.py): working async bedrock command r calls | 2024-05-11 16:45:20 -07:00 |
| replicate.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| sagemaker.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| together_ai.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| triton.py | feat - triton embeddings | 2024-05-10 18:57:06 -07:00 |
| vertex_ai.py | feat(router.py): support region routing for bedrock, vertex ai, watsonx | 2024-05-11 11:04:00 -07:00 |
| vertex_ai_anthropic.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| vllm.py | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00 |
| watsonx.py | Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting | 2024-05-11 11:36:22 -07:00 |