litellm-mirror/litellm/llms

Latest commit 5ee3b0f30f by Krish Dholakia, 2024-06-03 21:05:03 -07:00:
Merge pull request #3996 from BerriAI/litellm_azure_assistants_api_support
feat(assistants/main.py): Azure Assistants API support
custom_httpx fix(http_handler.py): allow setting ca bundle path 2024-06-01 14:48:53 -07:00
huggingface_llms_metadata add hf tgi and conversational models 2023-09-27 15:56:45 -07:00
prompt_templates fix(factory.py): fix linting error 2024-05-24 19:12:09 -07:00
tokenizers fix(openai.py): return logprobs for text completion calls 2024-04-02 14:05:56 -07:00
__init__.py add linting 2023-08-18 11:05:05 -07:00
ai21.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
aleph_alpha.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
anthropic.py fix(anthropic.py): fix anthropic async streaming 2024-06-02 16:01:44 -07:00
anthropic_text.py fix(anthropic_text.py): fix linting error 2024-05-11 20:01:50 -07:00
azure.py Merge pull request #3996 from BerriAI/litellm_azure_assistants_api_support 2024-06-03 21:05:03 -07:00
azure_text.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
base.py fix(bedrock_httpx.py): move anthropic bedrock calls to httpx 2024-05-16 21:51:55 -07:00
baseten.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
bedrock.py Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting 2024-05-11 11:36:22 -07:00
bedrock_httpx.py Merge pull request #3944 from BerriAI/litellm_fix_parallel_streaming 2024-05-31 21:42:37 -07:00
clarifai.py docs(input.md): add clarifai supported input params to docs 2024-05-24 08:57:50 -07:00
cloudflare.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
cohere.py Add request source 2024-05-21 10:12:57 +01:00
cohere_chat.py Add request source 2024-05-21 10:12:57 +01:00
databricks.py fix: fix streaming with httpx client 2024-05-31 10:55:18 -07:00
gemini.py fix - choices index for gemini/ provider 2024-05-16 13:52:46 -07:00
huggingface_restapi.py fix(huggingface_restapi.py): fix task extraction from model name 2024-05-15 07:28:19 -07:00
maritalk.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
nlp_cloud.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
ollama.py fix: add missing seed parameter to ollama input 2024-05-31 01:47:56 +08:00
ollama_chat.py fix: add missing seed parameter to ollama input 2024-05-31 01:47:56 +08:00
oobabooga.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
openai.py feat(assistants/main.py): Closes https://github.com/BerriAI/litellm/issues/3993 2024-06-03 18:47:05 -07:00
openrouter.py refactor: add black formatting 2023-12-25 14:11:20 +05:30
palm.py fix(utils.py): fix streaming to not return usage dict 2024-04-24 08:06:07 -07:00
petals.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
predibase.py fix: fix streaming with httpx client 2024-05-31 10:55:18 -07:00
replicate.py fix: fix streaming with httpx client 2024-05-31 10:55:18 -07:00
sagemaker.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
together_ai.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
triton.py Revert "Added support for Triton chat completion using trtlllm generate endpo…" 2024-05-29 13:42:49 -07:00
vertex_ai.py fix - vertex ai cache clients 2024-05-30 21:22:32 -07:00
vertex_ai_anthropic.py (ci/cd) run again 2024-05-24 21:27:29 -07:00
vertex_httpx.py fix vertex httpx client 2024-05-20 13:43:54 -07:00
vllm.py feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
watsonx.py Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting 2024-05-11 11:36:22 -07:00