Name | Last commit message | Last commit date
custom_httpx | fix(http_handler.py): fix linting error | 2024-04-19 15:45:24 -07:00
huggingface_llms_metadata | add hf tgi and conversational models | 2023-09-27 15:56:45 -07:00
prompt_templates | fix(factory.py): support llama3 instuct chat template | 2024-04-24 20:35:10 -07:00
tokenizers | fix(openai.py): return logprobs for text completion calls | 2024-04-02 14:05:56 -07:00
__init__.py | add linting | 2023-08-18 11:05:05 -07:00
ai21.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
aleph_alpha.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
anthropic.py | fix: Stream completion responses from anthropic. (Fix 3129) | 2024-04-19 16:13:19 -05:00
anthropic_text.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
azure.py | fix(main.py): support max retries for transcription calls | 2024-04-01 18:37:53 -07:00
azure_text.py | fix(openai.py): return logprobs for text completion calls | 2024-04-02 14:05:56 -07:00
base.py | build(pyproject.toml): drop certifi dependency (unused) | 2024-01-10 08:09:03 +05:30
baseten.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
bedrock.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
cloudflare.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
cohere.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
cohere_chat.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
gemini.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
huggingface_restapi.py | fix(huggingface_restapi.py): fix hf streaming issue | 2024-03-04 21:16:41 -08:00
maritalk.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
nlp_cloud.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
ollama.py | Disable special tokens in ollama completion when counting tokens | 2024-04-19 21:38:42 +02:00
ollama_chat.py | FIX: use value not param name when mapping frequency_penalty | 2024-04-20 09:25:35 +01:00
oobabooga.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
openai.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
openrouter.py | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
palm.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
petals.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
replicate.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
sagemaker.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
together_ai.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
vertex_ai.py | Merge pull request #3267 from BerriAI/litellm_openai_streaming_fix | 2024-04-24 21:08:33 -07:00
vertex_ai_anthropic.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00
vllm.py | fix(utils.py): fix streaming to not return usage dict | 2024-04-24 08:06:07 -07:00