litellm/cookbook
2024-04-24 17:19:02 +02:00
benchmark
codellama-server
community-resources
litellm-ollama-docker-image
litellm_router
litellm_router_load_test
logging_observability
misc
proxy-server
Benchmarking_LLMs_by_use_case.ipynb
Claude_(Anthropic)_with_Streaming_liteLLM_Examples.ipynb
Evaluating_LLMs.ipynb
liteLLM_A121_Jurrasic_example.ipynb
LiteLLM_Azure_and_OpenAI_example.ipynb
liteLLM_Baseten.ipynb
LiteLLM_batch_completion.ipynb
LiteLLM_Bedrock.ipynb
LiteLLM_Comparing_LLMs.ipynb
LiteLLM_Completion_Cost.ipynb
liteLLM_function_calling.ipynb
liteLLM_Getting_Started.ipynb
LiteLLM_HuggingFace.ipynb
liteLLM_IBM_Watsonx.ipynb
liteLLM_Langchain_Demo.ipynb
litellm_model_fallback.ipynb
liteLLM_Ollama.ipynb
LiteLLM_OpenRouter.ipynb
LiteLLM_Petals.ipynb
LiteLLM_PromptLayer.ipynb
liteLLM_Replicate_Demo.ipynb
liteLLM_Streaming_Demo.ipynb
litellm_test_multiple_llm_demo.ipynb
litellm_Test_Multiple_Providers.ipynb
LiteLLM_User_Based_Rate_Limits.ipynb
liteLLM_VertextAI_Example.ipynb
Parallel_function_calling.ipynb
Proxy_Batch_Users.ipynb
result.html
TogetherAI_liteLLM.ipynb
Using_Nemo_Guardrails_with_LiteLLM_Server.ipynb
VLLM_Model_Testing.ipynb