litellm-mirror/tests/llm_translation/test_bedrock_llama.py
Ishaan Jaff 125f6fff67
(Feat) - Add /bedrock/meta.llama3-3-70b-instruct-v1:0 tool calling support + cost tracking + base llm unit test for tool calling (#8545)
* Add support for bedrock meta.llama3-3-70b-instruct-v1:0 tool calling (#8512)

* fix(converse_transformation.py): fixing bedrock meta.llama3-3-70b tool calling

* test(test_bedrock_completion.py): adding llama3.3 tool compatibility check

* add TestBedrockTestSuite

* add bedrock llama 3.3 to base llm class

* us.meta.llama3-3-70b-instruct-v1:0

* test_basic_tool_calling

* TestAzureOpenAIO1

* test_basic_tool_calling

* test_basic_tool_calling

---------

Co-authored-by: miraclebakelaser <65143272+miraclebakelaser@users.noreply.github.com>
2025-02-14 14:15:25 -08:00

from base_llm_unit_tests import BaseLLMChatTest
import pytest
import sys
import os

sys.path.insert(
    0, os.path.abspath("../..")
)  # Adds the parent directory to the system path
import litellm


class TestBedrockTestSuite(BaseLLMChatTest):
    def test_tool_call_no_arguments(self, tool_call_no_arguments):
        # Overridden as a no-op so the base class's no-argument tool-call
        # case is skipped for this model.
        pass

    def get_base_completion_call_args(self) -> dict:
        # Model exercised by every test inherited from BaseLLMChatTest.
        litellm._turn_on_debug()
        return {
            "model": "bedrock/converse/us.meta.llama3-3-70b-instruct-v1:0",
        }
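
For context, the inherited tool-calling tests boil down to a request of roughly the following shape against this model. This is a minimal sketch using litellm's OpenAI-style tools parameter; the "get_current_weather" tool schema is a hypothetical example for illustration, not taken from the base test suite.

import litellm

# Hypothetical tool definition in the OpenAI function-calling schema,
# which litellm translates to the Bedrock Converse API for this model.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"}
                },
                "required": ["location"],
            },
        },
    }
]

response = litellm.completion(
    model="bedrock/converse/us.meta.llama3-3-70b-instruct-v1:0",
    messages=[{"role": "user", "content": "What's the weather in Boston today?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)

The suite itself is run with pytest (e.g. pytest tests/llm_translation/test_bedrock_llama.py), picking up all tool-calling cases defined on BaseLLMChatTest.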