diff --git a/docs/my-website/docs/providers/bedrock.md b/docs/my-website/docs/providers/bedrock.md
new file mode 100644
index 000000000..d24edc1fa
--- /dev/null
+++ b/docs/my-website/docs/providers/bedrock.md
@@ -0,0 +1,38 @@
+# AWS Bedrock
+
+### API Keys
+```shell
+pip install boto3
+```
+
+```python
+import os
+
+os.environ["AWS_ACCESS_KEY_ID"] = ""
+os.environ["AWS_SECRET_ACCESS_KEY"] = ""
+os.environ["AWS_REGION_NAME"] = ""
+```
+
+### Usage
+```python
+import os
+from litellm import completion
+
+os.environ["AWS_ACCESS_KEY_ID"] = ""
+os.environ["AWS_SECRET_ACCESS_KEY"] = ""
+os.environ["AWS_REGION_NAME"] = ""
+
+response = completion(
+    model="bedrock/amazon.titan-tg1-large",
+    messages=[{"content": "Hello, how are you?", "role": "user"}],
+    temperature=0.2,
+    max_tokens=80,
+)
+```
+
+### Supported AWS Bedrock Models
+Here's an example of calling a Bedrock model with LiteLLM:
+
+| Model Name | Function Call | Required OS Variables |
+|------------------|--------------------------------------------|------------------------------------|
+| Titan TG1 Large | `completion(model='bedrock/amazon.titan-tg1-large', messages=messages)` | `os.environ['AWS_ACCESS_KEY_ID']`, `os.environ['AWS_SECRET_ACCESS_KEY']`, `os.environ['AWS_REGION_NAME']` |
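Since `completion` fails at request time if the AWS credentials above are unset, the doc's setup step can be sketched as a small pre-flight check. This is a minimal sketch, not part of LiteLLM; the `missing_aws_vars` helper name is hypothetical, but the three environment variable names come straight from the doc.

```python
import os

# The three variables the Bedrock provider doc says must be set.
REQUIRED_VARS = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION_NAME"]

def missing_aws_vars(env=os.environ):
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

A caller might run this before `completion(...)` and raise a clear error listing whatever `missing_aws_vars()` returns, instead of surfacing a lower-level boto3 authentication failure.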