diff --git a/docs/my-website/docs/completion/supported.md b/docs/my-website/docs/completion/supported.md
index ea8978066..750c98ef5 100644
--- a/docs/my-website/docs/completion/supported.md
+++ b/docs/my-website/docs/completion/supported.md
@@ -134,6 +134,21 @@ Example TogetherAI Usage - Note: liteLLM supports all models deployed on Togethe
 | togethercomputer/creative-v1 | `completion('togethercomputer/creative-v1', messages)` | `os.environ['TOGETHERAI_API_KEY']` |
 | togethercomputer/yourmodel | `completion('togethercomputer/yourmodel', messages)` | `os.environ['TOGETHERAI_API_KEY']` |
 
+### AWS Sagemaker Models
+AWS Sagemaker provides infrastructure to deploy and host custom LLMs https://aws.amazon.com/sagemaker/. Use liteLLM to easily call custom LLMs deployed on Sagemaker.
+
+### Requirements for using Sagemaker with LiteLLM
+
+* `pip install boto3`
+* Set the following AWS credentials as `.env` variables (Sagemaker auth guide: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html)
+  * `AWS_ACCESS_KEY_ID`
+  * `AWS_SECRET_ACCESS_KEY`
+  * `AWS_REGION_NAME`
+
+
+| Model Name | Function Call                                                                                    | Required OS Variables                                                                                      |
+|------------|--------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------|
+| Llama2 7B  | `completion(model='sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b', messages=messages)`  | `os.environ['AWS_ACCESS_KEY_ID']`, `os.environ['AWS_SECRET_ACCESS_KEY']`, `os.environ['AWS_REGION_NAME']`  |
 
 ### Baseten Models
 Baseten provides infrastructure to deploy and serve ML models https://www.baseten.co/. Use liteLLM to easily call models deployed on Baseten.
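
For reference, here is a minimal usage sketch of the Sagemaker call documented in the table above. It assumes the jumpstart Llama2 7B endpoint name shown in the diff and litellm's standard `completion()` interface; substitute your own Sagemaker endpoint name and real credentials.

```python
# Minimal sketch: calling a Sagemaker-hosted LLM through litellm.
# The endpoint name below is the jumpstart Llama2 7B example from the table;
# replace it with your own Sagemaker endpoint.
import os
from litellm import completion

# AWS credentials picked up by boto3 (see the Sagemaker auth guide linked above).
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_REGION_NAME"] = "us-west-2"  # region where the endpoint is deployed

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# The "sagemaker/" prefix routes the request to the Sagemaker provider;
# everything after it is the name of the Sagemaker endpoint.
response = completion(
    model="sagemaker/jumpstart-dft-meta-textgeneration-llama-2-7b",
    messages=messages,
)
print(response)
```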