From f9ff8cbfb1c7f068fa7b0d62d6bb5d6acb35e025 Mon Sep 17 00:00:00 2001
From: ishaan-jaff
Date: Tue, 24 Oct 2023 12:53:21 -0700
Subject: [PATCH] (docs) update cloud run deploy link

---
 README.md                            | 2 +-
 docs/my-website/docs/simple_proxy.md | 4 ++--
 openai-proxy/README.md               | 2 +-
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 760efae17..c2a560dc9 100644
--- a/README.md
+++ b/README.md
@@ -13,7 +13,7 @@
-<a href="https://deploy.cloud.run?git_repo=https://github.com/BerriAI/litellm">
+<a href="https://l.linklyhq.com/l/1uHtX">
diff --git a/docs/my-website/docs/simple_proxy.md b/docs/my-website/docs/simple_proxy.md
index 9659284c7..5a823300d 100644
--- a/docs/my-website/docs/simple_proxy.md
+++ b/docs/my-website/docs/simple_proxy.md
@@ -10,7 +10,7 @@ A simple, fast, and lightweight **OpenAI-compatible server** to call 100+ LLM AP
 - `/chat/completions` - chat completions endpoint to call 100+ LLMs
 - `/models` - available models on server
 
-[![Deploy](https://deploy.cloud.run/button.svg)](https://deploy.cloud.run?git_repo=https://github.com/BerriAI/litellm)
+[![Deploy](https://deploy.cloud.run/button.svg)](https://l.linklyhq.com/l/1uHtX)
 
 :::info
 We want to learn how we can make the proxy better! Meet the [founders](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version) or
@@ -71,7 +71,7 @@ Looking for the CLI tool/local proxy? It's [here](./proxy_server.md)
 ## Deploy on Google Cloud Run
 **Click the button** to deploy to Google Cloud Run
 
-[![Deploy](https://deploy.cloud.run/button.svg)](https://deploy.cloud.run?git_repo=https://github.com/BerriAI/litellm)
+[![Deploy](https://deploy.cloud.run/button.svg)](https://l.linklyhq.com/l/1uHtX)
 
 On a successfull deploy your Cloud Run Shell will have this output
 
diff --git a/openai-proxy/README.md b/openai-proxy/README.md
index cae8962d2..ab13fe5ca 100644
--- a/openai-proxy/README.md
+++ b/openai-proxy/README.md
@@ -6,7 +6,7 @@ A simple, fast, and lightweight **OpenAI-compatible server** to call 100+ LLM AP
-<a href="https://deploy.cloud.run?git_repo=https://github.com/BerriAI/litellm">
+<a href="https://l.linklyhq.com/l/1uHtX">