Website

This website is built using Docusaurus 2, a modern static website generator.

Installation

$ yarn
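
If yarn is not available, npm can be used instead (assuming a standard Node.js setup; the commands below use yarn, but the npm equivalents generally behave the same way):

$ npm install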

Local Development

$ yarn start

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
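
If the default port is already taken, the dev server can be started on a different port and host. A minimal sketch, assuming the start script passes its arguments through to the Docusaurus 2 CLI:

$ yarn start --port 3001 --host 0.0.0.0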

Build

$ yarn build

This command generates static content into the build directory, which can be served by any static content hosting service.
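
To preview the production build locally before deploying, Docusaurus provides a serve command; a sketch, assuming the default serve script from the Docusaurus template is present in package.json:

$ yarn serve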

Deployment

Using SSH:

$ USE_SSH=true yarn deploy

Not using SSH:

$ GIT_USER=<Your GitHub username> yarn deploy

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push it to the gh-pages branch.
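
For a non-interactive setup such as CI, the install, build, and deploy steps can be chained; a minimal sketch, assuming GIT_USER is set in the environment and yarn classic is in use:

$ yarn install --frozen-lockfile
$ yarn build
$ GIT_USER=<Your GitHub username> yarn deploy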