Pre-Submission checklist
commit ce35240273 (parent 03a91fe066)
3 changed files with 7 additions and 7 deletions
.github/pull_request_template.md (4 changes)
@@ -10,9 +10,9 @@
 
 **Please complete all items before asking a LiteLLM maintainer to review your PR**
 
-- [ ] I have Added testing in the `tests/litellm/` directory, **Adding at least 1 test is a hard requirement** - [see details](https://docs.litellm.ai/docs/contributing#2-adding-testing-to-your-pr)
+- [ ] I have Added testing in the `tests/litellm/` directory, **Adding at least 1 test is a hard requirement** - [see details](https://docs.litellm.ai/docs/extras/contributing_code)
 - [ ] I have added a screenshot of my new test passing locally
-- [ ] My PR passes all unit tests on `make unit-test` [https://docs.litellm.ai/docs/contributing]
+- [ ] My PR passes all unit tests on (`make unit-test`)[https://docs.litellm.ai/docs/extras/contributing_code]
 - [ ] My PR's scope is as isolated as possible, it only solves 1 specific problem
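The first checklist item above requires at least one test under `tests/litellm/`. As a rough illustration only (not part of this commit), a minimal such test might look like the sketch below; it assumes pytest and litellm's documented `mock_response` parameter so no real provider call is made, and the file and test names are hypothetical.

```python
# tests/litellm/test_example_contribution.py  (hypothetical file name)
# Minimal sketch of a checklist-satisfying test; uses litellm's mock_response
# so the assertion runs without network access or API keys.
import litellm


def test_completion_returns_mocked_content():
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "ping"}],
        mock_response="pong",  # returns a canned ModelResponse instead of calling a provider
    )
    # The mocked content should round-trip into the response object.
    assert response.choices[0].message.content == "pong"
```

A test like this would then be picked up by `make unit-test`, per the second checklist item.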
@@ -340,7 +340,7 @@ curl 'http://0.0.0.0:4000/key/generate' \
 
 ## Contributing
 
-Interested in contributing? Contributions to LiteLLM Python SDK, Proxy Server, and contributing LLM integrations are both accepted and highly encouraged! [See our Contribution Guide for more details](https://docs.litellm.ai/docs/contributing)
+Interested in contributing? Contributions to LiteLLM Python SDK, Proxy Server, and contributing LLM integrations are both accepted and highly encouraged! [See our Contribution Guide for more details](https://docs.litellm.ai/docs/extras/contributing_code)
 
 # Enterprise
 For companies that need better security, user management and professional support
@@ -1994,8 +1994,8 @@
         "max_tokens": 8191,
         "max_input_tokens": 32000,
         "max_output_tokens": 8191,
-        "input_cost_per_token": 0.000001,
-        "output_cost_per_token": 0.000003,
+        "input_cost_per_token": 0.0000001,
+        "output_cost_per_token": 0.0000003,
         "litellm_provider": "mistral",
         "supports_function_calling": true,
         "mode": "chat",
@ -2006,8 +2006,8 @@
|
|||
"max_tokens": 8191,
|
||||
"max_input_tokens": 32000,
|
||||
"max_output_tokens": 8191,
|
||||
"input_cost_per_token": 0.000001,
|
||||
"output_cost_per_token": 0.000003,
|
||||
"input_cost_per_token": 0.0000001,
|
||||
"output_cost_per_token": 0.0000003,
|
||||
"litellm_provider": "mistral",
|
||||
"supports_function_calling": true,
|
||||
"mode": "chat",
|
||||
|
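The two hunks above drop the per-token prices on these mistral entries by a factor of ten (from 1e-6/3e-6 to 1e-7/3e-7 USD). As a quick sanity check on what the new numbers mean per request, here is a small sketch of the cost arithmetic; the token counts are made up for illustration.

```python
# Illustrative cost arithmetic using the updated per-token prices.
input_cost_per_token = 0.0000001   # USD per prompt token (new value)
output_cost_per_token = 0.0000003  # USD per completion token (new value)

prompt_tokens = 1_000       # hypothetical request size
completion_tokens = 500     # hypothetical response size

cost = (prompt_tokens * input_cost_per_token
        + completion_tokens * output_cost_per_token)
print(f"estimated request cost: ${cost:.6f}")  # -> $0.000250
```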