code cleanup

Krrish Dholakia 2023-08-07 10:44:02 -07:00
parent 10bb5f80df
commit 5e564ef5de
5 changed files with 16 additions and 3 deletions


@@ -7,7 +7,7 @@ a light 100 line package to simplify calling OpenAI, Azure, Cohere, Anthropic AP
* Consistent Exceptions for all LLM APIs, we map RateLimit, Context Window, and Authentication Error exceptions across all providers to their OpenAI equivalents. [see Code](https://github.com/BerriAI/litellm/blob/ba1079ff6698ef238c5c7f771dd2b698ec76f8d9/litellm/utils.py#L250)
###### observability:
- * Logging - see exactly what the raw model request/response is `completion(.., logger_fn=your_logging_fn)`
+ * Logging - see exactly what the raw model request/response is by plugging in your own function `completion(.., logger_fn=your_logging_fn)` and/or print statements from the package `litellm.set_verbose=True`
* Callbacks - automatically send your data to Helicone, Sentry, Posthog, Slack - `litellm.success_callbacks`, `litellm.failure_callbacks` [see Callbacks](https://litellm.readthedocs.io/en/latest/advanced/)
## Quick Start
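The `logger_fn` hook added in the README line above takes any user-supplied callable. A minimal sketch of such a hook (the callback body and the commented-out `completion(...)` call below are illustrative, not part of this commit):

```python
# Hypothetical logger_fn hook: litellm invokes it with a dict describing
# the raw model request/response, so any callable accepting one argument works.
def my_logging_fn(model_call_dict):
    # Record whatever the library passes in; here we just print it.
    print(f"model call details: {model_call_dict}")

# Usage sketch (assumes litellm is installed and API keys are configured):
# from litellm import completion
# response = completion(
#     model="claude-instant-1",
#     messages=[{"role": "user", "content": "Hi, how are you?"}],
#     logger_fn=my_logging_fn,
# )
```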

docs/output.md (new file)

@@ -0,0 +1,12 @@
# Completion Function - completion()
Here's the exact JSON output you can expect from a `litellm` completion call:
```python
{'choices': [{'finish_reason': 'stop',
'index': 0,
'message': {'role': 'assistant',
'content': " I'm doing well, thank you for asking. I am Claude, an AI assistant created by Anthropic."}}],
'created': 1691429984.3852863,
'model': 'claude-instant-1',
'usage': {'prompt_tokens': 18, 'completion_tokens': 23, 'total_tokens': 41}}
```
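Since the payload above is a plain dict, its fields can be read with standard indexing. A quick sketch using the documented example (variable names here are illustrative):

```python
# Example payload copied from the docs above.
response = {'choices': [{'finish_reason': 'stop',
            'index': 0,
            'message': {'role': 'assistant',
            'content': " I'm doing well, thank you for asking. I am Claude, an AI assistant created by Anthropic."}}],
 'created': 1691429984.3852863,
 'model': 'claude-instant-1',
 'usage': {'prompt_tokens': 18, 'completion_tokens': 23, 'total_tokens': 41}}

# Pull out the generated text and the token accounting.
content = response['choices'][0]['message']['content']
total = response['usage']['total_tokens']
```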


@@ -229,7 +229,7 @@ def completion(
prompt_tokens = anthropic.count_tokens(prompt)
completion_tokens = anthropic.count_tokens(completion_response)
## RESPONSE OBJECT
- print(f"model_response: {model_response}")
+ print_verbose(f"raw model_response: {model_response}")
model_response["choices"][0]["message"]["content"] = completion_response
model_response["created"] = time.time()
model_response["model"] = model
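The hunk above fills in the response object field by field. A self-contained sketch of that assembly pattern, with the same key names as the diff (the helper function and placeholder token counts are assumptions for illustration):

```python
import time

def build_model_response(completion_response, model,
                         prompt_tokens, completion_tokens):
    # Start from an OpenAI-shaped skeleton, as in the diff above.
    model_response = {"choices": [{"message": {"role": "assistant",
                                               "content": ""}}]}
    model_response["choices"][0]["message"]["content"] = completion_response
    model_response["created"] = time.time()
    model_response["model"] = model
    model_response["usage"] = {
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "total_tokens": prompt_tokens + completion_tokens,
    }
    return model_response
```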


@@ -4,6 +4,7 @@ nav:
- Installation & Quick Start: index.md
- completion():
- Input - Request Body: input.md
+ - Output - Response Object: output.md
- Streaming & Async Calls: stream.md
- 🤖 Supported LLM APIs:
- Supported Completion & Chat APIs: supported.md


@@ -1,6 +1,6 @@
[tool.poetry]
name = "litellm"
- version = "0.1.351"
+ version = "0.1.352"
description = "Library to easily interface with LLM API providers"
authors = ["BerriAI"]
license = "MIT License"