forked from phoenix/litellm-mirror
(docs) bump langfuse to >= 2.0.0
This commit is contained in:
parent 2d5801b69e
commit 05b9b3aacd
3 changed files with 3 additions and 3 deletions
@@ -15,7 +15,7 @@ join our [discord](https://discord.gg/wuPM9dRgDw)
## Pre-Requisites
Ensure you have run `pip install langfuse` for this integration
```shell
-pip install langfuse==1.14.0 litellm
+pip install "langfuse>=2.0.0" litellm
```

## Quick Start
@@ -461,7 +461,7 @@ We will use the `--config` to set `litellm.success_callback = ["langfuse"]` this
**Step 1** Install langfuse

```shell
-pip install langfuse==1.14.0
+pip install "langfuse>=2.0.0"
```

**Step 2**: Create a `config.yaml` file and set `litellm_settings`: `success_callback`
@@ -13,7 +13,7 @@ prisma==0.11.0 # for db
mangum==0.17.0 # for aws lambda functions
google-generativeai==0.1.0 # for vertex ai calls
traceloop-sdk==0.5.3 # for open telemetry logging
-langfuse==1.14.0 # for langfuse self-hosted logging
+langfuse>=2.0.0 # for langfuse self-hosted logging
### LITELLM PACKAGE DEPENDENCIES
python-dotenv>=0.2.0 # for env
tiktoken>=0.4.0 # for calculating usage