add everything for docs
parent de45a738ee
commit 0fe8799f94
1015 changed files with 185353 additions and 0 deletions
docs/snippets/modules/chains/how_to/debugging.mdx (new file, 30 lines)

@@ -0,0 +1,30 @@
Setting `verbose` to `True` will print out some internal states of the `Chain` object while it is being run.

```python
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# `chat` is assumed to be a chat model instance defined earlier in the page
conversation = ConversationChain(
    llm=chat,
    memory=ConversationBufferMemory(),
    verbose=True
)
conversation.run("What is ChatGPT?")
```

<CodeOutputBlock lang="python">

```
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: What is ChatGPT?
AI:

> Finished chain.

'ChatGPT is an AI language model developed by OpenAI. It is based on the GPT-3 architecture and is capable of generating human-like responses to text prompts. ChatGPT has been trained on a massive amount of text data and can understand and respond to a wide range of topics. It is often used for chatbots, virtual assistants, and other conversational AI applications.'
```

</CodeOutputBlock>

docs/snippets/modules/chains/how_to/memory.mdx (new file, 25 lines)

@@ -0,0 +1,25 @@
```python
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

conversation = ConversationChain(
    llm=chat,
    memory=ConversationBufferMemory()
)

conversation.run("Answer briefly. What are the first 3 colors of a rainbow?")
# -> The first three colors of a rainbow are red, orange, and yellow.
conversation.run("And the next 4?")
# -> The next four colors of a rainbow are green, blue, indigo, and violet.
```

<CodeOutputBlock lang="python">

```
'The next four colors of a rainbow are green, blue, indigo, and violet.'
```

</CodeOutputBlock>

Essentially, `BaseMemory` defines an interface for how `langchain` stores memory. It allows reading stored data through the `load_memory_variables` method and storing new data through the `save_context` method. You can learn more about it in the [Memory](/docs/modules/memory/) section.
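
As a minimal illustration of that interface, here is a sketch using `ConversationBufferMemory` directly (the printed output is an approximation of the default string format):

```python
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory()

# save_context stores one input/output exchange in memory
memory.save_context({"input": "Hi there"}, {"output": "Hello! How can I help you?"})

# load_memory_variables reads the stored history back as memory variables
print(memory.load_memory_variables({}))
# -> {'history': 'Human: Hi there\nAI: Hello! How can I help you?'}
```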