forked from phoenix/litellm-mirror
trying to add docs
This commit is contained in: parent 0fe8799f94, commit 2cf949990e
834 changed files with 0 additions and 161273 deletions
@@ -1,87 +0,0 @@
#### Using `LLMChain`

The `LLMChain` is the most basic building block chain. It takes in a prompt template, formats it with the user input, and returns the response from an LLM.

To use the `LLMChain`, first create a prompt template.

```python
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
```

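Before wiring the template into a chain, you can sanity-check it by rendering it directly. The short sketch below reuses the `prompt` object defined above with the legacy `langchain` `PromptTemplate.format` method; the output comment is illustrative:

```python
# Render the template on its own to see exactly what will be sent to the LLM.
# (Illustrative check; `prompt` is the PromptTemplate created above.)
print(prompt.format(product="colorful socks"))
# -> What is a good name for a company that makes colorful socks?
```
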
We can now create a very simple chain that will take user input, format the prompt with it, and then send it to the LLM.

```python
from langchain.chains import LLMChain

chain = LLMChain(llm=llm, prompt=prompt)

# Run the chain only specifying the input variable.
print(chain.run("colorful socks"))
```

<CodeOutputBlock lang="python">

```
Colorful Toes Co.
```

</CodeOutputBlock>
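
If you prefer keyword arguments to a single positional input, the legacy `LLMChain` API also exposes a `predict` method. A minimal sketch reusing the `chain` created above:

```python
# Equivalent call using an explicit keyword argument instead of a positional input.
# Like `run`, `predict` returns the completion as a plain string.
print(chain.predict(product="colorful socks"))
```
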
If there are multiple variables, you can input them all at once using a dictionary.

```python
prompt = PromptTemplate(
    input_variables=["company", "product"],
    template="What is a good name for {company} that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run({
    'company': "ABC Startup",
    'product': "colorful socks"
}))
```

<CodeOutputBlock lang="python">

```
Socktopia Colourful Creations.
```

</CodeOutputBlock>
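
To run the same multi-variable chain over several inputs, the legacy `LLMChain` API provides an `apply` method that takes a list of input dictionaries. A sketch with illustrative inputs:

```python
# Batch several inputs through the same chain.
inputs = [
    {"company": "ABC Startup", "product": "colorful socks"},
    {"company": "ABC Startup", "product": "reusable water bottles"},
]
for output in chain.apply(inputs):
    # Each result is a dict keyed by the chain's output key ("text" by default).
    print(output["text"])
```
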
You can use a chat model in an `LLMChain` as well:

```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)

human_message_prompt = HumanMessagePromptTemplate(
    prompt=PromptTemplate(
        template="What is a good name for a company that makes {product}?",
        input_variables=["product"],
    )
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])
chat = ChatOpenAI(temperature=0.9)
chain = LLMChain(llm=chat, prompt=chat_prompt_template)
print(chain.run("colorful socks"))
```

<CodeOutputBlock lang="python">

```
Rainbow Socks Co.
```

</CodeOutputBlock>
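
If the chat prompt is a single human message, `ChatPromptTemplate.from_template` offers a shorter way to build the same prompt. The sketch below assumes that helper is available in your `langchain` version and reuses the `chat` model from above:

```python
# Sketch: the same chat chain built with the single-message shorthand.
from langchain.prompts.chat import ChatPromptTemplate

chat_prompt_template = ChatPromptTemplate.from_template(
    "What is a good name for a company that makes {product}?"
)
chain = LLMChain(llm=chat, prompt=chat_prompt_template)
print(chain.run("colorful socks"))
```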