Demo 02 - LLM configuration
===============================================
In this step, we will play with various configurations of the language model (LLM).

# Temperature
`quarkus.langchain4j.openai.chat-model.temperature` controls the randomness of the model's responses.
Lowering the temperature will make the model more conservative, while increasing it will make it more creative.
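
A minimal sketch of how this could look in `application.properties` (the value is only an example to experiment with):

```properties
# Low values (e.g. 0.1) give more deterministic, conservative answers;
# higher values (e.g. 1.0) give more varied, creative ones.
quarkus.langchain4j.openai.chat-model.temperature=0.1
```
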
# Max tokens
`quarkus.langchain4j.openai.chat-model.max-tokens` limits the maximum number of tokens the model may generate in a single response.
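
For example, assuming the same `application.properties`, a cap could look like this (the number is arbitrary):

```properties
# Responses that would exceed this limit are cut off by the model.
quarkus.langchain4j.openai.chat-model.max-tokens=100
```
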
# Frequency penalty
`quarkus.langchain4j.openai.chat-model.frequency-penalty` defines how strongly the model is penalized for repeating tokens it has already produced, which reduces repetitive output.
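
An illustrative entry, again in `application.properties` (for OpenAI models the accepted range is typically -2.0 to 2.0):

```properties
# A positive penalty discourages the model from reusing tokens it has already emitted.
quarkus.langchain4j.openai.chat-model.frequency-penalty=1.0
```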