# Demo 03 - Streaming

LLM responses can be long, and waiting for the complete answer before showing anything makes the user stare at a blank screen. With streaming, the model's answer is delivered as a sequence of chunks (tokens) as they are generated, so the application can render the response progressively.

Declaring the AI service method as `Multi<String> chat(String userMessage)` indicates that the response will be streamed instead of returned synchronously.
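
Below is a minimal sketch of what such an AI service could look like, assuming this demo uses the Quarkus LangChain4j extension (the interface name `StreamingAssistant` and the system message text are illustrative, not taken from the demo code):

```java
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import io.smallrye.mutiny.Multi;

@RegisterAiService
public interface StreamingAssistant {

    // Returning Multi<String> (instead of String) tells the extension to
    // stream the model's answer chunk by chunk rather than blocking until
    // the full response is available.
    @SystemMessage("You are a helpful assistant.")
    Multi<String> chat(@UserMessage String userMessage);
}
```

A caller can then consume the stream reactively, for example by exposing it as server-sent events from a JAX-RS resource (again a hypothetical sketch, not the demo's actual endpoint):

```java
import io.smallrye.mutiny.Multi;
import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.QueryParam;
import jakarta.ws.rs.core.MediaType;

@Path("/chat")
public class ChatResource {

    @Inject
    StreamingAssistant assistant;

    // Each emitted String chunk is sent to the client as soon as it arrives.
    @GET
    @Produces(MediaType.SERVER_SENT_EVENTS)
    public Multi<String> stream(@QueryParam("q") String question) {
        return assistant.chat(question);
    }
}
```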