
Demo 03 - Streaming

LLM responses can be long. Rather than blocking until the full answer is available, the service can stream the response so the user sees text as soon as the model starts producing it.

Chunks

The return type of Multi<String> chat(String userMessage) indicates that the response will be streamed instead of returned synchronously: each item emitted by the Multi is one chunk of the answer.
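
Here is a minimal sketch of such a streaming AI service, assuming the demo follows the usual Quarkus LangChain4j pattern with @RegisterAiService and Mutiny's Multi; the interface and method names are illustrative and may not match the demo's source:

```java
// Illustrative names; the demo's actual interface may differ.
import io.quarkiverse.langchain4j.RegisterAiService;
import io.smallrye.mutiny.Multi;

@RegisterAiService
public interface Assistant {

    // Returning Multi<String> tells the extension to stream the model's
    // answer chunk by chunk instead of blocking until it is complete.
    Multi<String> chat(String userMessage);
}
```

A caller can consume the chunks as they arrive, for example by subscribing to the Multi:

```java
assistant.chat("Tell me a long story")
         .subscribe().with(
             chunk -> System.out.print(chunk),                         // print each chunk as it arrives
             failure -> System.err.println("Streaming failed: " + failure));
```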