Which template does this pertain to?
llm-chat-app-template
What versions are you using?
Chrome Version 140.0.7339.214
What operating system and version are you using?
macOS Sonoma 14.7.1
Please provide a link to a minimal reproduction (optional)
No response
Describe the Bug
The template claims to stream the AI model’s response via server-sent events (SSE), but the source code never uses the `EventSource` class. Even on the demo page, the response appears all at once, regardless of length, instead of streaming in piece by piece in real time.
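For comparison, here is a minimal sketch of what incremental SSE consumption on the client could look like. This is not taken from the template; the `/api/chat` endpoint, the request payload shape, and the assumption that the server emits standard `data: ...\n\n` frames are all hypothetical:

```javascript
// Minimal SSE frame parser: extracts the "data:" payloads from a text chunk.
// Assumes the server emits standard SSE frames ("data: ...\n\n").
function parseSSEChunk(chunk) {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length));
}

// Hypothetical client-side consumer using fetch + ReadableStream,
// invoking onToken for each piece as it arrives rather than waiting
// for the full response.
async function streamChat(prompt, onToken) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const data of parseSSEChunk(decoder.decode(value, { stream: true }))) {
      onToken(data); // render each token incrementally
    }
  }
}
```

Note that `EventSource` itself only issues GET requests, so a chat endpoint that accepts a POST body would typically be consumed with `fetch` and a `ReadableStream` as above; either way, the text should appear piece by piece, which is not what the demo page does.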
Please provide any relevant error logs
No response