🐛 BUG: LLM Chat template does not implement SSE streaming as claimed #789

@dev2xl

Description

Which template does this pertain to?

llm-chat-app-template

What versions are you using?

Chrome Version 140.0.7339.214

What operating system and version are you using?

Mac Sonoma 14.7.1

Please provide a link to a minimal reproduction (optional)

No response

Describe the Bug

The template claims to stream the AI model's response using SSE, but the source code doesn't use the EventSource class. Even on the demo page, the text appears all at once, regardless of length, instead of streaming in real time piece by piece.
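For context, SSE is a plain-text wire format: the server sends `data:` lines separated by blank lines, and a streaming client surfaces each payload as it arrives. The sketch below is a hypothetical parser written for illustration only (it is not from the template's source) showing what a client would receive if the response were actually streamed as SSE:

```typescript
// Minimal sketch of parsing Server-Sent Events text into data payloads.
// Hypothetical helper for illustration; not part of llm-chat-app-template.
function parseSSE(raw: string): string[] {
  return raw
    .split("\n\n") // events are separated by a blank line
    .flatMap((event) =>
      event
        .split("\n")
        .filter((line) => line.startsWith("data: "))
        .map((line) => line.slice("data: ".length))
    )
    .filter((data) => data.length > 0);
}

// Example: two events streamed by a server
const chunk = "data: Hello\n\ndata: world\n\n";
console.log(parseSSE(chunk)); // → ["Hello", "world"]
```

Note that a client can consume this format either with `EventSource` or by reading the `fetch` response body incrementally; the bug here is that the demo renders nothing until the whole response has arrived.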

Please provide any relevant error logs

No response
