streaming-llm-responses
Implement real-time streaming UI patterns for AI chat applications. Use when adding response lifecycle handlers, progress indicators, client effects, or thread state synchronization. Covers onResponseStart/End, onEffect, ProgressUpdateEvent, and client tools. Not for building basic chat without real-time feedback.
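As a rough illustration of the lifecycle-handler pattern this skill covers, here is a hedged TypeScript sketch. The handler names mirror those listed above (onResponseStart/End, onEffect, ProgressUpdateEvent); the interface shape, the `onChunk` helper, and the simulated chunk source are assumptions for illustration, not the skill's actual API.

```typescript
// Hypothetical sketch only: the interface shape and onChunk are assumptions,
// not the skill's real API surface.

type ProgressUpdateEvent = { step: string; percent: number };

interface ResponseHandlers {
  onResponseStart?: () => void;                    // fired before the first token
  onChunk?: (text: string) => void;                // fired per streamed chunk
  onEffect?: (event: ProgressUpdateEvent) => void; // client-side progress effect
  onResponseEnd?: (fullText: string) => void;      // fired once the stream closes
}

async function streamResponse(
  chunks: string[],
  handlers: ResponseHandlers,
): Promise<string> {
  handlers.onResponseStart?.();
  let full = "";
  for (let i = 0; i < chunks.length; i++) {
    await Promise.resolve(); // stand-in for awaiting the next network chunk
    full += chunks[i];
    handlers.onChunk?.(chunks[i]);
    handlers.onEffect?.({
      step: "generating",
      percent: Math.round(((i + 1) / chunks.length) * 100),
    });
  }
  handlers.onResponseEnd?.(full);
  return full;
}
```

In a chat UI, onResponseStart/onResponseEnd would typically toggle a typing indicator, onChunk would append text to the visible message, and onEffect would drive a progress bar or spinner.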
$ Installer
git clone https://github.com/mjunaidca/mjs-agent-skills /tmp/mjs-agent-skills && cp -r /tmp/mjs-agent-skills/.claude/skills/streaming-llm-responses ~/.claude/skills/mjs-agent-skills/

Tip: Run this command in your terminal to install the skill.
Repository: mjunaidca/mjs-agent-skills/.claude/skills/streaming-llm-responses
Author: mjunaidca
Stars: 1
Forks: 2
Updated: 1 week ago
Added: 1 week ago