ml-inference-optimization
ML inference latency optimization, model compression, distillation, caching strategies, and edge deployment patterns. Use when optimizing inference performance, reducing model size, or deploying ML at the edge.
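For illustration only (this example is not part of the skill; the toy model, layer sizes, and input shape are assumptions), here is a minimal Python sketch of dynamic INT8 quantization, one of the model-compression techniques the skill covers. It assumes PyTorch is installed:

# Minimal sketch: dynamic INT8 quantization of Linear layers with PyTorch.
# The toy model, layer sizes, and input shape below are placeholder assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Linear-layer weights are stored as int8; activations are quantized
# dynamically at inference time, shrinking the model and speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
with torch.inference_mode():
    print(quantized(x).shape)  # torch.Size([1, 10])

Int8 weights are roughly 4x smaller than float32, but any accuracy impact should be checked on a validation set before deployment.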
Install
git clone https://github.com/melodic-software/claude-code-plugins /tmp/claude-code-plugins && cp -r /tmp/claude-code-plugins/plugins/systems-design/skills/ml-inference-optimization ~/.claude/skills/claude-code-plugins/
Tip: Run this command in your terminal to install the skill.
Repository: melodic-software/claude-code-plugins/plugins/systems-design/skills/ml-inference-optimization
Author: melodic-software
Stars: 3
Forks: 0
Updated: 1 week ago
Added: 1 week ago